Cognitive bias in forensic decision making

In crime series, forensic evidence appears objective and error-free, allowing criminals to be caught in record time. In reality, however, the reliability of forensic analyses depends heavily on the analytical methods used and on human decisions: Are two voices from the same person? Do two fingerprints match? Does a DNA trace found at the scene come from the suspect?

When people make a decision, it can be based on intuition or on deliberate reasoning. Depending on the problem at hand and how important it is to get the decision right, we apply the appropriate mode of thinking. In everyday life, mental shortcuts are often sufficient, so we don't spend unnecessary time on a decision or overload our brain. In more complex situations, however, we have to decide more consciously and effortfully; these are the decisions we make with deliberate reasoning. Sometimes we make an unconscious mistake in doing so: we believe a decision is the product of careful reasoning, but unconsciously we take a shortcut anyway. This is the point at which cognitive errors, also called biases, creep in.

Think of placebo effects or stereotypes: we unconsciously rely on our expectations about a pill or about a particular person. Or consider selective attention: when the supporters of two teams watch exactly the same football match, each side sees only the opponent's dives. Because of human error and pre-formed expectations, we may pay attention mainly to information that confirms those expectations, while other, contradicting information is ignored. These unconscious human errors can happen to anyone. Even highly motivated, highly educated and competent people such as forensic scientists, behavioural experts and judges cannot easily protect themselves against cognitive biases.

Cognitive biases can arise at every stage of handling forensic evidence: its selection, collection, analysis, interpretation and the inferences drawn from it. They can influence, for example, how extensive or limited the search for relevant information is, how the data are evaluated, how we interpret those data when testing our hypotheses and alternatives, and to what extent the conclusion we draw from them is objective.

Irrelevant information can distract us without our noticing. To illustrate this, consider an experiment: five fingerprint experts were asked to compare a fingerprint from a bomb with that of a suspect. In addition, the experts were told that the suspect had been abroad during the attack, implying that a match was rather unlikely. Now comes the catch: the print sent along was not the suspect's, but a fingerprint that each expert had already analysed before and had classified as a match. The result was astounding: because of the misleading background information, four of the five experts now came to a different conclusion than before. This study impressively shows that there is room for interpretation even in forensic evidence. Forensic evidence such as DNA traces and fingerprints can only be interpreted objectively if our judgement is not influenced by extraneous information. Only then can good intra-rater reliability be achieved: the same expert comes to the same conclusion given the same information.

Different experts should also come to the same conclusion given the same information; this is known as inter-rater reliability. In the USA, researchers therefore investigated how objective expert witnesses are when they are hired by one of the two parties (prosecution or defence). In this experiment, 108 forensic psychologists and psychiatrists were hired as experts to review an offender's file and assess the offender using two widely used, well-researched risk assessment tools. The experts were led to believe that they had been hired by either the defence or the prosecution, but in reality they all received the same file. Those who believed they were working for the prosecution tended to assign higher risk scores, while those who believed they were working for the defence tended to assign lower risk scores to the same offenders. Pretty strong evidence that an expert's report can be distorted without the expert being aware of it.
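As a side note, rater agreement of the kind discussed here can be quantified. The sketch below is a minimal, hypothetical illustration (the data and labels are invented, not taken from the cited studies) of Cohen's kappa, a common statistic that compares how often two raters agree with how often they would be expected to agree by chance.

# A minimal sketch of Cohen's kappa for two raters over the same items.
# The "experts" and their verdicts below are hypothetical.

from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters judging the same items (nominal labels)."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)

    # Observed agreement: fraction of items on which both raters agree.
    p_observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

    # Chance agreement, from each rater's marginal label frequencies.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    labels = set(freq_a) | set(freq_b)
    p_expected = sum((freq_a[label] / n) * (freq_b[label] / n) for label in labels)

    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical verdicts on the same ten fingerprint comparisons.
expert_1 = ["match", "match", "no match", "match", "no match",
            "match", "no match", "no match", "match", "match"]
expert_2 = ["match", "no match", "no match", "match", "no match",
            "match", "no match", "match", "match", "match"]

print(f"Cohen's kappa: {cohens_kappa(expert_1, expert_2):.2f}")

A kappa close to 1 indicates strong agreement beyond chance, while a value near 0 means the raters agree no more often than chance would predict; applied to forensic verdicts, low values would signal exactly the kind of inconsistency the studies above describe.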

Not every decision ends up being influenced by thinking errors, but it is always important to look in all directions to get at the truth. Experts must play devil's advocate, keep different scenarios open and try to falsify their hypotheses before forming an opinion.

References

  • Dror, I. E., Charlton, D., & Péron, A. E. (2006). Contextual information renders experts vulnerable to making erroneous identifications. Forensic Science International, 156(1), 74-78.
  • Murrie, D. C., Boccaccini, M. T., Guarnera, L. A., & Rufino, K. A. (2013). Are forensic experts biased by the side that retained them? Psychological Science, 24(10), 1889-1897.