The old adage is true: believe nothing of what you hear and only half of what you see. TV shows like CSI would have the public believe that you can get the licence plate of a speeding car reflected in someone’s sunglasses from 100 feet away. While this might serve as a deterrent for uneducated criminals, for investigators it creates challenges when managing client expectations. To many, perception is reality. Undeterred by reason, people will believe what they want to believe. This is the basis of bias: the blind spots in our reality when we make judgments about the world around us.
We, as humans, are guided by our instincts: beliefs formed by our upbringing (including where we were brought up), our experiences, stereotypes, insecurities and prejudices. We are shaped by the good and bad that happens to us during our lifetime, as we try to avoid the unpleasant experiences we have encountered along the way.
As investigators, we rely on this “gut instinct” as a safety mechanism to ensure we make it home safely each day. We learn to trust our gut. Still others use it as a defence mechanism, or to justify racist or unpopular opinions or behaviour.
“We don’t see things as they are, we see things as we are.”
-Anaïs Nin
Even more influential now are political polarization, misinformation and disinformation, which have become prevalent in today’s media and among social media activists, and which are used to influence, sway, dissuade, hurt, minimize or support certain narratives.
As investigators, we must strive to be as objective as possible. Objective information is factual and not influenced by personal feelings or opinions in considering and representing facts, whereas subjective information is based on or influenced by personal opinions, judgments, feelings, tastes, or point of view.
We all assess people and situations around us and make judgments based on our personal experiences, culture and background. When we fail to recognize that this is happening, it is referred to as unconscious bias or implicit bias.
Implicit bias occurs when, rather than being neutral, we have a preference for (or aversion to) a person or group of people. Thus, we use the term “implicit bias” to describe when we hold attitudes towards people or associate stereotypes with them without our conscious knowledge. To identify implicit bias, the American Academy of Family Physicians offers the following guidelines, which can assist others in identifying and correcting bias:
Introspection: Explore and identify your own prejudices by taking implicit association tests or through other means of self-analysis.
Mindfulness: Since you’re more likely to give in to your biases when you’re under pressure, practice ways to reduce stress and increase mindfulness, such as focused breathing.
Perspective-taking: Consider experiences from the point of view of the person being stereotyped. You can do this by reading or watching content that discusses those experiences or directly interacting with people from those groups.
Learn to slow down: Before interacting with people from certain groups, pause and reflect to reduce reflexive actions. Consider positive examples of people from that stereotyped group, such as public figures or personal friends.
Individuation: Evaluate people based on their personal characteristics rather than those affiliated with their group. This could include connecting over shared interests.
Check your messaging: As opposed to saying things like “we don’t see color,” use statements that welcome and embrace multiculturalism or other differences.
Institutionalize fairness: Support a culture of diversity and inclusion at the organizational level. This could include using an “equity lens” tool to identify your group’s blind spots or reviewing the images in your office to see if they further or undercut stereotypes.
Take two: Resisting implicit bias is lifelong work. You have to constantly restart the process and look for new ways to improve.
(source: https://www.aafp.org/journals/fpm/blogs/inpractice/entry/implicit_bias.html)
Confirmation bias is the tendency to gravitate towards ideas and concepts that support our existing beliefs. This is more evident than ever with the COVID-19 mandates, vaccination debates and the political unrest in the US. The four pillars of confirmation bias are:
- Not seeking out objective facts
- Interpreting information to support your existing belief
- Only remembering details that uphold your belief
- Ignoring information that challenges your belief
Confirmation bias is a common threat to investigations whereby investigators become so narrowly focused – essentially wearing blinders – that they amplify any evidence that supports a suspect’s guilt while ignoring any exculpatory evidence. A great example of this is the investigation, conviction and eventual acquittal of Guy Paul Morin, who was charged with kidnapping, sexually assaulting and murdering a nine-year-old girl in Ontario, Canada. Law enforcement was so sure they had the right suspect – he was a neighbour, a loner and a little “weird” – that they ignored evidence that supported his alibi and would have excluded him as a suspect, and overlooked other possible suspects in the case. Witnesses were also influenced by police and prosecutorial tunnel vision and their version of events, as they were led to believe that Morin was guilty. Morin was eventually released from prison after DNA evidence exonerated him and identified the actual perpetrator, who had previously died by suicide. You can read more about the case in the book “Redrum the Innocent” by Kirk Makin.
Another type of bias is cognitive bias, a type of interpretive bias. It occurs when the way we process and interpret information affects our decisions and judgments by simplifying information processing. It is why we interpret someone walking down a dark street as a risk, while the same person walking down the same street during the day is not. Biases work as rules of thumb that help us make sense of the world around us and allow us to make quick decisions.
In my article, “Using the Right Resources”, in the last issue of The Councilor, I mentioned knowing your limitations and working within them. The Dunning-Kruger effect is a type of cognitive bias whereby “people with low ability at a task overestimate their aptitude.”
The halo effect (sometimes called the halo error) is the tendency for a positive impression of a person, company or brand in one area to positively influence one’s opinion or feelings in other areas. The halo effect is a type of cognitive bias that can prevent someone from fairly assessing a person, product or brand because of unfounded or preconceived beliefs about what is good or bad. The term was first used by Edward Thorndike. Wikipedia provides an example of the halo effect: a person who notices that someone is “attractive, well groomed, and properly attired, assumes, using a mental heuristic, that the person is a good person based upon the rules of that individual’s social concept. This constant error in judgment is reflective of the individual’s preferences, prejudices, ideology, aspirations, and social perception.” It is essentially “judging a book by its cover.”
(source: https://en.wikipedia.org/wiki/Halo_effect)
Contextual bias occurs when people with good intentions are vulnerable to making incorrect decisions based on influences unrelated to the situation. Objectivity is compromised because these extraneous influences can cause one to subconsciously develop expectations about the outcome. Contextual bias is the most common type of bias in law enforcement. Racial profiling and victim-blaming are both examples of common contextual biases.
Statistical bias results from a skewed method of data collection or interpretation. If, for instance, it was found that most prisoners on death row drank milk, it could be construed that milk is an indicator of, or cause of, criminal behaviour. While this is clearly not the case, looking at data through a certain lens can produce misleading results. When this is combined with confirmation bias, incorrect inferences can be drawn and amplified.
Knowing you are prone to bias, as we all are, is the first step to identifying it, removing the blinders and seeing things in a new light.
Originally published by the Council of International Investigators in their magazine, The Councilor, 2022 Q4