You’re Wrong & Don’t Know It: Process Biases

Chris Russo

Content Marketing Manager

November 9, 2022

“Constantly questioning our own thinking would be impossibly tedious…much too slow and inefficient….The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.” – Daniel Kahneman


German forces bombarded London repeatedly during World War II, and some neighbourhoods were hit more often than others. This not only drove residents out of the worst-affected districts but also led people to suspect that anyone living in the largely unhit areas was sympathetic to the enemy. Soon after the war, however, British statistician R. D. Clarke examined 537 strikes and found no consistent pattern that would support the idea of one part of the city being targeted over another. The bombs that fell on London had been dropped essentially at random.

The above is an example of the Clustering Illusion, which occurs when people wrongly find patterns or “clusters” in small samples of data where none actually exists. For example, have you ever worn a “lucky jersey” while your favourite team was playing? It’s a classic case of mistaking correlation for causation. And it is just one of the many ways you’re wrong and don’t know it as we explore Process Biases.
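Clarke’s finding is easy to reproduce in miniature. The sketch below is a toy simulation, not Clarke’s actual 1946 analysis – the 24×24 grid and the random seed are illustrative choices. It drops 537 points uniformly at random onto 576 cells and tallies the hits, showing that genuinely random placement still produces striking “clusters”:

```python
import random
from collections import Counter

random.seed(42)

# Drop 537 "strikes" uniformly at random onto a 24x24 grid (576 cells),
# roughly the scale of Clarke's analysis of V-1 strikes on London.
GRID = 24
strikes = Counter(
    (random.randrange(GRID), random.randrange(GRID)) for _ in range(537)
)

# Tally how many cells took 1 hit, 2 hits, 3 hits, and so on.
hits_per_cell = Counter(strikes.values())
empty_cells = GRID * GRID - len(strikes)

print(f"cells with 0 hits: {empty_cells}")
for k in sorted(hits_per_cell):
    print(f"cells with {k} hits: {hits_per_cell[k]}")
print(f"most hits in a single cell: {max(strikes.values())}")
```

Even though every cell is equally likely, some cells take several hits while hundreds take none – exactly the kind of “targeting” pattern residents thought they saw.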

Process Biases refer to the tendency to process information based on cognitive factors rather than concrete evidence, which can skew our perception of reality even when all the pertinent data is available.

The clustering illusion is a prevalent bias and one to be mindful of; however, it is far from the only one that falls into the category of process biases. So, without wasting any more time, let’s look at a few more and then what you can do to mitigate their impact.

Process Biases

Irrational Escalation of Commitment: This frustrating pattern in human behaviour emerges when people make irrational present decisions to justify rational past ones. It is closely related to the sunk-cost fallacy, and it is why people sit through bad movies, finish boring books, and clean their plates at bad meals – all because “I’ve already paid for it.” That reasoning, however, ignores that the time spent not enjoying that meal, book, or movie could be better spent doing other, more enjoyable things.

Hyperbolic Discounting: If you’ve ever kept shopping because shipping is free over $100 and your cart is only at $87, then you’re already familiar with hyperbolic discounting. In essence, it is when people choose instant gratification or a short-term gain over delayed gratification, even when waiting would yield a more significant gain. Or, as my kids clearly demonstrate – “I want it NOW!”
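In the standard model of this bias, a delayed reward’s perceived value falls off hyperbolically with the delay: V = A / (1 + kD). A minimal sketch of that formula – the discount rate k and the time units here are purely illustrative, not empirical values:

```python
def hyperbolic_value(amount: float, delay: float, k: float = 0.1) -> float:
    """Perceived present value of `amount` received after `delay` periods,
    using the hyperbolic form V = A / (1 + k * delay)."""
    return amount / (1 + k * delay)

# $100 now vs. $150 in 52 weeks, with an (illustrative) weekly k of 0.1:
now = hyperbolic_value(100, delay=0)     # 100.0
later = hyperbolic_value(150, delay=52)  # ~24.2
print(now > later)  # the smaller-sooner reward "feels" bigger
```

Unlike exponential discounting, the hyperbolic curve drops steeply at short delays, which is why preferences can flip toward “I want it NOW” even when the delayed option is objectively larger.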

Hindsight Bias: According to the APA, hindsight bias is the tendency, after an event, to overestimate the extent to which we could have foreseen its outcome. Sometimes referred to as creeping determinism or, more playfully, the “knew-it-all-along” phenomenon, this bias is most prevalent outside the stadium whenever the home team loses. If left unchecked, however, it can quickly lead to hubris.

Framing Bias: The framing effect occurs when people make choices based on how equivalent information is presented, the same option becoming more or less attractive depending on the frame highlighted. When information is framed positively (a “90% survival rate”), people tend to avoid risk; when the same information is framed negatively (a “10% mortality rate”), they tend to seek it out.

Ostrich Effect: This aptly named bias refers to people’s tendency to avoid negative or unpleasant information, regardless of how helpful it may be. People who are scared to look at their bank accounts after the holiday shopping is over, or who delay starting a time-sensitive project with someone they don’t like, should take note and see if they can catch themselves with their heads in the sand.

Status Quo Bias: Have you ever heard the expression “Don’t rock the boat”? It’s usually said by people who are pretty content with the way things are and prefer not to change course lest they be diverted or, worse, lose something. This is most readily seen in companies that, as they grow, shy away from new product development efforts to focus on scaling what they already have.

Blind-spot Bias: At this point in the list, you’ve probably identified a few of these biases in your friends, family, and coworkers. Please don’t feel bad; it’s easy to do. However, how many of these biases did you identify in yourself? It’s far more challenging to see our own biases when we’re focused solely outward, which can have grave consequences, especially when the big decisions rest on one person’s shoulders.

The above is only a partial list of process biases, but it should provide a broad enough cross-sample to begin recognizing them in your own thinking.

And then what? Glad you asked…

Mitigation & Avoidance Strategies

The fact remains that biases are part of the human condition and, to a large extent, bound up with all of us – but that doesn’t mean they can’t be avoided or their impact mitigated. When it comes to process biases, the key to overcoming them is to examine and reexamine, approaching the problem from as many perspectives and angles as possible.

What if…?

One of the best ways to mitigate process biases is a What-If Analysis, in which you challenge a current assumption. For example, depending on where you are in the world, we’re all pretty certain that snow will fall in a couple of months (if it hasn’t already). But what if it doesn’t?

Following this idea, you’d then try to determine what caused the event in question, providing a logical argument as to how it became a reality. Then, working backwards, you outline what must have occurred for that scenario to be true. Once done, carry on with one or two more scenarios, with the goal of eventually generating a list of indicators or events that could predict or detect the beginning of a corresponding or similar event.

Six Hats

Developed by Dr. Edward de Bono, this method can be used by individuals and groups and is designed to segment conflicting thinking styles. By making the way information is processed clear, it’s easier to acknowledge the advantages and disadvantages of each.  

The process unfolds with each member of a group being assigned one of six hats (or, for an individual, assuming each in turn), each with a distinct role to play: white for facts, red for emotions, black for caution, yellow for optimism, green for creativity, and blue for managing the process.

When you approach a situation wearing each hat in turn, the goal is to change your thinking process by altering the typically biased feedback mechanism – skewed either too positive or too negative – that keeps you from seeing situations in their entirety. By using this method to separate the conflicting drivers within our judgment, we can paint a better picture of situations and our choices: why we make them and what their impact is.

You’ll find more on the Six Hats Method and how it can lead to better design here.

Final thoughts

Thinking is hard. It’s easier to let our minds run wild and jump to conclusions. It requires less effort, and sometimes it’s even the right call. However, when we rely too heavily on our cognitive reflexes instead of the evidence in front of us, we risk taking a step back from reality – including the people we serve and the challenges we’re trying to solve.

All of this would be fine if we lived in siloed vacuums, where our decisions didn’t have the physical, psychological, economic, sociological, political, professional and familial impacts they do – but that isn’t the case. Knowing this, we can take a strategic approach, utilizing tools and frameworks that not only help tackle risk early but also help us recognize instances when something other than the data in front of us is guiding our decisions.

And you’ll find a wealth of such strategic steps, tools, techniques and tactics RIGHT HERE in our Product Thinking Playbook.
