The Best Laid Plans: How Cognitive Biases Impact Leadership Decision-Making
As leaders, we are required to do a great deal of problem solving, planning and decision-making about key areas such as performance management, cost forecasting, stakeholder management and safe operations.
We tend to think our planning and decision-making activities are conscious acts, involving deep thinking and careful analysis. While this is true some of the time, a great deal of our thinking takes place at an unconscious level. Being aware of this "autopilot" tendency, and understanding what we can do about our brain's habit of switching to unconscious processing, is essential to objective decision-making.

The "Lazy" Brain
Our brains represent about 2% of our body weight yet use around 20% of our energy.
Hence, our brains seek to conserve energy by automating movement and cognitive processes, including much of our thinking.
- The average human brain has around 86 billion neurons.
- About half of these neurons are in the cerebellum, located at the very base of the brain.
- The cerebellum helps us acquire new skills and make them automatic.
Main takeaway: This automation extends to decision-making and planning, meaning much of our thinking happens unconsciously.
Type 1 and Type 2 Thinking
Kahneman (2011) divides our thinking into two subsystems, commonly labelled type 1 and type 2.
Type 1 thinking is fast, intuitive, unconscious thought. Most everyday activities (like driving, talking, cleaning, etc.) make heavy use of type 1.
The type 2 system is slow, calculating, conscious thought. When faced with a difficult math problem or thinking carefully about a philosophical problem, you're engaging the type 2 system. From Kahneman's perspective, the big difference between type 1 and type 2 thinking is that type 1 is fast and easy but very susceptible to bias, whereas type 2 is slow and requires conscious effort but is much more resistant to cognitive biases.
Traditionally, intelligence has correlated with type 2 thinking. So, it would be reasonable to assume that people who are better at type 2 thinking would use it more and therefore be less vulnerable to bias. However, research shows that those who are very good at type 2 thinking can be even more vulnerable to cognitive biases.
This is a deeply counter-intuitive result. Why is it that people who have a greater capacity to overcome bias have a greater vulnerability to bias? A number of theories have been put forward to explain this result.
One relates to overconfidence. If you've become accustomed to thinking of yourself as good at avoiding cognitive bias, you grow confident in your abilities, to the point where you (ironically and unconsciously) stop scrutinising your own thinking for bias.
Too often we become over-confident in our own thinking. We believe we see reality perfectly and that there's no way our minds could be wrong or misjudge a person or situation. This isn't the case, and we need to accept these imperfections if we want to make an honest attempt to improve our objective decision-making.
What are Cognitive Biases?
A cognitive bias is a systematic error in thinking that affects the decisions and judgements people make. In psychology, such errors are usually traced to heuristics (cognitive shortcuts), a product of type 1 thinking. Some biases are quite generalised energy-saving heuristics, while others involve quite specific areas of unconscious processing.
Some Examples of Generalised Cognitive Biases
- Black and white thinking
- Catastrophising
- Mind reading
- Overgeneralising
- Filtering
All these biases help the brain make quick (type 1) decisions; however, they can lead to major errors in critical thinking. There are also many specific cognitive biases: around 100 have been consistently shown to impact our decision-making, some more potently than others.
Biases Identified That Consistently Affect Decisions
The following biases (in particular) have been identified as consistent, powerful and problematic:
- Confirmation bias
- Planning fallacy
- Anchoring bias
- Fundamental attribution error
What is Confirmation Bias?
Confirmation bias happens when you look for information that supports your existing beliefs and reject data that go against what you believe. This can lead you to make biased decisions, because you don't factor in all of the relevant information.
A 2013 study found that confirmation bias could affect the way that people view statistics. Its authors report that people have a tendency to infer information from statistics that support their existing beliefs, even when the data support an opposing view. That makes confirmation bias a potentially serious problem to overcome when you need to make an objective decision.
Confirmation bias is:
- A common and insidious problem that can prevent us from making accurate judgements and decisions in both personal and professional contexts.
- Hardwired in the brain, making it difficult to see and resist.
- Far easier to spot in others than in ourselves.
How to Reduce Confirmation Bias in Your Decisions
- Challenge what you think you see.
- Seek out information from a range of sources. Use approaches such as De Bono's "Six Thinking Hats" technique to consider situations from multiple perspectives.
- Discuss your thoughts with others. Surround yourself with a diverse group of people, and don't be afraid to listen to dissenting views.
- Seek out people and information that challenge your opinions, or assign someone on your team to play devil's advocate for major decisions.
The primary defence against confirmation bias is a healthy sense of self-awareness coupled with humility.
When making decisions and judgements, keep the following thoughts in mind:
- Why do I hold my current beliefs?
- What impact would there be on my ego and pride if I were to learn that my views were incorrect?
- Have I genuinely sought out alternative viewpoints?
- Is it possible that I am simply wrong?
- Pretend that you are supporting an alternative viewpoint. Walk through a plausible explanation supporting that perspective.
Having a healthy understanding of confirmation bias can make you a better critical thinker and decision maker. A good starting point is to observe the bias in others, both in the workplace and in your personal life. When you are feeling passionate about an issue or person, stop yourself and run through the bulleted checklist above. See if you can observe yourself falling victim to confirmation bias.
While it can be painful to admit that your beliefs were misguided, it can ultimately result in better decisions and improved relationships.
What is the Planning Fallacy?
The planning fallacy is a phenomenon where we underestimate how much time we need to complete a task because of optimism bias. This occurs regardless of the individual's knowledge that past tasks of a similar nature have taken longer to complete than generally planned. Interestingly, the bias only affects predictions about one's own tasks - when outside observers predict task completion times, they show a pessimistic bias, overestimating the time needed.
Strictly speaking, the planning fallacy requires two things: predictions of the current task's completion time are more optimistic than our beliefs about past completion times for similar projects, and predictions of the current task's completion time are more optimistic than the actual time the task ends up taking.
In 2003, Lovallo and Kahneman proposed an expanded definition: the tendency to underestimate the time, costs and risks of future actions while at the same time overestimating the benefits of those same actions. By this definition, the planning fallacy results not only in time overruns but also in cost overruns.
Ways to Avoid the Planning Fallacy
The good news is that the planning fallacy mainly affects estimates of our own work, so other people's perspectives and records of past work can help. To avoid it:
- Pair people up and use group estimating techniques to prevent unrealistic optimism
- Use past practice to guide future estimates (see the sketch after this list)
- Have meetings to go over lessons learned, and make sure that organisational knowledge is properly managed and recorded so it isn't lost
- Use that knowledge to plan similar tasks more accurately in the future
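To make "use past practice" concrete, here is a minimal sketch in Python of the "outside view" that Lovallo and Kahneman recommend: base the forecast on the distribution of actual completion times for similar past tasks, rather than on a single optimistic guess. All task durations below are hypothetical, purely for illustration.

```python
# A minimal sketch: estimate from past actuals instead of gut feel.
from statistics import median, quantiles

# Actual durations (in days) of similar past tasks, e.g. drawn from
# lessons-learned records. Hypothetical numbers for illustration.
past_actuals = [12, 15, 9, 18, 14, 22, 16]

# A single optimistic "gut" estimate for the new task.
gut_estimate = 8

# The median of past actuals gives a realistic midpoint, and a high
# percentile gives a contingency figure for firm commitments.
likely = median(past_actuals)                   # 15 days
contingency = quantiles(past_actuals, n=10)[8]  # ~90th percentile (~18.8 days)

print(f"Gut estimate:            {gut_estimate} days")
print(f"Likely (median of past): {likely} days")
print(f"With contingency:        {contingency:.1f} days")
```

Even this crude approach counters unrealistic optimism, because the forecast reflects how long similar work actually took rather than how smoothly we imagine the new task going.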
What is Anchoring Bias?
Anchoring bias is a cognitive shortcut that leads people to rely too heavily on an initial piece of information offered (known as the anchor) when making decisions.
During decision-making, anchoring occurs when individuals use this initial piece of information to make subsequent judgements. Judgements close to the anchor tend to be assimilated toward it, while those further away tend to be displaced in the opposite direction.
Once the value of the anchor is set, all subsequent negotiations, arguments and estimates are discussed in relation to it; the bias lies in interpreting new information relative to the anchor rather than on its own merits.
Example:
The initial price offered for a used car, set either before or at the start of negotiations, becomes an arbitrary focal point for all subsequent discussion. Prices below the anchor may seem reasonable, perhaps even cheap, to the buyer, even if they are still higher than the car's actual market value.
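To put hypothetical numbers on this: if the seller anchors the discussion at $15,000 for a car whose market value is $10,000, a negotiated price of $12,500 feels like a $2,500 saving, even though the buyer would still be paying $2,500 above market value.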
Put simply, this bias involves jumping to conclusions; that is, basing your final judgement on information gained early in the decision-making process. Think of this as a first-impression bias: once you form an initial picture of a situation, it's hard to see other possibilities.
How to Avoid Anchoring Bias in Decision-Making
Anchoring may happen if you feel under pressure to make a quick decision or when you generally act hastily. So, to avoid it, reflect on your decision-making history, and think about whether you've rushed to judgement in the past. Then, make time to slow down decisions (type 2 thinking), and be prepared to ask for more time if you feel pressured to make a quick decision.
If someone is aggressively pushing for a decision, this may be a sign that what they're pushing for is not in your best interests.
What is the Fundamental Attribution Error?
The fundamental attribution error is the tendency to blame others when things go wrong, instead of looking objectively at the situation. You may blame or judge someone based on a stereotype or a perceived personality flaw.
Example:
If you're in a car accident and the other driver is at fault, you're more likely to assume that he or she is a bad driver than to consider whether other factors (e.g., bad weather) played a role.
Key distinction:
The fundamental attribution error is the flip side of actor-observer bias, in which you tend to place the blame for your own failures on external events rather than on yourself.
Example:
If you have a car accident that's your fault, you're more likely to blame the brakes or the wet road than your reaction time.
How to Avoid the Fundamental Attribution Error in Decisions
It's essential to look at situations, and the people involved in them, non-judgmentally. Use empathy to understand why people behave in the ways that they do.
It's hard to spot psychological bias in ourselves because it often comes from unconscious (type 1) thinking. For this reason, it can be unwise to make major decisions on your own.
Kahneman et al. (2011) reflected on this in a Harvard Business Review article, in which they suggested that important decisions should be made as part of a group process.
Overcoming Psychological Bias in Decision-Making
In summary, psychological bias is the tendency to make decisions or take action in an unknowingly irrational way.
To overcome psychological bias:
- Look for ways to introduce objectivity into your decision-making
- Allow more time for decisions
- Use tools that help you assess background information systematically
- Surround yourself with people who will challenge your opinions
- Listen carefully and empathetically to their views even when they tell you something you don't want to hear
References:
- Avoiding Psychological Bias in Decision Making: How to Make Objective Decisions. Mind Tools. https://www.mindtools.com/pages/article/avoiding-psychological-bias.htm
- Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.
- Kahneman, D., Lovallo, D. and Sibony, O. (2011). Before You Make That Big Decision. Harvard Business Review, June 2011.
- Lovallo, D. and Kahneman, D. (2003). Delusions of Success: How Optimism Undermines Executives' Decisions. Harvard Business Review, July 2003.
- De Bono, E. (1999). Six Thinking Hats. Boston: Back Bay Books.
About the Author
Clive Lloyd is an Australian psychologist who assists high-hazard organisations to improve their safety performance through the development of trust and psychological safety and by doing Safety Differently. He is the co-director and principal consultant of GYST, and developer of the acclaimed CareFactor Program.