Our brains are energy-hungry and responsible for processing huge amounts of data. As such, they are constantly looking for shortcuts, seeking heuristics and abstractions that speed up decision making and reduce the amount of energy spent thinking.
Most of the time these heuristics are useful and the reason we’re able to get on with our daily lives. But when they break down, they lead to irrationality, a distorted perspective on reality, and inaccurate judgement.
Even the smartest people are susceptible to faulty thinking.
There are dozens of well-known mental shortcuts that regularly trip us up; these are collectively referred to as cognitive biases. In the workplace, they affect how we interact with colleagues, how we make decisions, and who we reward and recognize.
In this article, we’ll take a look at eleven cognitive biases that affect team coordination and culture in the workplace every day.
How cognitive bias hinders collaboration
Dunning-Kruger Effect
The Dunning-Kruger Effect is a cognitive bias which causes people to believe they are much more competent than they really are, especially in domains where they have little or no actual skill.
This pervades the workplace in a number of ways. The most obvious is when an under-performer is really convinced they are exceptional at what they do.
A more insidious expression of this bias appears during cross-functional collaboration, when members of one discipline perceive themselves as having skill equal or superior to that of specialists in another. In Silicon Valley, where engineering is dominant, I’ve seen this occur when leaders underestimate the expertise of marketers, user researchers, or business professionals.
Fundamental Attribution Error
Closely related to actor-observer bias, this is your tendency to attribute your own actions to external, situational causes while overemphasizing personality-based explanations for the behavior you observe in others.
This can lead to unfair judgements of other people’s performance and motivations. When our colleagues err, miss deadlines, or are impolite, we see it as evidence of a character flaw. But when we commit an error, miss a deadline, or are slightly rude to someone, we excuse ourselves as being overworked, stressed, or tired.
Ingroup Bias
People tend to give preferential treatment to those who they perceive to be members of their own groups. In the workplace, this leads to inequity and unfairness at the individual level, and can sometimes lead to toxic silos at the group level.
Groups often form along functional boundaries, which can lead to a dynamic where product feels pitted against engineering, or sales and marketing feel at odds.
Reorienting cross-functional teams around a shared purpose or mission can help break down these walls. You can also look for ways to make your whole organization feel like the ingroup while focusing outgroup energy on an external threat.
Ingroup bias is also one of the reasons why building a diverse team alone isn’t enough, and why inclusion is an essential component in cultivating a healthy and high-performing team.
Next-in-Line Effect
We’ve talked before on this blog about how in-person standups are broken. One reason is the next-in-line effect: when speaking in turn, you will have diminished recall of what was said by the people who spoke just before you. The deficit is tied to your reduced ability to lay down long-term memories while preparing to perform, and isn’t explained by anxiety levels.
Luckily, knowing about this bias, and consciously paying more attention to people speaking before you, can offset the effects.
Why decision making breaks down at work
Anchoring
Anchoring is the tendency to rely too heavily on an initial piece of information when making a decision. Salespeople know this and often use it to their advantage: quoting a high price early in a sales process creates a focal point that frames future negotiations.
If you are facilitating a decision, or simply running a meeting, make sure you own the initial framing and set the stage, otherwise you risk being derailed by someone else (intentionally or accidentally) anchoring the group somewhere else.
IKEA Effect
It turns out people place more value on items they build themselves: you will value an IKEA table you assembled more highly than a prebuilt one, regardless of the quality of the end result.
This bias can come up in a detrimental way when evaluating vendors vs. internally developed tools, or when comparing similar work done by two teams. But it can also be used to your advantage by providing customers ways to configure and customize your product.
Escalation of Commitment
Commonly known as the “sunk cost fallacy,” this bias causes people to justify increased investment in a decision based on the cumulative prior investment rather than an objective estimate of future value or worth.
This comes up all the time in a work context, whether deciding to continue to chase a prospective customer or finish a feature that turned out to be more complex than originally planned.
It’s a hard bias to overcome, but try to objectively weigh the different paths as if you were starting from scratch. If necessary, bring in someone with no skin in the game to help assess the situation impartially.
When writing Good to Great, Jim Collins’ team found that a key attribute of successful companies was their ability to stop doing things.
System Justification
Humans are creatures of habit and generally change averse. System justification theory describes the phenomenon where individuals prefer the status quo, even at the expense of individual and collective self-interest. In other words, they justify the overarching system as serving a greater good that must be correct, even if they are unhappy or underserved by it.
The consequences of feeling the need to legitimize the status quo are far-reaching: it causes us to justify inequality, makes people resistant to change, and reinforces traditional power dynamics.
The effects of cognitive bias on leadership
Cognitive Dissonance
The theory of cognitive dissonance describes a state of mental stress caused by conflicting ideas, values, beliefs, or practices. Essentially, holding two or more opposing thoughts causes psychological discomfort.
This happens in the workplace when people are asked to support or execute work which is in conflict with their sense of right and wrong, training, ethics, or personal values.
In simple cases, it may manifest when someone is asked to take on a project without truly understanding why it is important. In more severe cases it may occur when people are forced to participate in performance review or hiring practices that they feel are ineffective or unfair.
Hot-Cold Empathy Gap
We’re all subject to visceral drives such as hunger, thirst, cravings, pain, and strong emotions, and these drives affect our decision making and behavior. When responding to such drives we’re in what is called a “hot” state. The empathy gap reflects how difficult it is for people in a “cold” state to understand what it’s like to be in a hot state, and vice versa.
People in a hot state can’t fully understand how their behavior is being affected by their current conditions. They think their short-term needs reflect their general and long-term preferences. People in a cold state find it hard to picture themselves in a hot state and will underestimate the strength of visceral impulses.
Functional Fixedness
Functional fixedness limits you to using an object only in the way it is traditionally used. For example, you will only use a hammer as a hammer, even when what you need is a paperweight.
This bias invades the workplace when people become defined by their function or their title. A product manager may bemoan having no designers on their team while an engineer with a design background sits untapped. Or an executive assistant might be tasked with finding an event planner despite being more than capable of doing the job themselves.
Overcoming cognitive bias begins with mindfulness
All of these biases are incredibly common and hard to avoid, so don’t feel too bad if you catch yourself succumbing to them. But know that in the worst cases they can be the root cause of significant organizational dysfunction: they can lead to unfair treatment of people, damage relationships, and limit our ability to be productive.
There are no silver bullets, but being more mindful of your own psychology is a good first step. Teams can work to build a culture of trust, such that when these biases do emerge, there is a safe environment to respond and correct their effects.