

8 cognitive biases that affect how you manage your team

By Page Grossman

Last updated September 21, 2021

Managers are expected to make split-second decisions, day in and day out: tracking who is working on which project, checking whether a team member has made enough progress, deciding where to hold the office holiday party, choosing what to order for staff lunch next Friday, and everything in between.

To make all those decisions, managers rely on their gut. Or, more likely, on instincts honed by previous experience. But what are those instincts? Understanding our cognitive biases can help us avoid mistakes as managers and decision-makers.

To better understand how our cognitive biases affect the decision-making process in the workplace, I talked with Jon M. Quigley, an author and an expert in product development, cost improvement, and organizational learning, and Matt Johnson, PhD, a professor of psychology and neuroscience at Hult International Business School in San Francisco. Johnson is also the co-founder of PopNeuro, a neuromarketing blog that helps companies to better understand their customers and improve marketing through the use of neuroscience.

Here are some cognitive biases they highlighted and how they can impact your team and projects.

Optimism bias

I’ll start off with a classic cognitive bias most of us are already familiar with. We all know there are people who see the glass as half full and people who see it as half empty. The optimism bias is all about seeing the world as slightly rosier than it really is. In the workplace, it can cause us to overestimate our own chances of success compared with others.

The reason we’re starting with optimism bias is that, according to Quigley, becoming aware of it helps you look at the world more critically and realistically, which in turn helps you avoid many of the other biases on this list.

Workplace example
You’re in a meeting discussing the marketing plan for the following year. The team comes up with some exciting, cutting-edge ideas that will use up the entire year’s budget by the end of Q2. But, if successful, the ideas will pay for themselves, with more to spare—consider it an investment with almost guaranteed rewards. Everyone leaves the meeting feeling excited and ready to get started.

How to avoid it
The easiest way to avoid optimism bias is to have a balanced, diverse team. While optimists might think of naysayers as pessimists, when combined, they form a reality-based team that will weigh all the options and make fewer risky decisions. No one likes thinking about what bad things may happen, but it’s important for a team to consider all outcomes before choosing a plan of action so that the team doesn’t take risks that won’t pay off.

Sunk cost fallacy

The sunk cost fallacy is our tendency to treat what we’ve already invested in a project as relevant information when assessing whether or not we should continue.

Workplace example
Your team has been building a new app that’s due to be released in a few months. You ask for feedback from multiple test groups, and they all tell you the app is poorly designed, difficult to use, and not particularly useful or valuable. What do you do: rush to redo the app in time for the release deadline, or go back to the team and scrap the project?

How to avoid it
You can avoid the sunk cost fallacy by rationally assessing whether or not a project is successful. Remember that the time, money, and effort already invested were the cost of learning from a failure. Decide your next steps without taking them into consideration.

IKEA effect

While the IKEA effect might seem chuckle-worthy and nothing more, it’s a real bias and can get in the way of your work. The IKEA effect is our tendency to overvalue something we’ve built ourselves. Just like that IKEA dresser from your college dorm room that you can’t part with, your sentimentality gets in your way.

Workplace example
You’ve got a favorite coffee cup. It’s perfectly shaped for your hand, it’s got a pithy phrase printed on the front, and it was given to you by your best friend. A co-worker borrows it from the office kitchen drying rack and subsequently breaks it. You’re never speaking to them again.

How to avoid it
While this might be a silly example, the IKEA effect comes into play for anyone who feels personally invested in something. If you hear someone calling a project “their baby,” be prepared for your criticisms, constructive or not, not to go over well.

A great strategy for counteracting the IKEA effect, according to Johnson, is to invite team members who weren’t involved in creating a product to evaluate it. When a project is going through its final adjustments before launch, you need to bring in people who are less emotionally invested to evaluate the data with clarity. If the management team encourages this type of feedback on all projects, it creates a bias-fighting environment for the entire company.

[Related read: Successfully build and manage a virtual customer service team]

Anchoring effect

According to Quigley, “our brain tries to reduce cognitive load and take the shortest route to an answer.” But the shortest route doesn’t always lead to the right answer. The anchoring effect is our tendency to rank the first information we hear as more important or relevant than anything we learn afterward. It puts blinders on our brains, keeping us from weighing later information as equally important.

Workplace example
As team manager, you’re holding a brainstorm session for a solution to a bug in your product. Each team member has brought a solution to present. You ask for volunteers to share. After a bunch of people share, you pick the solution that seems most likely to work and tell them to run with it. That solution just happens to have been presented by the first person who spoke.

How to avoid it
Whether we’re sharing ideas in a brainstorming session or just taking a wild guess at a solution to a problem, the first information we hear biases us against all other information. You can avoid the anchoring effect by encouraging a team dynamic where everyone’s solutions are questioned, even those passed down by knowledgeable managers and executives.

[Related read: Innovation requires everyone in the band]

Availability heuristic

Similar to the anchoring effect, the availability heuristic causes us to place a higher value on the ideas that come to mind first. In other words, we’re likely to “trust our gut” and go with the first, instinctual solution we come up with.

Workplace example
You’re the project manager for a team. Someone stops you while you’re walking down the hall to refill your coffee and asks you to offer a solution to a problem they’re facing. They give you a brief rundown, and you take a wild guess at a potential solution. Two months later, you’re sitting in a meeting and hear they’ve enacted your wild-guess solution to the tune of $50,000.

How to avoid it
While your guess might have been the right choice, it was just a guess. Even with prior experience and knowledge, our gut instinct is just the first thought that comes to mind. According to Johnson, with enough experience, this gut instinct can become more reliable, but we should still question our instinctual responses and compare them with the data at hand to make sure we’re correct.

Hindsight bias

Humans love to be right. I mean, who doesn’t want to feel the success of having predicted that an event was going to happen?

A classic, well-known experiment was conducted with college students around the confirmation of Clarence Thomas to the U.S. Supreme Court. Prior to the confirmation, students were asked whether or not they thought he would be confirmed; 58 percent responded that he would be.

Here comes the hindsight bias: After the confirmation, students were asked whether or not they had believed, prior to the confirmation, that he would be. 78 percent responded that they had believed he would be confirmed. The disparity between those two statistics is the hindsight bias.

Workplace example
You and your team have completed a massive project and the data is in: it was a complete success. Yay! Now, you have to present the post-mortem to the doom-and-gloom boss who was skeptical of this project from the beginning. As soon as he sees the numbers, he exclaims, “I always knew it would be a success!”

How to avoid it
While the hindsight bias may not harm a project, a manager who is always right, even when they’re not, can drive a team crazy. The way to avoid hindsight bias is to focus not on ego but on successes and learning opportunities. Quigley recommends going back to the facts and homing in on what worked and what didn’t. In the end, while it might hurt our ego, it doesn’t matter who was right or wrong in the beginning.

Dunning-Kruger effect

In one of the many great ironies of life, we all suffer from the Dunning-Kruger effect. This is the tendency of experts to underestimate their abilities and of those who are unskilled at a task to overestimate theirs. The irony: we have no clue which side of the effect we fall on at any given time.

Workplace example
Many managers are promoted because they’re good at their job, but this doesn’t mean they’re good managers. And yet, a promotion is a signal that you’re qualified, right? Not really. If you’ve been promoted to manager, seek out mentors and resources to make sure you’re doing everything you can to deserve the title.

How to avoid it
Johnson reminds me that, generally, intelligence in a topic scales with introspective ability. Meaning: the better you get at something, the better you can gauge your own ability. While there’s no foolproof way to avoid the Dunning-Kruger effect (seriously, you don’t know what you don’t know), you can strive to keep learning and swallow your ego to listen to constructive feedback from trusted sources.

Bias blind spot

As a small reminder of the fallibility of humans and our endless belief in ourselves, we end with the bias blind spot.

Emily Pronin, a social psychologist at Princeton University, conducted a study on biases. She tested subjects for biases and then presented them with data showing that they had acted on those biases. She explained each bias and its effects. The researchers then asked subjects how the bias affected their judgment. All of the subjects rated themselves as less susceptible to bias than others.

So, if you’ve been reading through this thinking these biases don’t apply to you, that’s your bias blind spot talking.
