The fourth and final installment in this winter’s four-part series on how cognitive bias impacts sales and sales organizations.
December 27, 2023
By Rachel Smith
Of all the cognitive biases we’ve covered, the one I’m about to tell you about seems the most sinister, the most human, the most meta, and the funniest—depending on the darkness of your sense of humor. I’m talking about the bias blind spot.
All along, our intent in covering cognitive biases has been two-fold. One reason is to understand how other people think and use that knowledge to your advantage in sales negotiations. Is your prospect going to balk at your price because she saw the price of your cheaper competitor first (anchoring bias)? How can you fight against people’s preference to stick with what they’ve got (status quo bias)?
The second reason for so many posts on cognitive biases is so you can recognize when you’re falling into one of their traps. Does everyone really agree with you, or are you just blocking out the naysayers (confirmation bias)? Do you truly think this sale will ever close, or have you spent so much time on it already that you can’t admit to yourself that it’s dead (sunk cost fallacy)?
Knowing what biases humans have can help us avoid them and make our best decisions. Except that we don’t. And we can’t. And that’s because of our bias blind spot.
The bias blind spot is our tendency to see ourselves as less susceptible to cognitive biases than others. In short, we are biased about being biased. In a 2015 study published in Management Science, only one person out of 661 said they thought they were more biased than the average person. One.
I’m not trying to say that how biased people are looks like a perfect bell curve, but, by definition, half of us have to be more biased than average. Yet none of us think we are. You may assume people with higher self-esteem or better decision-making abilities have a larger bias blind spot, but these traits, along with intelligence, cognitive ability, and other general personality traits, have little to do with the size of one’s bias blind spot.
If we don’t think we’re biased, we’re not going to be on the lookout for how cognitive biases could be distorting our decisions. We will be the salespeople confusing our prospects with too much information because we didn’t mitigate the curse of knowledge. Or worse, we’ll be the HR professionals who assume we won’t fall prey to the in-group bias and continue to hire others who think and look like we do. The larger our bias blind spot, the more likely we are to ignore advice from experts and the less likely we are to learn anything from debiasing training.
A 2022 study published in the International Journal of Bilingualism does shed some light on what could counteract the bias blind spot. Researchers found that bilingual individuals possessed a smaller bias blind spot when using their second language. So, if you grew up speaking English and then became fluent in French, and then you were asked (in French) to assess your susceptibility and that of others to a psychological bias, your bias blind spot is smaller than had you been asked the same question in English. Is your French-speaking self more aware of your shortcomings? In your head, are your English-speaking self and your French-speaking self arguing?
I’m not biased
Oh, monsieur, je ne suis pas d’accord (Oh, sir, I don’t agree)
Did you just call me biased?
Tu es aussi partial que n’importe qui d’autre (You are as biased as anyone else)
Do you want to step outside?!
Zut alors
The theory behind this intriguing result rests on the concept of metacognition: thinking about thinking. Biases are based on heuristics, or shortcuts, our brains use to expend less energy. In his book Thinking, Fast and Slow, Daniel Kahneman calls this kind of automatic thinking “System 1” thinking. “System 2” thinking, on the other hand, is slower, more methodical, and more calculating.
Metacognition falls under the category of System 2 thinking. The theory is that using a foreign language activates or accesses an individual’s metacognitive practices. Perhaps since second-language speakers are already using metacognition, they are more likely to stay in that mode when asked about something they would normally answer using lower-level thinking. So, problem solved. We all just need to learn a second language and speak it in any situation in which we think we may be impacted by cognitive bias.
Recognizing and not falling for our cognitive biases is like any other goal you have for yourself—you have to want to do the work. That’s what makes the bias blind spot so sinister: it leads you to believe you don’t need any work. This is why most debiasing interventions don’t work well or for long, and why they work least well on those with a larger bias blind spot.
Research presented at the 2022 Design Research Society Conference provides a possible insight from the world of user experience (UX) design. For UX designers, ignoring one’s biases can easily result in less effective design. Oana Bogdescu at Tilburg University in the Netherlands looked at whether implementation intention could serve as a way to diminish the bias blind spot.
An implementation intention is an if/then plan that specifies a situation and the desired behavior to be performed in that situation. In this case, it was, “If I need to evaluate myself, then I will consider how UX practitioners would evaluate me.” It sounds simple, but it does two things that could help decrease one’s bias blind spot.
First, thinking about how someone else would evaluate you urges you to look at bias in a way that self-evaluation doesn’t. The bias blind spot applies only to ourselves: we understand everyone else has these biases, but we think we do not. Someone else looking at your work would be looking for biases.
Second, by thinking about how someone else would think through an evaluation, you’re using metacognition. Getting out of System 1 thinking can help us avoid our bias blind spot, or so the theory goes. More research needs to be done, but the short-term results of this preliminary study indicate that this method of debiasing works.
Could implementation intentions work in fields besides UX, such as sales? Could we use implementation intentions like, “If I need to explain my product to a new prospect, then I will consider how my mother would best understand it” (curse of knowledge), or, “If a sale has not closed in six months, then I will look for at least five reasons it may never close” (sunk cost fallacy)?
What else could people do to force their brains into metacognitive thinking? Researchers need to get on this to help all of you poor people who don’t know that biases are distorting your decision-making. None of this applies to me, of course. I’m not biased.
Are you interested in learning more about the science behind how people make decisions? Reach out at mastery@maestrogroup.co for information on our live and self-paced training.