These 3 Cognitive Biases Will Kill Innovation
If you want to innovate consistently, you need to keep your cognitive biases in check
Remember New Coke, Coca-Cola’s new-and-improved Coke recipe launched in the mid-’80s? It seemed like a great idea at first: The new formula tested well among consumers and even had some initial success in the market.
Yet what the marketers at Coca-Cola missed is that many consumers had a strong emotional attachment to the old formula, which ended up creating a huge backlash. What started as a good idea — updating and rebranding a century-old drink — ended up backfiring because it addressed a problem that didn’t exist in the first place and ignored decades of ingrained brand loyalty.
When it comes to strategy and innovation, we always like to think we’ve done our homework and that we’ve based our ideas on only the most pertinent insights, but that’s often not the case. We tend to only see what we want to see and then protect our ideas by ignoring or explaining away facts that don’t fit that pattern. In particular, we need to learn to identify and avoid the three cognitive biases that kill innovation.
1. Availability bias
It’s easy to see where the marketers at Coca-Cola went wrong. They had done extensive market testing and the results came back positive. People consistently preferred the new Coke formula over the old one. The emotional ties that people had to the old formula, however, were harder to identify and track.
Psychologists call these types of errors “availability bias.” We tend to base our judgments on the information that is most easily available — such as market testing — and neglect other factors — such as brand attachment or a resistance to change. Often, the most important factors are the ones that you don’t see and therefore don’t figure into your decision making.
The way to limit availability bias is to push yourself to get uncomfortable facts in front of you. In his book Farsighted, Steven Johnson notes two techniques that can help avoid this problem. The first, called “pre-mortems,” asks you to look ahead to the future and imagine that your project has failed so you can figure out why, thereby avoiding these pitfalls ahead of time. The second, called “red teaming,” asks you to set up an independent group outside of your immediate team to help probe and identify holes in your idea.
Amazon’s innovation process is specifically set up to overcome availability bias. Project managers are required to write six-page memos at the start of every project, which include mock press releases anticipating both positive and negative reactions. Through a series of meetings, other stakeholders do their best to poke holes in the idea. None of this guarantees success, but Amazon’s track record is exceptionally good.
2. Confirmation bias
Availability bias isn’t the only way we come to believe things that aren’t true. People tend to lock onto the first information they see (called priming), which affects how they see subsequent data (framing). Sometimes, we receive bad or limited information from a seemingly trustworthy initial source and allow it to shape the entire rest of the project.
Once we come to believe something, we often look for information that confirms it and discount contrary evidence. We will also interpret new information differently according to our preexisting beliefs. When presented with a relatively ambiguous set of facts, we often interpret them as naturally supporting our position.
This dynamic plays out in groups as well. We tend to want to form an easy consensus with those around us, because dissent and conflict are uncomfortable. In one study that asked participants to solve a murder mystery, the more diverse teams came up with better answers, but reported doubt and discomfort. The more homogenous teams performed worse, but were more confident.
Imagine yourself sitting in a New Coke planning meeting. How much courage would it have taken to challenge the consensus view? How much confidence would you have in your dissent? What repercussions would you be willing to risk? We’d all like to think that we’d speak up, but would we?
3. The Semmelweis effect
In 1847, a young doctor named Ignaz Semmelweis had a major breakthrough. Working in a maternity ward, he discovered that a regimen of hand washing could dramatically lower the incidence of childbed fever. Unfortunately, instead of being lauded for his accomplishment, he was castigated and considered a quack. The germ theory of disease didn’t take hold until decades later.
The phenomenon is now known as the “Semmelweis effect” — a tendency for professionals in a particular field to reject new knowledge that contradicts established beliefs. The Semmelweis effect is, essentially, confirmation bias on a massive scale. It is simply very hard for people to discard ideas that they feel have served them well thus far.
However, look deeper into the Semmelweis story and you will find a second problem that was just as damaging. When the young doctor found that his discovery met with initial resistance, he railed against the establishment instead of collecting more evidence and presenting his data more clearly. Compare that to the story of Jim Allison, who pioneered cancer immunotherapy: At first, pharmaceutical companies refused to invest in Allison’s idea. Yet unlike Semmelweis, he kept working to gather more data and convince others that his idea could work. Allison ended up winning over the establishment — and the Nobel Prize in Physiology or Medicine.
We all have a tendency to reject those who reject our ideas. Truly great innovators like Jim Allison, however, just look at that as another problem to solve.
Don’t believe everything you think
When I’m in the late stages of writing a book, I always start sending out sections to be fact checked by experts and others who have first-person knowledge of events. In some cases, these are people I have interviewed extensively but, in others, sending out the fact checks is my first contact with them.
I’m always amazed how generous people are with their time, willing, in some cases, to go through material thoroughly just to help me get the story straight. Nevertheless, whenever something comes back wrong, I always feel defensive. I know I shouldn’t, but I do. When told that I’m wrong, I just have the urge to push back.
But I don’t. I fight that urge because I know how dangerous it is to believe everything you think, which is why I go to so much effort to send out the fact checks in the first place. The result is that, instead of publishing work riddled with errors and misinterpretations, I produce books that hold up even after being read thousands of times. I’d rather feel embarrassed in the safety of my office than in the public arena.
The truth is that our most fervently held beliefs can still be wrong. That’s why we need to make the effort to overcome these natural biases. Whether that is through a formal process like pre-mortems and red teams, or simply seeking out a fresh pair of eyes, we need to avoid believing everything we think. That’s much easier said than done, but if you want to innovate consistently, it will require keeping your cognitive biases in check.