Irving Fisher was brilliant. No less a figure than Milton Friedman called Fisher “the greatest economist the United States has ever produced.” His models are still used by modern economists. But, brilliant or not, the rest of us have something in common with Irving Fisher. We can be wrong – really, really wrong. Even with his magnificent legacy, many recall Fisher for his one big blooper: his October 1929 prediction that stock prices had “reached what looks like a permanently high plateau.”
Irving Fisher isn’t the only brilliant mind to get it wrong. Consider:
- Winston Churchill is often cited as one of the great leaders of the 20th century for his unwavering leadership during World War II. He was right about many things, but not atomic energy: in 1939, Sir Winston declared that atomic energy was unlikely to produce anything more dangerous than “present day” weapons.
- For those of a certain age, the Ed Sullivan Show was American Idol and Dancing With The Stars combined. Everyone tuned in to discover the emerging talent it put on stage every Sunday night. The booking agent who launched many careers was not impressed by one act after its first appearance, declaring that the group would not last a year. Today, we’d say The Beatles exceeded his expectations.
The smart people who made these pronouncements could not imagine they were wrong. To be fair, history shows they were right about most things. Yet, like the rest of us, they got a few things terribly wrong. We don’t know exactly why in each case. What we can do is understand the thought patterns that kept them, and keep us, stuck in conclusions that crowd out new possibilities. One powerful pattern is the comfort of current ideas, which produces the drive to defend what we think instead of examining it for accuracy.
The Fortress of the Comfort Zone
Our comfort zones are our mental warm beds on cold mornings; we hate to leave them. They are so reassuring that we strengthen them with thinking patterns that minimize the need to venture out onto the cold floors of uncertainty. Social scientists trace this reluctance to cognitive dissonance: the discomfort of holding two or more conflicting ideas. We react to this conflict by resolving it one way or the other, then climb back under the warm covers by clinging to our established conclusions.
Our tendency to hold onto decisions that resolve dissonance produces one of the most researched cognitive biases: congeniality, or confirmation, bias. Once we reach a conclusion, we select subsequent data that confirms it and dismiss disconfirming data. In a meta-analysis of 21 research studies, William Hart, Dolores Albarracín, and colleagues (2009) found that people are almost twice as likely to select information that supports their pre-existing attitudes, beliefs, and conclusions. This defense motivation was stronger for reversible than for irreversible decisions, and stronger for recent decisions.
In other words, Hart, Albarracín, and colleagues documented the same patterns you may observe when your extended holiday gatherings turn to controversial topics. Many of us don’t like to be uncertain or to revisit decisions. To strengthen the fortress around our comfortable ideas, we debate with selective facts that support our views, even when there is little real risk in changing our minds.
Implications for Innovation
Defense motivation and congeniality bias do more than make our family gatherings more memorable. In Harvard Business Review (December 2011), Paul Leonardi writes about the effect early prototypes may have in harming creativity. He makes the case that while prototypes bring an abstract concept to life (resolving dissonance), they may also produce “innovation blindness”: defending and improving the prototype starts, and brainstorming on different solutions stops.
I have no way to know whether the smart people who made the famously bad calls listed above fell into this trap, but it’s easy to imagine that defending their current view was much easier than imagining a different future. I am certain that years from now someone will critique the bad calls of our day, perhaps because we could not imagine things differently, either.
Open minds have more opportunities for great moments.
Your first conclusion doesn’t have to be your last conclusion. Don’t underestimate the comfort of resolution after deliberation. Your first conclusion may remain your best option, but be open to other possibilities. Follow the principle attributed to John Maynard Keynes: “When the facts change, I change my mind.”
Befriend a Devil’s Advocate. Have at least one confidant comfortable enough to challenge your assumptions and conclusions. As one of my mentors recently advised: “I only challenge to gain clarity.” The right conclusion can withstand the debate; a weak one will be improved because of it.
Intentionally seek out information contrary to your opinion. Once settled into an opinion, seek out others who disagree. Read editorials that differ from your view; listen to interviews with people you dislike. It’s not necessary to change your mind, only to recognize that others have legitimate reasons to see the situation differently. This practice can help you imagine what else you might see differently.
Even brilliant people make bad calls. Some of these come from the self-interest of hanging onto the comfort of a conclusion after struggling through dissonance. With open minds and an ongoing willingness to question, we can improve our chances of being right in the long run.
Hart, W., Albarracín, D., et al. (2009). Feeling Validated Versus Being Correct: A Meta-Analysis of Selective Exposure to Information. Psychological Bulletin, 135(4), 555–588.
Leonardi, P. (2011). Early Prototypes Can Hurt a Team’s Productivity. Harvard Business Review, December 2011, p. 28.