When the students arrived for the debate, they discovered that their sparring partner was not a peer but a law student. What they didn’t know was that the law student was in cahoots with the research team: his task was to spend eighteen minutes launching an aggressive assault on their worldviews. Murray called it a “stressful interpersonal disputation,” having directed the law student to make the participants angry and anxious with a “mode of attack” that was “vehement, sweeping, and personally abusive.” The poor students sweated and shouted as they struggled to defend their ideals.
The pain didn’t stop there. In the weeks that followed, the students were invited back to the lab to discuss the films of their own interactions. They watched themselves grimacing and stringing together incoherent sentences. All in all, they spent about eight hours reliving those humiliating eighteen minutes. A quarter century later, when the participants reflected on the experience, it was clear that many had found it agonizing. Drill described feeling “unabating rage.” Locust recalled his bewilderment, anger, chagrin, and discomfort. “They have deceived me, telling me there was going to be a discussion, when in fact there was an attack,” he wrote. “How could they have done this to me; what is the point of this?”
Other participants had a strikingly different response: they actually seemed to get a kick out of being forced to rethink their beliefs. “Some may have found the experience mildly discomforting, in that their cherished (and in my case, at least, sophomoric) philosophies were challenged in an aggressive manner,” one participant remembers. “But it was hardly an experience that would blight one for a week, let alone a life.” Another described the whole series of events as “highly agreeable.” A third went so far as to call it “fun.”
Ever since I first read about the participants who reacted enthusiastically, I’ve been fascinated by what made them tick. How did they manage to enjoy the experience of having their beliefs eviscerated—and how can the rest of us learn to do the same?
Since the records of the study are still sealed and the vast majority of the participants haven’t revealed their identities, I did the next best thing: I went searching for people like them. I found a Nobel Prize–winning scientist and two of the world’s top election forecasters. They aren’t just comfortable being wrong; they actually seem to be thrilled by it. I think they can teach us something about how to be more graceful and accepting in moments when we discover that our beliefs might not be true. The goal is not to be wrong more often. It’s to recognize that we’re all wrong more often than we’d like to admit, and the more we deny it, the deeper the hole we dig for ourselves.
THE DICTATOR POLICING YOUR THOUGHTS
When our son was five, he was excited to learn that his uncle was expecting a child. My wife and I both predicted a boy, and so did our son. A few weeks later, we found out the baby would be a girl. When we broke the news to our son, he burst into tears. “Why are you crying?” I asked. “Is it because you were hoping your new cousin would be a boy?”
“No!” he shouted, pounding his fists on the floor. “Because we were wrong!”
I explained that being wrong isn’t always a bad thing. It can be a sign that we’ve learned something new—and that discovery itself can be a delight.
This realization didn’t come naturally to me. Growing up, I was determined to be right. In second grade I corrected my teacher for misspelling the word lightning as lightening. When trading baseball cards I would rattle off statistics from recent games as proof that the price guide was valuing players inaccurately. My friends found this annoying and started calling me Mr. Facts. It got so bad that one day my best friend announced that he wouldn’t talk to me until I admitted I was wrong. It was the beginning of my journey to become more accepting of my own fallibility.
In a classic paper, sociologist Murray Davis argued that when ideas survive, it’s not because they’re true—it’s because they’re interesting. What makes an idea interesting is that it challenges our weakly held opinions. Did you know that the moon might originally have formed inside a vaporous Earth out of magma rain? That a narwhal’s tusk is actually a tooth? When an idea or assumption doesn’t matter deeply to us, we’re often excited to question it. The natural sequence of emotions is surprise (“Really?”) followed by curiosity (“Tell me more!”) and thrill (“Whoa!”). To paraphrase a line attributed to Isaac Asimov, great discoveries often begin not with “Eureka!” but with “That’s funny . . .”
When a core belief is questioned, though, we tend to shut down rather than open up. It’s as if there’s a miniature dictator living inside our heads, controlling the flow of facts to our minds, much like Kim Jong-un controls the press in North Korea. The technical term for this in psychology is the totalitarian ego, and its job is to keep out threatening information.
It’s easy to see how an inner dictator comes in handy when someone attacks our character or intelligence. Those kinds of personal affronts threaten to shatter aspects of our identities that are important to us and might be difficult to change. The totalitarian ego steps in like a bodyguard for our minds, protecting our self-image by feeding us comforting lies. They’re all just jealous. You’re really, really, ridiculously good-looking. You’re on the verge of inventing the next Pet Rock. As physicist Richard Feynman quipped, “You must not fool yourself—and you are the easiest person to fool.”
Our inner dictator also likes to take charge when our deeply held opinions are threatened. In the Harvard study of attacking students’ worldviews, the participant who had the strongest negative reaction was code-named Lawful. He came from a blue-collar background and was unusually precocious, having started college at sixteen and joined the study at seventeen. One of his beliefs was that technology was harming civilization, and he became hostile when his views were questioned. Lawful went on to become an academic, and when he penned his magnum opus, it was clear that he hadn’t changed his mind. His concerns about technology had only intensified:
The Industrial Revolution and its consequences have been a disaster for the human race. They have greatly increased the life-expectancy of those of us who live in “advanced” countries, but they have destabilized society, have made life unfulfilling, have subjected human beings to indignities . . . to physical suffering as well . . . and have inflicted severe damage on the natural world.
That kind of conviction is a common response to threats. Neuroscientists find that when our core beliefs are challenged, it can trigger the amygdala, the primitive “lizard brain” that breezes right past cool rationality and activates a hot fight-or-flight response. The anger and fear are visceral: it feels as if we’ve been punched in the mind. The totalitarian ego comes to the rescue with mental armor. We become preachers or prosecutors striving to convert or condemn the unenlightened. “Presented with someone else’s argument, we’re quite adept at spotting the weaknesses,” journalist Elizabeth Kolbert writes, but “the positions we’re blind about are our own.”
I find this odd, because we weren’t born with our opinions. Unlike our height or raw intelligence, we have full control over what we believe is true. We choose our views, and we can choose to rethink them any time we want. This should be a familiar task, because we have a lifetime of evidence that we’re wrong on a regular basis. I was sure I’d finish a draft of this chapter by Friday. I was certain the cereal with the toucan on the box was Fruit Loops, but I just noticed the box says Froot Loops. I was sure I put the milk back in the fridge last night, but strangely it’s sitting on the counter this morning.
The inner dictator manages to prevail by activating an overconfidence cycle. First, our wrong opinions are shielded in filter bubbles, where we feel pride when we see only information that supports our convictions. Then our beliefs are sealed in echo chambers, where we hear only from people who intensify and validate them. Although the resulting fortress can appear impenetrable, there’s a growing community of experts who are determined to break through.
ATTACHMENT ISSUES
Not long ago I gave a speech at a conference about my research on givers, takers, and matchers. I was studying whether generous, selfish, or fair people were more productive in jobs like sales and engineering. One of the attendees was Daniel Kahneman, the Nobel Prize–winning psychologist who has spent much of his career demonstrating how flawed our intuitions are. He told me afterward that he was surprised by my finding that givers had higher rates of failure than takers and matchers—but higher rates of success, too.
When you read a study that surprises you, how do you react? Many people would get defensive, searching for flaws in the study’s design or the statistical analysis. Danny did the opposite. His eyes lit up, and a huge grin appeared on his face. “That was wonderful,” he said. “I was wrong.”