
Over a thousand comments poured in, and I was pleasantly surprised that many reacted enthusiastically to the complexified message. Some mentioned that nothing is either/or and that data can help us reexamine even our closely held beliefs. Others were downright hostile. They turned a blind eye to the evidence and insisted that emotional intelligence was the sine qua non of success. It was as if they belonged to an emotional intelligence cult.

From time to time I’ve run into idea cults—groups that stir up a batch of oversimplified intellectual Kool-Aid and recruit followers to serve it widely. They preach the merits of their pet concept and prosecute anyone who calls for nuance or complexity. In the area of health, idea cults defend detox diets and cleanses long after they’ve been exposed as snake oil. In education, there are idea cults around learning styles—the notion that instruction should be tailored to each student’s preference for learning through auditory, visual, or kinesthetic modes. Some teachers are determined to tailor their instruction accordingly despite decades of evidence that although students might enjoy listening, reading, or doing, they don’t actually learn better that way. In psychology, I’ve inadvertently offended members of idea cults when I’ve shared evidence that meditation isn’t the only way to prevent stress or promote mindfulness; that when it comes to reliability and validity, the Myers-Briggs personality tool falls somewhere between a horoscope and a heart monitor; and that being more authentic can sometimes make us less successful. If you find yourself saying ____ is always good or ____ is never bad, you may be a member of an idea cult. Appreciating complexity reminds us that no behavior is always effective and that all cures have unintended consequences.

[Cartoon omitted; source: xkcd.com]


In the moral philosophy of John Rawls, the veil of ignorance asks us to judge the justice of a society by whether we’d join it without knowing our place in it. I think the scientist’s veil of ignorance is to ask whether we’d accept the results of a study based on the methods involved, without knowing what the conclusion will be.


MIXED FEELINGS

In polarized discussions, a common piece of advice is to take the other side’s perspective. In theory, putting ourselves in another person’s shoes enables us to walk in lockstep with them. In practice, though, it’s not that simple.

In a pair of experiments, randomly assigning people to reflect on the intentions and interests of their political opposites made them less receptive to rethinking their own attitudes on health care and universal basic income. Across twenty-five experiments, imagining other people’s perspectives failed to elicit more accurate insights—and occasionally made participants more confident in their own inaccurate judgments. Perspective-taking consistently fails because we’re terrible mind readers. We’re just guessing.

If we don’t understand someone, we can’t have a eureka moment by imagining his perspective. Polls show that Democrats underestimate the number of Republicans who recognize the prevalence of racism and sexism—and Republicans underestimate the number of Democrats who are proud to be Americans and oppose open borders. The greater the distance between us and an adversary, the more likely we are to oversimplify their actual motives and invent explanations that stray far from their reality. What works is not perspective-taking but perspective-seeking: actually talking to people to gain insight into the nuances of their views. That’s what good scientists do: instead of drawing conclusions about people based on minimal clues, they test their hypotheses by striking up conversations.

For a long time, I believed that the best way to make those conversations less polarizing was to leave emotions out of them. If only we could keep our feelings off the table, we’d all be more open to rethinking. Then I read evidence that complicated my thinking.

It turns out that even if we disagree strongly with someone on a social issue, when we discover that she cares deeply about the issue, we trust her more. We might still dislike her, but we see her passion for a principle as a sign of integrity. We reject the belief but grow to respect the person behind it.

It can help to make that respect explicit at the start of a conversation. In one experiment, if an ideological opponent merely opened by saying, “I have a lot of respect for people like you who stand by their principles,” people were less likely to see her as an adversary—and showed her more generosity.

When Peter Coleman brings people together in his Difficult Conversations Lab, he plays them the recording of their discussions afterward. What he wants to learn is how they were feeling, moment by moment, as they listen back to themselves. After studying over five hundred of these conversations, he found that the unproductive ones feature a more limited set of both positive and negative emotions, as illustrated below in the image on the left. People get trapped in emotional simplicity, with one or two dominant feelings.

As you can see with the duo on the right, the productive conversations cover a much more varied spectrum of emotions. They’re not less emotional—they’re more emotionally complex. At one point, people might be angry about the other person’s views, but by the next minute they’re curious to learn more. Soon they could be shifting into anxiety and then excitement about considering a new perspective. Sometimes they even stumble into the joy of being wrong.

In a productive conversation, people treat their feelings as a rough draft. Like art, emotions are works in progress. It rarely serves us well to frame our first sketch. As we gain perspective, we revise what we feel. Sometimes we even start over from scratch.

What stands in the way of rethinking isn’t the expression of emotion; it’s a restricted range of emotion. So how do we infuse our charged conversations with greater emotional variety—and thereby greater potential for mutual understanding and rethinking?

It helps to remember that we can fall victim to binary bias with emotions, not only with issues. Just as the spectrum of beliefs on charged topics is much more complex than two extremes, our emotions are often more mixed than we realize.* If you come across evidence that you might be wrong about the best path to gun safety, you can simultaneously feel upset by and intrigued with what you’ve learned. If you feel wronged by someone with a different set of beliefs, you can be simultaneously angry about your past interactions and hopeful about a future relationship. If someone says your actions haven’t lived up to your antiracist rhetoric, you can experience both defensiveness (I’m a good person!) and remorse (I could’ve done a lot more).

In the spring of 2020, a Black man named Christian Cooper was bird-watching in Central Park when a white woman walked by with her dog. He respectfully asked her to put the dog on a leash, as the nearby signs required. When she refused, he stayed calm and started filming her on his phone. She responded by informing him that she was going to call the police and “tell them there’s an African American man threatening my life.” She went on to do exactly that with a 911 operator.

When the video of the encounter went viral, the continuum of emotional reactions on social media rightfully spanned from moral outrage to sheer rage. The incident called to mind a painful history of false criminal accusations made against Black men by white women, which often ended with devastating consequences. It was appalling that the woman didn’t leash her dog—and her prejudice.

“I’m not a racist. I did not mean to harm that man in any way,” the woman declared in her public apology. “I think I was just scared.” Her simple explanation overlooks the complex emotions that fueled her actions. She could have stopped to ask why she had been afraid—what views about Black men had led her to feel threatened in a polite conversation? She could have paused to consider why she had felt entitled to lie to the police—what power dynamics had made her feel this was acceptable?

Her simple denial overlooks the complex reality that racism is a function of our actions, not merely our intentions. As historian Ibram X. Kendi writes, “Racist and antiracist are not fixed identities. We can be a racist one minute and an antiracist the next.” Humans, like polarizing issues, rarely come in binaries.

When asked whether he accepted her apology, Christian Cooper refused to make a simple judgment, offering a nuanced assessment:


I think her apology is sincere. I’m not sure if in that apology she recognizes that while she may not be or consider herself a racist, that particular act was definitely racist. . . .

Granted, it was a stressful situation, a sudden situation, maybe a moment of spectacularly poor judgment, but she went there. . . .

Is she a racist? I can’t answer that—only she can answer that . . . going forward with how she conducts herself, and how she chooses to reflect on the situation and examine it.
