
The space station chief engineer, Chris Hansen, led the eventual investigation into what had gone wrong with Luca’s suit. “The occurrence of minor amounts of water in the helmet was normalized,” Chris told me. In the space station community, the “perception was that drink bags leak, which led to an acceptance that it was a likely explanation without digging deeper into it.”

Luca’s scare wasn’t the first time that NASA’s failure at rethinking had proven disastrous. In 1986, the space shuttle Challenger exploded after a catastrophically shallow analysis of the risk that circular gaskets called O-rings could fail. Although this had been identified as a launch constraint, NASA had a track record of overriding it in prior missions without any problems occurring. On an unusually cold launch day, the O-ring sealing the rocket booster joints ruptured, allowing hot gas to burn through the fuel tank, killing all seven Challenger astronauts.

In 2003, the space shuttle Columbia disintegrated under similar circumstances. After takeoff, the team on the ground noticed that some foam had fallen from the ship, but most of them assumed it wasn’t a major issue since it had happened in past missions without incident. They failed to rethink that assumption and instead started discussing what repairs would be done to the ship to reduce the turnaround time for the next mission. The foam loss was, in fact, a critical issue: the damage it caused to the wing’s leading edge let hot gas leak into the shuttle’s wing upon reentry into the atmosphere. Once again, all seven astronauts lost their lives.

Rethinking is not just an individual skill. It’s a collective capability, and it depends heavily on an organization’s culture. NASA had long been a prime example of a performance culture: excellence of execution was the paramount value. Although NASA accomplished extraordinary things, they soon became victims of overconfidence cycles. As people took pride in their standard operating procedures, gained conviction in their routines, and saw their decisions validated through their results, they missed opportunities for rethinking.

Rethinking is more likely to happen in a learning culture, where growth is the core value and rethinking cycles are routine. In learning cultures, the norm is for people to know what they don’t know, doubt their existing practices, and stay curious about new routines to try out. Evidence shows that in learning cultures, organizations innovate more and make fewer mistakes. After studying and advising change initiatives at NASA and the Gates Foundation, I’ve learned that learning cultures thrive under a particular combination of psychological safety and accountability.


I ERR, THEREFORE I LEARN

Years ago, an engineer turned management professor named Amy Edmondson became interested in preventing medical errors. She went into a hospital and surveyed its staff about the degree of psychological safety they experienced in their teams—could they take risks without the fear of being punished? Then she collected data on the number of medical errors each team made, tracking serious outcomes like potentially fatal doses of the wrong medication. She was surprised to find that the more psychological safety a team felt, the higher its error rates.

It appeared that psychological safety could breed complacency. When trust runs deep in a team, people might not feel the need to question their colleagues or double-check their own work.

But Edmondson soon recognized a major limitation of the data: the errors were all self-reported. To get an unbiased measure of mistakes, she sent a covert observer into the units. When she analyzed those data, the results flipped: psychologically safe teams reported more errors, but they actually made fewer errors. By freely admitting their mistakes, they were then able to learn what had caused them and eliminate them moving forward. In psychologically unsafe teams, people hid their mishaps to avoid penalties, which made it difficult for anyone to diagnose the root causes and prevent future problems. They kept repeating the same mistakes.

Since then, research on psychological safety has flourished. When I was involved in a study at Google to identify the factors that distinguish teams with high performance and well-being, the most important differentiator wasn’t who was on the team or even how meaningful their work was. What mattered most was psychological safety.

Over the past few years, psychological safety has become a buzzword in many workplaces. Although leaders might understand its significance, they often misunderstand exactly what it is and how to create it. Edmondson is quick to point out that psychological safety is not a matter of relaxing standards, making people comfortable, being nice and agreeable, or giving unconditional praise. It’s fostering a climate of respect, trust, and openness in which people can raise concerns and suggestions without fear of reprisal. It’s the foundation of a learning culture.

In performance cultures, the emphasis on results often undermines psychological safety. When we see people get punished for failures and mistakes, we become worried about proving our competence and protecting our careers. We learn to engage in self-limiting behavior, biting our tongues rather than voicing questions and concerns. Sometimes that’s due to power distance: we’re afraid of challenging the big boss at the top. The pressure to conform to authority is real, and those who dare to deviate run the risk of backlash. In performance cultures, we also censor ourselves in the presence of experts who seem to know all the answers—especially if we lack confidence in our own expertise.

A lack of psychological safety was a persistent problem at NASA. Before the Challenger launch, some engineers did raise red flags but were silenced by managers; others were ignored and ended up silencing themselves. After the Columbia launch, an engineer asked for clearer photographs to inspect the damage to the wing, but managers didn’t supply them. In a critical meeting to evaluate the condition of the shuttle after takeoff, the engineer didn’t speak up.

About a month before that Columbia launch, Ellen Ochoa became the deputy director of flight crew operations. In 1993, Ellen had made history by becoming the first Latina in space. Now, the first flight she supported in a management role had ended in tragedy. After breaking the news to the space station crew and consoling the family members of the fallen astronauts, she was determined to figure out how she could personally help to prevent this kind of disaster from ever happening again.

Ellen recognized that at NASA, the performance culture was eroding psychological safety. “People pride themselves on their engineering expertise and excellence,” she told me. “They fear their expertise will be questioned in a way that’s embarrassing to them. It’s that basic fear of looking like a fool, asking questions that people just dismiss, or being told you don’t know what you’re talking about.” To combat that problem and nudge the culture toward learning, she started carrying a 3 × 5 note card in her pocket with questions to ask about every launch and important operational decision. Her list included:


What leads you to that assumption? Why do you think it is correct? What might happen if it's wrong?

What are the uncertainties in your analysis?

I understand the advantages of your recommendation. What are the disadvantages?


A decade later, though, the same lessons about rethinking would have to be relearned in the context of spacewalk suits. As flight controllers first became aware of the droplets of water in Luca Parmitano’s helmet, they made two faulty assumptions: the cause was the drink bag, and the effect was inconsequential. It wasn’t until the second spacewalk, when Luca was in actual danger, that they started to question whether those assumptions were wrong.

When engineer Chris Hansen took over as the manager of the extravehicular activity office, he inaugurated a norm of posing questions like Ellen’s: “All anybody would’ve had to ask is, ‘How do you know the drink bag leaked?’ The answer would’ve been, ‘Because somebody told us.’ That response would’ve set off red flags. It would’ve taken ten minutes to check, but nobody asked. It was the same for Columbia. Boeing came in and said, ‘This foam, we think we know what it did.’ If somebody had asked how they knew, nobody could’ve answered that question.”

How do you know? It’s a question we need to ask more often, both of ourselves and of others. The power lies in its frankness. It’s nonjudgmental—a straightforward expression of doubt and curiosity that doesn’t put people on the defensive. Ellen Ochoa wasn’t afraid to ask that question, but she was an astronaut with a doctorate in engineering, serving in a senior leadership role. For too many people in too many workplaces, the question feels like a bridge too far. Creating psychological safety is easier said than done, so I set out to learn about how leaders can establish it.


SAFE AT HOME GATES

