
Book Review - Right Kind of Wrong: The Science of Failing Well



RIGHT KIND OF WRONG - PART ONE


In Part One of her book (Right Kind of Wrong: The Science of Failing Well), Dr. Edmondson introduces a framework of failure types. She begins by discussing key concepts in failure, then describes the three failure archetypes: intelligent (Ch. 2), basic (Ch. 3), and complex (Ch. 4).

What Is the Right Kind of Wrong?

"You might think that the right kind of wrong is simply the smallest possible failure. Big failures are bad, and small failures are good. But size is actually not how you will learn to distinguish failures, or how you will assess their value. Good failures are those that bring us valuable new information that simply could not have been gained any other way" (Edmondson, 2023, p. 16).


Dr. Edmondson defines failure as: an outcome that deviates from desired results; "a lack of success" (Edmondson, 2023, p. 17).


Bad Failure, Good Failure


Failure is not always bad [e.g., Edmondson's failure, as a PhD student, to support her research hypothesis guided her first study and "was the best thing that ever happened to [her] research career" (Edmondson, 2023, p. 17)].


Learning from failure is not as easy as it sounds due to our own beliefs about success. "Nonetheless, we can learn how to do it well. If we want to go beyond superficial lessons, we need to jettison a few outdated cultural beliefs and stereotypical notions of success. We need to accept ourselves as fallible human beings and take it from there" (Edmondson, 2023, p. 18).


3 Types of Failures

Basic failures ( = preventable failures, like the 1982 crash of Air Florida Flight 90, in which the pilot and co-pilot failed to turn on the anti-ice system, which was normally left off) [covered in chapter 3]. "Failing to stop to think that the to-them-unusual wintry conditions called for a departure from their routine—the deicing instruments should have been on—the crew triggered a devastating failure" (Edmondson, 2023, p. 91) that led to the loss of 78 lives.


These are the most easily understood and most preventable. Basic failures are caused by mistakes and slips, and "can be avoided with care and access to relevant knowledge" (Edmondson, 2023, p. 19).


Complex failures ( = "many little things" adding up to a large or small failure, like the 1967 disaster of the Torrey Canyon, Britain's largest oil spill) [covered in chapter 4]. Complex failures are "the real monsters that loom large in our work, lives, organizations, and societies" (Edmondson, 2023, p. 19). This is because complex failures have multiple causes and often include a bit of bad luck as well. Professor Edmondson explains that complex failures are on the rise due to "the increasingly complex information technology (IT) that underlies every aspect of life and work today" (Edmondson, 2023, p. 142). In addition, the "development of smart systems that communicate independently gave rise to an infinite variety of potential breakdowns. This interdependence is a breeding ground for complex failure" (Edmondson, 2023, p. 143).


Unfortunate breakdowns "will always be with us due to the inherent uncertainty and interdependence we face in our day-to-day lives. This is why catching small problems before they spiral out of control to cause a more substantial complex failure becomes a crucial capability in the modern world" (Edmondson, 2023, p. 19).

Intelligent failures ( = "'good failures' that are necessary for progress" [Edmondson, 2023, p. 19]) [covered in chapter 2]. Think about the small and large discoveries (after many failed attempts) that further our knowledge and practice in medicine, science, and technology. It's important that we learn from our intelligent failures rather than fear, deny, or feel bad about them.


According to Dr. Edmondson (in a podcast episode, [Ignatius, 2023]): "A well-run clinical trial on a new cancer drug is an intelligent failure when it turns out it doesn’t have the efficacy that we hoped. It was in new territory. There was no other way to find out but to do a clinical trial. It’s the right size, it’s no bigger than it has to be. It’s hypothesis-driven in pursuit of a goal." For example, Eli Lilly's chemotherapy drug, Alimta, failed to establish "efficacy" in treating patients' cancer. However, the doctor who ran the drug trial wanted to learn as much as possible from the failure. He discovered that some patients did benefit from the drug, and that the ones who failed to benefit had a folic acid deficiency! As a result, he added folic acid supplements to the drug in subsequent clinical trials, leading to significant improvements in efficacy and making Alimta a top seller with sales of almost $2.5 billion a year (Edmondson, 2023).

RIGHT KIND OF WRONG - PART TWO

In Part Two of the book, Professor Edmondson presents her "latest thinking on self-awareness, situation awareness, and system awareness—and how these capabilities intersect with the three types of failure" (p. 19). She takes us deeper "into tactics and habits that allow people to practice the science of failing well at work and in their lives" (Edmondson, 2023, p. 19).

In Chapter 5, she takes us on an exploration of self-awareness and its key role in the science of failure. She writes that "our human capacity for sustained self-reflection, humility, honesty, and curiosity propels us to seek out patterns that provide insight into our behavior" (Edmondson, 2023, pp. 19-20). There's a helpful "Table 5.1: Cognitive Habits for Responding to Failures" on p. 194.

In Chapter 6, she takes readers into situation awareness—and learning how to read a given situation for its failure potential. We get "a sense of what situations present an accident waiting to happen so as to help prevent unnecessary failure" (Edmondson, 2023, p. 20). There's a really handy "Figure 6.2: The Failure Landscape" on p. 223.

In Chapter 7, she talks about system awareness. She writes: "We live in a world of complex systems where our actions trigger unintended consequences. But learning to see and appreciate systems—say, family, organization, nature, or politics—helps us prevent a lot of failures" (Edmondson, 2023, p. 20).

In Chapter 8, Dr. Edmondson pulls it all together to help readers answer the question of "how to thrive as a fallible human being" (p. 20). As humans, we are all fallible. "The question is whether, and how, we use this fact to craft a fulfilling life full of never-ending learning" (Edmondson, 2023, p. 20).

When Dr. Edmondson was doing her research study in the early 1990s (as a PhD student) on whether better teamwork led to fewer errors in the hospital, the data she collected suggested that better teams had higher, not lower, error rates. This was the opposite of what she had predicted (Edmondson, 2023).

"Most of us feel ashamed of our failures. We're more likely to hide them than to learn from them. Just because mistakes happen in organizations doesn't mean learning and improvement follow" (Edmondson, 2023, p. 3). "But most of us fail to learn the valuable lessons failures can offer. We put off the hard work of reflecting on what we did wrong" (Edmondson, 2023, p. 5).

Dr. Edmondson advises us to reframe how we understand failure (e.g., how Olympic bronze medalists view their result as a success [earning a medal] vs. how silver-medaling counterparts view their results as a failure [disappointed at being so close, but not earning gold]) — on both a personal and cultural level — and learn to recognize the crucial distinctions that separate good failure from bad failure.

FAILING WELL IS HARD

"Failing well is hard for three reasons: aversion, confusion, and fear. Aversion refers to an instinctive emotional response to failure. Confusion arises when we lack access to a simple, practical framework for distinguishing failure types. Fear comes from the social stigma of failure" (Edmondson, 2023, p. 25).

Although we rationally know that failure can't be avoided in life, it's still hard to handle. Part of the reason is that, as human beings, we process negative and positive information differently (Edmondson, 2023). We take in bad information "more readily" than we do good information.

"In sum, our aversion to failure, confusion about failure types, and fear of rejection combine to make practicing the science of failing well more difficult than it needs to be" (Edmondson, 2023, p. 40).

BASIC FAILURES

Sometimes, our "basic failures" can turn into incredible (and profitable) opportunities. Take the story of how the famed "Lee Kum Kee" oyster sauce was accidentally invented.

"Lee Kum Sheung, a twenty-six-year-old chef at a small restaurant serving cooked oysters in Guangdong, a coastal province in south China, did not intend to vary the preparation that fateful day in 1888. Lee mistakenly left a pot of oysters to simmer too long, only to come back to a sticky brown mess. Tasting the result, he discovered that it was delicious! It did not take him long to decide to make his "oyster sauce" on purpose, selling it in jars under the Lee Kum Kee brand. Eventually his "brilliant mistake" would make Lee and his heirs extremely wealthy. When Lee's grandson died in 2021, the family was worth more than $17 billion. Even if most basic failures don't yield valuable new products, many of today's favorite foods, including potato chips and chocolate chip cookies, were discovered by accident" (Edmondson, 2023, p. 122).


"Errors will always be with us. Often, they're harmless. Other times they cause basic failures that range from a funny story to tell friends (a dented bumper) to a devastating loss of life (the Kansas City Hyatt Regency Hotel collapse). All of us confront daily opportunities to disrupt the causal chain linking error to failure. What makes basic failure hard to prevent is our instinctive aversion to error, especially our own. But by befriending error so we can catch, report, and correct it, consequential failures can be avoided" (Edmondson, 2023, p. 122).


COMPLEX FAILURES


Dr. Edmondson says we cannot prevent all complex failures because so many contributing factors combine to create the perfect storm. However, there are a few simple strategies (framing, amplifying, practicing) we can follow that can help prevent major complex failures.

  • Framing: Explicitly emphasizing the complexity or novelty of a situation.

  • Amplifying: Amplify weak or quiet signals; make sure a signal is heard.

  • Practicing: Rehearse and be as prepared as possible to respond to problems when they arise; catching & correcting errors requires practice. It's impossible to create contingency plans for every failure. However, "it is possible to build the emotional and behavioral muscles that allow us to respond to human error and unexpected events alike with speed and grace" (Edmondson, 2023, p. 163).

INTELLIGENT FAILURES


On p. 64, in Table 2.1 "How to Tell If a Failure Is Intelligent," Dr. Edmondson helps readers understand when a failure is an "intelligent failure." Ask these questions to see if the failure qualifies as intelligent: Does it take place in a new territory? Does it present a credible opportunity to advance toward a desired goal? Is it informed by prior knowledge? Is it as small as possible? Blanding (2023) wrote a nice, short article covering the four factors that characterize intelligent failure.


AVERSION TO FAILURE


"Numerous studies show that we process negative and positive information differently. You might say we're saddled with a 'negativity bias.' We take in 'bad' information, including small mistakes and failures, more readily than 'good' information. We have more trouble letting go of bad compared to good thoughts. We remember the negative things that happen to us more vividly and for longer than we do the positive ones. We pay more attention to negative than positive feedback. People interpret negative facial expressions more quickly than positive ones. Bad, simply put, is stronger than good. This is not to say we agree with or value it more but rather that we notice it more" (Edmondson, 2023, pp. 26-27).

It's human nature to not want to lose or fail. "The pain of failing . . . is more emotionally salient than the pleasure of succeeding" (Edmondson, 2023, p. 27).

SCIENCE OF FAILING WELL ISN'T FUN, BUT CAN BRING DISCOVERY


"The science of failing well, like any other science, is not always fun. It brings good days and bad. It's practiced by fallible human beings working alone and together. But one thing is certain. It will bring discovery. Discoveries about what works and what doesn't work in achieving the goals that matter to you, along with discoveries about yourself. Elite failure practitioners around the world and throughout history—athletes, inventors, entrepreneurs, scientists—have taught me a great deal about the unique combination of curiosity, rationality, honesty, determination, and passion that failing well requires. Their example nudges and inspires me to try to keep improving my own skills and habits, and I hope it will do the same for you" (Edmondson, 2023, p. 292).

Dr. Edmondson also shares how a study at NASA contributed to improvements in the safety of passenger air travel today.

"A team of researchers at NASA, led by human-factors expert H. Clayton Foushee, ran an experiment to test the effects of fatigue on error rates. They had twenty two-person teams; ten were assigned to the "postduty" or "fatigue" condition. These teams "flew" in the simulator as if it were the last segment of a three-day stint in the short-haul airline operations where they worked. The fatigued teams had already flown three eight- to ten-hour daily shifts. Those shifts included at least five takeoffs and landings, sometimes up to eight. The other ten teams (the "pre-duty," well-rested condition) flew in the simulator after at least two days off duty. For them, the simulator was like their first segment in a three-day shift" (Edmondson, 2023, p. 8).

"To his surprise, Foushee discovered that the teams who'd just logged several days flying together (the fatigued teams) performed better than the well-rested teams. As expected, the fatigued individuals made more errors than their well-rested counterparts, but because they had spent time working together through multiple flights, they'd made fewer errors as teams. Apparently, they were able to work well together, catching and correcting one another's errors throughout the flight, avoiding serious mishaps. The fatigued pilots had essentially turned themselves into good teams after working together for a couple of days. In contrast, the well-rested pilots, unfamiliar with one another, didn't work as well as teams.

"This surprise finding about the importance of teamwork in the cockpit helped fuel a revolution in passenger air travel called crew resource management (CRM), which is partly responsible for the extraordinary safety of passenger air travel today. This impressive work is one of many examples of what I call the science of failing well" (Edmondson, 2023, pp. 8-9).

PSYCHOLOGICAL SAFETY

"Psychological safety plays a powerful role in the science of failing well. It allows people to ask for help when they're in over their heads, which helps eliminate preventable failures. It helps them report — and hence catch and correct — errors to avoid worse outcomes, and it makes it possible to experiment in thoughtful ways to generate new discoveries" (Edmondson, 2023, p. 15).

"[Y]our perception of whether it's safe to speak up at work is unrelated to whether you're an extrovert or an introvert. Instead, it's shaped by how people around you react to things that you and others say and do" (Edmondson, 2023, p. 16).

"When a group is higher in psychological safety, it's likely to be more innovative, do higher-quality work, and enjoy better performance, compared to a group that is low in psychological safety. One of the most important reasons for these different outcomes is that people in psychologically safe teams can admit their mistakes. These are teams where candor is expected. It's not always fun, and certainly it's not always comfortable, to work in such a team because of the difficult conversations you will sometimes experience. Psychological safety in a team is virtually synonymous with a learning environment in a team. Everyone makes mistakes (we are all fallible), but not everyone is in a group where people feel comfortable speaking up about them. And it's hard for teams to learn and perform well without psychological safety" (Edmondson, 2023, p. 16).

ADVICE FOR MASTERING THE SCIENCE OF FAILING WELL

So, what's professor Edmondson's advice for how we can better master the science of failing well? First, she says we all have to make peace with the idea and reality that we're fallible creatures. We make mistakes and will make mistakes. "We need to accept ourselves as fallible human beings and take it from there" (Edmondson, 2023, p. 18).

Second, she urges us to be willing to apologize for our failures (Edmondson, 2023) and forgive ourselves and others for the mistakes and missteps we make (Thoman, 2023). "With fallibility comes failure, and with failure comes an opportunity to apologize" (Edmondson, 2023, p. 280).


Third, she recommends that we become humble & curious. "Failing well, perhaps even living well, requires us to become vigorously humble and curious—a state that does not come naturally to adults" (Edmondson, 2023, p. 169).

"I think there's much more joy and much more adventure, and yes more failure, if you can reinvigorate your own spirit of curiosity and use it to drive you forward. That's the real fuel, I think, in the science of failing well" (Thoman, 2023).

Summary: Right Kind of Wrong: The Science of Failing Well by Amy Edmondson is an exceptional book about failure, learning, and life! Failure is inherent in being human, and, as such, we can neither escape nor avoid it. Instead, Professor Edmondson has given us an incredibly useful and practical tool to help us adopt a fail-well mentality and overcome the barriers that make failing well so hard. Right Kind of Wrong teaches us — through memorable stories — how to start learning from failure, and, above all, how to understand that we don't have to fear our failures but can, instead, (1) learn to prevent basic "preventable failures," (2) catch small problems before they spiral out of control to cause a substantial complex failure, and (3) embrace the intelligent "good failures" that are necessary for progress. Highly recommended!

Written By: Steve Nguyen, Ph.D.

Organizational & Leadership Development Leader


References

Blanding, M. (2023, September 5). Failing Well: How Your ‘Intelligent Failure’ Unlocks Your Full Potential. HBS Working Knowledge. https://hbswk.hbs.edu/item/failing-well-1-when-failure-is-intelligent

Edmondson, A. (2023). Right Kind of Wrong: The Science of Failing Well. Atria Books.


Ignatius, A. (Host). (2023, July 28). It's OK to Fail, but You Have to Do It Right [Video episode]. In The New World of Work. https://hbr.org/2023/07/its-ok-to-fail-but-you-have-to-do-it-right


Thoman, L. (Host). (2023, September 5). The Science of Failure – Right Kind of Wrong with Harvard Business School’s Amy Edmondson (No. 161) [Audio podcast episode]. In 3 Takeaways. https://www.3takeaways.com/episodes/harvard-hbs-amy-edmondson
