1-Page Summary

In Black Box Thinking, Matthew Syed argues that learning from failure drives human progress. It’s how individuals and organizations develop, innovate, and improve. Airlines exemplify this: They use black boxes—in-flight recording devices—to collect data, analyze crashes, and learn how to improve.

However, we still fail to learn from our mistakes in multiple key areas—from health care to criminal justice—and this endangers lives and resources. Overcoming the cultural and systemic barriers to learning from failure can prevent mistakes and create more effective organizations.

Syed is an award-winning journalist, author, and former Olympian for Great Britain’s table tennis team. His sporting career inspired the focus of his writing, which includes four books on mindset, performance, growth, and creativity. Syed also consults with organizations to build cultures and systems that enable them to learn from failure.

In this guide, we explain Syed’s argument that failure drives learning and explore two types of institutions: Those that learn from failure—including aviation, athletics, and science—and those that don’t—such as public hospitals and the court system. We then discuss how to shift mindsets and systems to turn an organization into a learning-oriented institution.

We Learn Through Failure

Syed argues that we can only improve by learning from our failures. This is because your mistakes reveal what you don’t yet understand, thus showing what you need to learn next. For example, revising an essay often reveals vagueness or unclear logic in the first draft, and those mistakes tell the writer how to improve it.

On the flip side, neglecting to learn from mistakes means that you can’t improve. Imagine a gymnast who gets flustered by mistakes instead of assessing how to avoid them—such an athlete would keep repeating the same errors.

On the institutional level, organizations that learn from failures iron out systemic flaws and improve their performance. For example, lean startups prioritize learning from what goes wrong, adapting from customer feedback and product failures. By contrast, organizations that ignore their mistakes will continue to make them, risking stagnation.

(Shortform note: In The Art of Learning, Josh Waitzkin argues that learning from your mistakes is the key to competitive success. When you make an error, note where you went wrong and shore up that weakness by training yourself not to repeat it. He also recommends actively seeking out opportunities to fail badly and repeatedly, because this humbles you and helps you approach learning with a beginner’s mind. In a way, this is what learning organizations do: They embrace that what they do know is less than what they don’t know.)

Two Institutional Styles

In modern society, some of our institutions progress by learning from failure, but certain major institutions neglect this opportunity and hinder progress.

1. Learning-Oriented Institutions: These organizations—including the airline industry, athletic organizations, science, and some businesses—use their failures to improve their operations and innovate solutions to tough problems. For example, a successful football team holds post-game retrospectives to review what went well and what didn’t, and they use that insight to improve their game plan.

(Shortform note: In The Fifth Discipline, Peter Senge outlines five disciplines to achieve a “Learning Organization”: 1) Create a shared vision to align leadership and staff, 2) Use systems thinking to understand your organization holistically, 3) Clarify your existing mental models and stay open to adapting them, 4) Create teams that learn together and embrace honest mistakes, and 5) Help employees develop personal mastery by training the subconscious mind to handle complexity.)

2. Failure-Averse Institutions: These organizations neglect to learn from their failure, and thereby miss out on opportunities to improve. Their failure-averse cultures preclude the acknowledgment and assessment of mistakes, and their systems lack mechanisms to process and learn from those mistakes. As examples, Syed highlights health care and the courts.

(Shortform note: In Too Much and Never Enough, Mary Trump, the niece of Donald Trump, characterizes his administration as failure-averse. Through analysis of Trump family history and her own experiences, she argues that because he’s never faced consequences for his mistakes or wrongdoings, he can’t conceive that he could be wrong about anything. Any institution with a leader who’s so apt to cast blame becomes failure-averse, since errors go unlearned from.)

Learning-Oriented Institutions Grow Through Failure

Syed explains that learning-oriented organizations have two things in common: Cultures that promote learning from failure and systems that make use of those mistakes.

Element #1: The Culture of Learning-Oriented Institutions

Syed explains that learning-oriented organizations view failure as an opportunity to grow. Individuals and teams feel comfortable making mistakes, reporting them, and learning from them. This mindset exists on two levels: the individual, and the organizational.

On the individual level, you need the desire to improve—to work hard, fail, and try again—plus the tenacity to persist through numerous mistakes and the humility to admit and learn from them. Think of a professional musician: A cellist spends thousands of hours honing her craft precisely by making mistakes, adjusting, and pressing onward.

(Shortform note: In Tribal Leadership, Dave Logan and coauthors argue that in large corporate workplaces, employees lose motivation when they feel like cogs in a machine. To build a strong organizational culture, Logan recommends helping each individual develop confidence in their abilities. To do this, point out his strengths and give him tasks that stretch them just a bit. By completing these tasks, he’ll better develop self-esteem and may seek additional responsibilities.)

At the organizational level, you need an open and fair culture that incentivizes learning from failure. When people trust that mistakes will be treated as opportunities to learn rather than grounds for punishment, they’ll readily report and examine them. The cellist doesn’t fear messing up in practice because professional musicians all understand that mistakes are part of learning. Her culture embraces errors; everyone knows skill is built on a mountain of them.

(Shortform note: Syed’s description of an effective organizational culture evokes the Stage 4 cultural stage in Tribal Leadership. Dave Logan explains that at Stage 4, teams focus on values, vision, and alignment more than top-down management and control. The tribe judges others’ actions according to those values, allowing for experimentation and failure so long as the mistakes are made in pursuit of their shared vision. This encourages productive mistakes—those which further the tribe’s goals.)

Element #2: The Systems of Learning-Oriented Institutions

With a failure-embracing culture in place, you need a system that makes use of failure. Syed argues that this system is an iterative method that he calls black box thinking. Here’s how it works:

1. Take action. The organization acts from its best knowledge and gathers data constantly. Measuring their operations gives analysts data to learn from when failures happen. In aviation, each plane is equipped with two black boxes. One records details of the plane’s positioning, trajectory, and functions (landing gear, lights, and so on), and the second records human data, such as conversation in the cockpit.

2. Analyze failures. When an error occurs, the organization uses their data to look for the cause of the error—reflecting on what went wrong and why. For example, airlines use independent investigators who examine data from the black boxes and analyze the circumstances of the crash. When they know what went wrong, they give that information to the airline.

3. Formulate fixes. Next, the organization designs a practical, scalable solution to the problems identified. Then, Syed says, they test these solutions to iron out any flaws before implementing them at scale. For example, after identifying that pilots struggled to stay focused on long flights, airlines tested simple programs to redistribute the workload between the pilot and co-pilot. Once they’d smoothed out any flaws, they implemented the programs more broadly.

4. Repeat. Finally, these organizations integrate what they’ve learned into their systems. Each time they do this, they accumulate a better understanding of what doesn’t work and what does. Then, they return to step one and keep learning.

Act Fast and Learn Constantly

In The Obstacle Is The Way, Ryan Holiday outlines a similar method that emphasizes frequent, aggressive action.

Holiday’s approach embodies the scrappy, experimental mindset that’s critical for succeeding with trial-and-error learning. Without that consistent drive to improve, it’s easy to fall into a rut and stagnate—so aim to develop this mindset alongside these learning methods.

Failure-Averse Organizations Neglect to Learn

In contrast to learning-oriented organizations, failure-averse organizations have cultures and systems that prevent them from learning.

Element #1: The Cultural Barrier

Syed explains that culturally, failure-averse institutions stigmatize failure. They treat error as profoundly negative and harshly judge those who make mistakes. This is because the employees are well-educated professionals—such as doctors, nurses, lawyers, and judges—who expect excellence from themselves and their peers.

As a result, any error calls your professional worth into question on two levels: the personal and the social.

On the personal level, mistakes threaten your self-esteem. If a high-powered surgeon takes the blame for a patient’s death, that failure could crush his self-image as a consummate professional and burden him with guilt.

(Shortform note: In Tribal Leadership, Dave Logan explains that this issue is common in Stage 3 cultures, where people often exhibit self-focused behaviors. Doctors, he says, often exemplify the Stage 3 “shark”—an accomplished, top-of-the-class individual who’s used to pursuing their own success. Such individuals use ego-focused language (“I, me, my”), which helps explain why a doctor might cover up mistakes: When everything revolves around the self, patient concerns come second.)

On the social level, mistakes damage your reputation. In a culture of expertise, any error calls your competency into question. A large mistake can strip away your credibility overnight—for example, the above doctor might become known for letting that patient die.

(Shortform note: Dave Logan explains in Tribal Leadership that at Stage 3, professionals tend to view colleagues as less competent than them, and they are more interested in controlling the flow of information through one-on-one relationships. Since knowledge is power in a competitive environment, this helps them maintain their image and keep others from knowing too much about them. Unfortunately, in health care, this reputation management can come at the expense of the patient.)

Three Problematic Patterns

Syed explains how these cultures are tough on those who make mistakes, believing that harsh punishment enhances performance. In reality, unjust punishment causes a number of other problems, from scapegoating to cover-ups.

Pattern #1: We scapegoat. When something goes wrong, we instinctively blame someone close to the failure without properly examining the situation. This doesn’t help: Scapegoating oversimplifies the story and exacerbates future problems by creating a climate of fear. Since nobody wants to end up scapegoated, mistakes go unreported, unexamined, and unlearned from.

(Shortform note: Cancel culture, the internet boycotting of individuals deemed problematic by public opinion, shares characteristics with scapegoating. It sometimes removes genuinely harmful individuals from power, but it doesn’t address the systems that put them in power, and the mob mentality can ruin the lives of much less powerful people. To engage in effective discourse, try following Rule Omega: Assume that someone you disagree with has at least a partially valid perspective, and try to empathize and understand before you disagree or fight.)

Pattern #2: We oversimplify the story. Syed explains that when we gloss over the full complexity of mistakes, the “narrative fallacy” is at work. In short, we make sense of complex situations by telling simple, appealing stories. Simple stories help us feel like we understand, but they’re usually wrong. We emphasize the major “story beats,” but we discount the details—thus failing to wrestle with the full complexity of the failure.

(Shortform note: In Antifragile, Nassim Nicholas Taleb explains that thought leaders often engage (knowingly or unknowingly) in “narrative discipline,” or the process of creating stories that sound great and seem to fit the data, but which are false or incomplete. This involves cherry-picking data from statistical studies to fit the narrative, since statistics can seem objective and convincing without contextualization. To avoid this, Taleb suggests running controlled experiments, which can’t be cherry-picked as easily.)

Pattern #3: We commit cognitive distortions. When we face complex situations, cognitive dissonance—the unconscious mental distortion of facts—often comes into play. As Syed explains, nobody likes to be wrong—it’s a threat to our egos. So to defend our beliefs, our brains distort information to fit them. And the more invested we are, the worse the distortions. We’re all prone to three types of distortion:

  1. We reframe the evidence. We accept a piece of evidence but tell a story to conform it to our beliefs.
  2. We invent justifications. We accept the evidence but find a way to excuse or rationalize it.
  3. We ignore the facts outright. Sometimes, cognitive dissonance causes point-blank denial of the facts—especially if they’re too threatening to your sense of self.

(Shortform note: Cognitive dissonance occurs automatically, but we can recognize the following three signs: 1) Feeling discomfort prior to making an important decision may indicate conflicting beliefs, a form of dissonance. 2) Excessive justification or qualification of a decision can indicate that you know it wasn’t quite right. 3) Doing something while hiding it from others, like smoking in a private place, indicates dissonance between your desire to improve and your existing habit.)

Element #2: The Systemic Barrier

Syed explains that unlike learning-oriented organizations, failure-averse organizations don’t gather data, conduct investigations, analyze what went wrong, or implement changes. This poses two problems:

Problem #1: Without feedback, you can’t improve. These organizations lack a feedback loop that enables information to flow from failure to analysis to adjustment to implementation. Since feedback tells you what’s going wrong, an organization can’t improve without it.

Imagine running a marketing campaign but gathering no data about who clicked on your ads. You’d have no clue which demographics they appealed to, and thus no idea of how to better target future campaigns.

(Shortform note: In The Art of Learning, Josh Waitzkin suggests examining the theme of your errors. In short, gather data by noting down the mistakes you repeatedly make. Then, examine your notes and identify any patterns. Look for both the emotional and technical aspects of your error, and work to correct them. For example, a parent who gets impatient with their toddler might work to change their mindset while also studying better childcare techniques. To maintain the feedback loop, continue to note and correct your patterns of error.)

Problem #2: Without feedback, the system stagnates. Syed argues that people can only perform as well as the system enables them to. If an organization’s systems aren’t designed to learn from failures, they won’t get better and will also prevent employees from improving.

For example, if a hospital doesn’t gather information about whether its triage system works, it might be constantly overburdened with no idea of how to improve. Without feedback, the employees are stuck within a limited system.

(Shortform note: In his 3-2-1 newsletter, James Clear explains that we don’t “rise to the level of our goals,” we “fall to the level of our systems.” He defines your personal system as the network of habits you’ve built, and we can extend this to organizations: An organization’s system is the network of habits and habit-enabling tools (like error reporting mechanisms) that all its people engage in. If part of the system is to report errors and openly admit fault, any new employees will default to that behavior, “falling” to the level of the system they’ve stepped into.)

Failure-Averse Institutions Waste Lives and Resources

In the bigger picture, the institutional failure to improve is morally irresponsible, according to Syed. First, a stagnant institution is a missed opportunity to develop better systems that support life and improve our world. Each time a learning opportunity goes unexploited, progress slows.

(Shortform note: Jean-Jacques Rousseau, an 18th-century Enlightenment thinker, argued that if civilization progresses solely through scientific rationality, our morals will degrade. In his Discourse on the Sciences and Arts, Rousseau points to ancient Athens, where the flourishing sciences and arts led to a society focused on leisure, luxury, and decadence. Advanced scientific knowledge, he says, isn’t intrinsically moral, so if we neglect virtue in favor of scientific progress, we’ll end up with a materially advanced but morally bankrupt society.)

Second, mistakes can have life-and-death consequences. Consider the health care system: Preventable errors in the US medical system cause hundreds of thousands of deaths annually—making them the third-leading cause of death in the US.

(Shortform note: To expand on Syed’s data, The Peterson Center Health System Tracker reveals that, while preventable errors remain problematic, many causes of death have declined: Mortality rates for infant care, respiratory diseases, diseases of the circulatory system, and cancers have all fallen since the 1980s.)

Failing isn’t the problem in and of itself. Rather, the problem is neglecting to learn from those failures. When organizations fail to report and learn from their mistakes, they perpetuate patterns of failure that put lives at risk.

In contrast to courts and hospitals, the airline industry has learned from the hundreds of lives lost in historical crashes. They analyze the failures, extract takeaways, and implement changes. Because of this, flying is now incredibly safe.

(Shortform note: In The Art of Learning, tai chi world champion Josh Waitzkin argues that we improve only as quickly as we learn from our mistakes. He suggests that if you never repeated a mistake—learning from it after one instance—you’d grow incredibly fast. While this is difficult in practice, Waitzkin advises that setting aside your ego can help. If we’re too focused on appearing competent or we’re embarrassed to admit mistakes, we’ll never learn.)

How to Create a Learning-Oriented Organization

Syed refers to psychologist Carol Dweck’s ideas of “growth mindset” versus “fixed mindset,” arguing that how we view failure determines whether we learn from it.

Syed argues that having a fixed mindset—believing that ability is set in stone—correlates with fragile self-esteem. Because failure could reveal your inadequacies, it feels threatening, so you’ll do everything you can to avoid it for fear of looking foolish or incapable.

In contrast, having a growth mindset—believing that ability is malleable—means you understand that errors are intrinsic to learning. You’ll regard difficulties, like a challenging work project or a struggling relationship, as chances to learn, expecting failure and treating it as a tool.

(Shortform note: In Grit, Angela Duckworth argues that innate talent doesn’t predict success. Instead, “grit”—a combination of resilience and direction—better determines who sticks it out until they reach success, and who drops out early. Like growth mindset, grit depends on the belief that we can learn from our mistakes. The stronger your belief, the grittier you can get.)

Mindset Determines Organizational Culture

As with individuals, an organization can exhibit a growth mindset or a fixed mindset, and that affects its culture. Crucially, an organization’s systems can’t change until its culture does. Since mindset shapes culture, we first need to change our mindsets.

An organization-level fixed mindset creates a culture of fear and blame. Since learning isn’t part of the company’s “DNA,” few mistakes come to light and people generally fear errors.

An organization-level growth mindset creates an open, collaborative culture. Since the founders and employees all understand that mistakes are intrinsic to learning, they’re comfortable erring, sharing those errors, and learning from them.

(Shortform note: In Tribal Leadership, Dave Logan argues that the best way to upgrade an organization is by upgrading its tribes. Each tribe within an organization exhibits a cultural stage, and the lower the stage, the less effective the tribe. To upgrade your organization is to upgrade your tribe’s cultures. At a higher stage, the mindset is, “work is better when we work together,” a teams-focused attitude that complements Syed’s focus on establishing an organizational growth mindset.)

Handle Mistakes Fairly

To create a culture that learns from failures, you need to handle mistakes justly. This builds trust in the higher-ups—the people who decide what happens when mistakes occur—because they’re neither too lax nor too harsh in their responses.

Fair leadership prevents an environment of fear by thoroughly investigating any error. When genuine mistakes occur, like forgetting to file a report properly, the person can learn from it. When genuine negligence occurs, like a surgeon coming into work on cocaine, leadership must mete out an appropriate punishment. This helps employees feel comfortable making mistakes, while ensuring that workers remain diligent and honest.

Make Difficult Decisions That Serve the Tribe

In Tribal Leadership, Dave Logan and coauthors suggest that a great leader must understand how to put the tribe first under difficult circumstances. Acting from the tribe’s core values and shared vision, she might need to put big egos in their place, part ways with employees who no longer align with those values, or shut down stagnating projects.

The key is to listen thoroughly to the tribe, much like Syed suggests. By paying attention to the range of perspectives people bring to the table and noticing whether they’re acting in the tribe’s best interests or their own, a leader learns to “sniff out” self-interested individuals who could threaten the tribe’s success. This way, she maintains the tribe’s trust by showing that she has its best interest at heart, even when she needs to make tough choices.

Create a Learning System

After changing an organization’s culture, the final step is to change its systems. Syed offers the following three ideas:

1. Create better feedback mechanisms: Since many organizations lack a built-in error signal, a key step is to create that feedback mechanism. For example, local governments could hold weekly feedback sessions with community members to learn how they’re doing on a week-to-week basis.

(Shortform note: An easy way to gather feedback is to record video of what you’re doing. Josh Waitzkin writes in The Art of Learning that he recorded his tai chi practice sessions to identify mistakes and refine his form. Similarly, imagine recording video of a community meeting, then holding a retrospective with key players to analyze how things went and what you could do better.)

2. Use pilot programs: Syed suggests running pilot programs—small-scale tests of policy changes or business ideas that let you gather feedback safely and inexpensively. Set up pilots that mimic real-world conditions, and avoid priming them to succeed. By staffing them with a mix of experienced and inexperienced employees, you’ll get a sense of how an actual team would perform under your program.

(Shortform note: In 2019, the US Congress passed the Foundations for Evidence-Based Policymaking Act, which stipulates that lawmakers and federal agencies must begin developing statistical evidence to support policies. While evidence-based pilots were once thought to be too expensive or time-consuming, the rise of agile methods in software engineering and in the broader business world has shifted that attitude. Now, numerous federal agencies, from the FDA to the Department of Labor, have begun running pilots, much like Syed suggests.)

3. Use “pre-mortems” to strategize: Before launching projects, Syed suggests running a “pre-mortem,” which improves outcomes by helping you discover the flaws in a plan before implementing it. Ask your team to imagine that your plan has failed completely. Then, work backward by asking what specifically went wrong, and adjust the plan accordingly.

(Shortform note: In The Obstacle Is The Way, Ryan Holiday suggests that pre-mortems are also a useful tool for personal growth. Hardships, he says, are inevitable. By expecting that bad things will happen sooner or later, we better prepare ourselves for them. And by actively anticipating what could go wrong (such as with your workday, a date, or a difficult conversation) you improve your chance of success. Specifically, he suggests thinking up as many possibilities for failure as you can, and then creating a contingency plan for each.)

Shortform Introduction

In Black Box Thinking, Matthew Syed argues that learning from failure drives human progress. It’s how we adapt to an ever-changing world, how we innovate in science and business, and how individuals and organizations improve. While modern society benefits from this tradition of rational learning (inherited from the ancient Greeks and the European Enlightenment), Syed explains that we still need to apply rational, failure-driven learning to our social institutions.

Namely, our politics, courts, and hospitals stigmatize failure and perpetuate a culture of false exceptionalism that prevents them from improving. If we neglect to improve our stagnating institutions, they’ll continue to take the lives of innocents—whether it’s death due to preventable medical error or wrongful conviction that leads to life in prison.

Syed explains how certain organizations have distinct cultures and systems that promote learning, while others’ cultures and systems prevent it. Fortunately, any organization can change its view of failure to transform its culture and systems, thus becoming a learning organization.

About the Author

Matthew Syed is a British journalist and author, as well as a former table tennis player. As the number one English player through the 1990s, he represented Great Britain in the 1992 (Barcelona) and 2000 (Sydney) Olympic Games.

In an interview with F1 Champion Nico Rosberg, Syed discusses how his loss of a key Olympic match directly sparked his desire to examine mindset and performance psychology. His background as an athlete informs the examples in Black Box Thinking, from F1 Racing to soccer and cycling.

Since the end of his sporting career in 2000, Syed has written for The Times, a London-based newspaper, and authored six books. Today, he also commentates for the BBC and Eurosport, mainly on sports, culture, and politics. He works with businesses to develop organization-wide growth mindsets—a topic he covers at length in Black Box Thinking—through Matthew Syed Consulting.

Syed has received numerous awards for his work.


The Book’s Publication

Black Box Thinking was published in 2015 by Penguin Random House. This was Syed’s second book, after his debut Bounce—an international best seller on the role of talent and practice in success. Syed has since written two adult nonfiction books, The Greatest (2017) and Rebel Ideas (2019), and two children’s books—You Are Awesome (2018) and Dare to Be You (2020)—to extend his ideas about learning and growth to a younger audience.

The Book’s Context

Historical and Intellectual Context

In Black Box Thinking, Syed draws from the work of Francis Bacon, a 17th-century English philosopher who helped revive the scientific tradition begun in classical Greece. Syed situates his argument as a continuation of this historical discourse on rationality, firmly supporting an empirical approach to learning and growth.

He also references Karl Popper, an influential 20th-century philosopher of science who argued for an empirical approach to science and opposed inductive reasoning (inferring general laws from observed instances), holding instead that knowledge advances when theories survive attempts to falsify them. This perspective runs throughout Black Box Thinking, where Syed argues for an empirical, data-driven approach to learning from failure.

In short, Syed thoroughly supports the view that reason à la the European Enlightenment is humanity’s way forward. Steven Pinker, author of Enlightenment Now and Rationality, is perhaps the most vocal public intellectual espousing this view—that reason drives human progress, peace, and knowledge.

Black Box Thinking also overlaps with well-known books such as Grit (Angela Duckworth), Mindset (Carol Dweck), and Outliers (Malcolm Gladwell), with their shared themes of resilience, trial-and-error learning, and mindset-based success.

Finally, Syed introduces a nuance to learning from our mistakes—that neglecting to learn from failure is tantamount to moral irresponsibility. Specifically, he argues that it hinders the development of more effective social institutions, and that society must adapt through failure to keep pace with technological change.

The Book’s Impact

While Black Box Thinking didn’t reach the same heights as his previous book Bounce (an international bestseller), it led directly into Syed’s subsequent projects: In an interview, Syed explains that parents reached out to praise Black Box Thinking and express their concern that their children needed to hear his message. In response, Syed wrote two children’s books titled You Are Awesome and Dare to Be You to bring his ideas to the next generation.

The Book’s Strengths and Weaknesses

Critical Reception

Online reviewers praise Syed’s writing style as captivating, with well-told and broad-ranging stories to illustrate the ideas. Some call Black Box Thinking clear and concise, and they explain how its message has positively impacted their own relationship to failure.

Reviewers who disliked the book complain that the main argument comes in the first two chapters, and the rest is repetitive. Others find Syed’s thesis obvious, saying that the book offers nothing new on the “why” of learning from failure, nor enough detail on the “how.” Some readers also complain that Syed’s examples include irrelevant details, causing the book to become unnecessarily fluffy.

Commentary on the Book’s Approach

Syed builds his argument narrative-first: Each chapter features anecdotes that illustrate the main ideas, followed by a discussion of the principles. Many of these effectively demonstrate the argument, and Syed grabs the reader’s emotions with high-stakes tales of real-world failures and learning (or lack thereof).

Black Box Thinking contains a substantive argument for the importance of learning from failure as individuals and organizations. Despite this, Syed often repeats the main point and the anecdotes begin to feel repetitive, adding only marginal value.

Toward the end, Syed takes on a mildly polemical tone, exhorting the reader about the gravity of the issue. Given his solid reasoning about our lack of empirical improvement systems in many areas of society, this helps drive his argument home. On the other hand, the book closes with a collection of actionables that feel tacked on rather than well developed.

Commentary on the Book’s Organization

Black Box Thinking unfolds in six parts of two to three chapters each. Each part broadly discusses a component of Syed’s argument, ranging from the logic of failure (Part 1) to putting the ideas into action (Part 6).

Within each part, the structure is somewhat unclear. Each chapter features three to five sections that loosely treat distinct ideas, but the prose repeatedly circles back to the main point: Learn from your failures. Because of this repetition, it’s difficult at times to identify and order the sub-points of the argument. In other words, the over-emphasis on Syed’s main point obscures the nuances of his argument.

Since each major part covers different aspects of the argument, it’s easy to dip back into the book to revisit various sections. This helps offset the difficulty of parsing Syed’s argument amidst the sea of anecdotes.

Our Approach in This Guide

We’ve reorganized Black Box Thinking’s six parts into four parts, clarifying and distilling the key arguments. In Part 1, we introduce the book’s inquiry—how and why to learn from failure, as individuals and organizations—and explain the historical context of learning from failure. In Part 2, we discuss organizations that learn from failure and how they do it. Conversely, in Part 3, we discuss organizations that don’t learn from failure and the dangers that such neglect presents. Finally, Part 4 explains the changes that help us embrace failure-based learning as individuals and organizations.

Here’s how our guide maps onto the book’s chapters:

Given Syed’s focus on organizational development, we’ve buttressed his perspective with ideas from The Fifth Discipline by Peter Senge, which discusses how companies can use problem solving and systems thinking to become “learning organizations.” We’ve also brought in actionables from Tribal Leadership, by Dave Logan et al., which discusses how to improve your organization by upgrading its culture, dovetailing with Syed’s argument that an organization’s culture must change before its systems can.

Part 1: Failure and Human Development

Black Box Thinking is about learning from failure. Syed argues that learning from failure drives the growth of large organizations, underpins innovation in business, and helps us become resilient, growth-oriented agents. Airlines exemplify this mindset and method, using “black boxes” to collect data, analyze crashes, and learn from them.

However, we still fail to learn from our mistakes in multiple key areas—from health care to criminal justice—and this endangers lives and resources.

Matthew Syed is an award-winning journalist, author, and former Olympian for Great Britain’s table tennis team. He’s worked as a journalist for over two decades, and he’s written four books focused on mindset, performance, and growth. Syed also works as a consultant to help organizations build growth-focused cultures and foster continual improvement.

In this guide, we’ll explain Black Box Thinking in four parts. Part 1 introduces Syed’s argument that success comes from failure and explains the history of humanity’s attitudes toward failure. Part 2 explores the institutions that learn from failure—how, why, and the benefits they reap—while Part 3 discusses institutions that neglect to learn from failure and the problems this causes. To end, Part 4 explains how to embrace failure as positive, and how to develop a learning organization.

We’ve clarified and expanded Syed’s ideas with perspectives from The Fifth Discipline: The Art and Practice of the Learning Organization, by Peter Senge, and from Tribal Leadership, by Dave Logan et al.—both of which treat organizational development in detail.

We Develop Through Failure

The core of Syed’s argument is that, as individuals, we can only improve by learning from our failures. On the flip side, neglecting to learn from mistakes means that you can’t improve. For example, a dancer who takes every mistake as a chance to grow will get better, while a dancer who ignores or denies her errors will remain static.

This also applies on the institutional level: Organizations that learn from failures iron out systemic flaws and improve their performance. Organizations that ignore their mistakes will continue to make them, risking stagnation.

(Shortform note: In The Art of Learning, Josh Waitzkin argues that learning from your mistakes is the key to competitive success. When you make an error, note where you went wrong and shore up that weakness by training yourself not to repeat it. He also recommends actively seeking out opportunities to fail badly and repeatedly, because this humbles you and helps you approach learning with a beginner’s mind. In a way, this is what learning organizations do: They embrace the fact that what they know is far less than what they don’t.)

From here, Syed argues that we need to spread failure-based learning throughout modern society. Neglecting to learn wastes lives and resources, such as the tens of thousands of annual deaths from preventable errors in hospitals and the thousands of wrongful convictions in the court system. (Shortform note: To clarify, Syed bases his argument around the American health care and court systems, though he also cites some examples from the UK’s hospitals.)

Progress—our ability to develop our knowledge, our societies, or our technology—is a hallmark of human civilization. We develop by learning from failure, but so long as such major institutions neglect this opportunity, we won’t progress as smoothly.

(Shortform note: Steven Pinker, a Canadian-American psycholinguist, argues in Enlightenment Now that reason is the primary driver of humanity’s progress. He asserts that reason, science, and humanism have yielded longer life spans, better food production, greater wealth, more peace, democracy, individual rights, and more. While Syed doesn’t explicitly define “progress,” Pinker suggests that we should aim to continuously improve human flourishing, in “life, health, happiness... richness of experience.”)

The History: Humanity and Development

To contextualize Syed’s argument, we’ll explain how human cultures have related to failure and learning throughout history. (Shortform note: While Syed presents this history in a “coda” after the main book, we’ve chosen to present it first—without the history, it’s less clear why failure-based learning matters. Note that this perspective emphasizes Western history.)

Syed traces the development of human cultures through three broad phases:

  1. Defending “divine knowledge” (ancient past to ~600 BC)
  2. The arrival of rational thought (~600 BC to ~300 BC)
  3. The church’s battle against rational thought (~300 BC to ~1600 AD)

Let’s look at each phase more closely.

Phase #1: Defending “Divine Knowledge” (Ancient Past to ~600 BC)

Human cultures have always had worldviews—ideas about how the world works—but they haven’t always promoted learning. When we lived in hunter-gatherer tribes and in the early civilizations that followed, many worldviews were dogmatic and thought to come from the gods—which meant they weren’t to be challenged.

For example, Chinese paganism holds that the Dragon King (龍王 Lóngwáng) controls the rains and the creatures of the seas; no one would challenge this belief for fear of invoking his wrath and losing loved ones at sea.

(Shortform note: While we often hear about the negatives of tribalism, tribes helped humans to cohere, coordinate, and survive for thousands of years. Today, psychologists acknowledge that tribes give people a sense of belonging, a better understanding of their purpose, and a valuable social support network. So while dogmatic beliefs might negatively affect societal progress, they aren’t all bad. Overcoming the negatives of tribalism likely means embracing the positives—forming tribes is in our DNA, whether we like it or not.)

A “priest class” primarily defended these myth-based worldviews, treating them as sacred and infallible wisdom. In addition, Syed says, the power elites suppressed dissenters with violence or banished them. This kept their “truth” undisturbed from generation to generation.

For example, Jesus of Nazareth—the historical Jesus—was crucified because the ruling Romans saw his novel religious teachings as a threat to their established order (Pagan religion, Roman law and ethics, and so on).

Because those in power treated alternative perspectives as threats, they lost the opportunity to learn. In other words, their worldviews remained static because they didn’t acknowledge the possibility that they had flaws.

(Shortform note: While powerful leaders have always had their sycophants, modern echo chambers exacerbate the problem. Because our information streams are often personalized—think your curated social media feeds—and because we surround ourselves with like-minded thinkers, it’s easier than ever to inadvertently avoid alternate perspectives. This causes exactly the stagnation that Syed describes: When nothing challenges the dominant view, a group’s thinking will fall into a rut. To avoid this, actively seek out people who may disagree with you, and read reputable information from both sides of an issue.)

Phase #2: Rational Thought Arrives (~600 BC to ~300 BC)

This “divine knowledge” paradigm didn’t last forever. With the dawn of the classical Greek period, thinkers such as Pythagoras, Socrates, and Aristotle developed rational thought. They treated ideas as speculative and developed them through a back-and-forth exchange of differing perspectives.

Under this paradigm, Syed explains, the Greeks embraced errors as integral to learning. When discourse revealed the flaws in an argument, the thinkers would seek to improve it. In this way, the tradition of dogma began to fall away, and rational, scientific thought began to emerge.

Rationality Versus Empiricism

Although Syed conflates rationality and empiricism, they’re distinct approaches to scientific thought. Both have their roots in the classical Greek period he describes, and they have long challenged each other on the proper way to establish valid knowledge.

In short, empiricists hold that all knowledge derives from external sensory experience (our five senses) and internal sensory experience (observation of mental phenomena). In other words, you come to know something by sensing it, and in that way discover the laws of the natural world.

In contrast, rationalists argue that reason enables us to intuit and deduce concepts that have no basis in sensory reality (a priori knowledge). For example, a rationalist might say that a logical statement, “13 is a prime number less than 22,” couldn’t come from sensory experience. Rather, it’s an abstract formulation of knowledge that depends on intuited insights about numbers and mathematics.

Rationalists and empiricists tend to agree that some knowledge comes from such abstract reasoning, while other knowledge undeniably comes from sensory experience—for example, you know water’s wetness by feeling it. If we “both-and” the two schools of thought, it’s clear that both abstract reason and sense-based reason have a place in scientific thinking.

Phase #3: The Church Battles Rational Thought (~300 BC to ~1600 AD)

Though rational thought briefly flourished, the Western worldview receded into dogma after the classical Greek period. Syed explains that the church combined Christian strictures with a dogmatic interpretation of Aristotle’s work, and it punished dissenters.

This resurgence of the “divine knowledge” paradigm dominated until around the 17th century, when thinkers (Syed chiefly credits Francis Bacon, who helped develop the scientific method) revived the scientific mindset. Though they met resistance—Galileo, for example, was put under lifetime house arrest for challenging Christian cosmology—the rational tradition found its footing and scientific progress resumed.

The work of Galileo, Bacon, and others led into the Age of Enlightenment in the 17th and 18th centuries—Europe’s return to rational, scientific thought.

(Shortform note: In any age, the dominant worldview tends to become more or less dogmatic. Today, science is partially guilty of this, as the church was in medieval times. “Scientism” is a quasi-religious view, held by some scientists, that purports science is the only valid way to know things. In “The Folly of Scientism,” Austin Hughes argues that scientism constitutes an attempt to extend the scientific method into areas of philosophy that it simply can’t comment on. This unflinching belief in the ultimate power of science, he says, is much like religious superstition. To counter this trend, he argues—as Syed describes above—that we need a resurgence of reasoned, Enlightenment-style thinking that clarifies the strengths and limitations of scientific methods.)

Top-Down vs. Bottom-Up Learning

Syed describes the “divine knowledge” paradigm as “top-down” learning, while the scientific, adaptive method is a “bottom-up” approach.

Top-Down

The “divine knowledge” paradigm is a top-down approach: Authorities decree the truth from on high, and it’s not to be challenged. Armchair philosophy exemplifies the top-down approach—a small group of elites with special knowledge builds grand, esoteric theories to try to capture the “truth” about something.

(Shortform note: In Mastery, Robert Greene explains that top-down hierarchies lead to conventionalism, wherein people adhere to an existing set of rules, written and unwritten, because to do otherwise is socially risky. By conforming to the group, you ensure that you’ll be conventionally successful. But conventionalism doesn’t lead to meaningful discoveries or innovation, because by definition it doesn’t challenge the status quo. Greene asserts that we need to resist conventionalism by aligning to our own true north, instead of the group’s, and by developing social intelligence to handle doubters until we’ve proven our worth.)

Syed explains, however, that the top-down approach causes knowledge to stagnate because it goes untested. Without testing, you can’t accurately map out how things work—the world is too complex to create a perfect theory on the first try. The top-down implementation of Marxism exemplified this problem: Mao Zedong’s 1958 Great Leap Forward was a massive restructuring of China’s agricultural base that disregarded testing in favor of Mao’s ideology. Because the program failed to account for the real capacities of workers, soil, and crops, it caused a massive famine that killed tens of millions.

Even when tested, top-down theories impede swift adaptation because they’re too complicated to rework efficiently. Mao’s program, for example, demanded massive, widespread action but was too large to adjust when things started going wrong. Reworking something of that scale takes time, which delays action and wastes resources.

(Shortform note: Mao’s Great Leap Forward was a “five-year plan,” the classic Soviet planning strategy that involves setting fixed goals for economic growth and achieving them through top-down control. Five-year plans have had mixed success—China’s first, for example, succeeded in building an industrial manufacturing base out of a largely agricultural society. Early Soviet plans, on the other hand, caused mass famine in Ukraine due to top-down suppression of farmers and Ukrainian nationalism.)

Bottom-Up

Syed asserts that by contrast, learning from failure is a bottom-up approach: You take small steps with limited knowledge (as opposed to grand, top-down action), inevitably make mistakes, and adjust accordingly. For example, market economies feature many agents that work bottom-up to build businesses. This allows the economy to grow organically: Each business steps toward success, makes mistakes, and adjusts, until the best succeed and others fall away.

The key to bottom-up learning is testing. Each time you take action, test whether your knowledge fits the situation or not. For example, a smart business owner validates product ideas through testing—introducing her ideas to potential customers and noting how they respond.

By testing, you learn what won’t work and can adjust accordingly. For example, maybe that business owner found that her product would enter a saturated market with established alternatives. By testing, she’s avoided potential failure and can adjust her strategy according to what she learned.

In other words, bottom-up learning allows for swift adaptation, according to Syed. You fail fast and often, building practical knowledge through real-world experience.

Undertake a Practical Apprenticeship

In Mastery, Robert Greene explains that top-down education (conventional schooling) doesn’t prepare us for the real world because it doesn’t give us practical, real-world experience. To overcome this, he suggests you seek out an apprenticeship: An intense effort to explore your field of interest, develop essential skills, and gain practical knowledge.

Like Syed, he advocates for an open, experimental mindset. By taking one step at a time into the unknowns of your field, you’ll make small mistakes that you can learn from—and you can adapt your learning experience to the realities of your field as you go.

It’s essential to directly engage with these realities, because attempting to understand them via top-down theorizing will never yield an accurate picture. Instead, take advantage of being a beginner: When you’re new, people will understand that you need to make mistakes to adapt to your field.

Practice Influences Theory

Building on the importance of testing in bottom-up learning, Syed argues that our standard view of progress—that scientific theories enable technological innovation—is backward. Instead, Syed suggests that trial and error experimentation leads to practical know-how, which in turn influences theory.

For example, Thomas Edison developed a method of recording sound by experimenting with telegraph technology. Though he lacked precise theoretical knowledge of the invention beforehand, he reached a working phonograph through trial and error.

In the end, Syed argues that theory and practice reciprocally drive change in a complex interplay of influence. In other words, top-down and bottom-up approaches complement one another, though he doesn’t explain how.

(Shortform note: In Mastery, Robert Greene asserts that chance plays a crucial role in scientific discoveries, and he suggests priming your life for such serendipity: Our brains naturally seek out connections between ideas, and we can amplify this effect by surrendering control. When we’re focused, he argues, our attention narrows and reduces the range of ideas we’re engaged with. To counteract this, take two steps: First, widen your information intake to include sources you don’t normally use. This can stimulate new connections. Second, take plenty of time to rest, relax, and engage in casual activities, like sport or music. This keeps your mind flexible and open to new ideas.)

Where We Are Today

In modern society, Syed writes that some institutions rely on bottom-up learning, including science, the airline industry, athletic organizations, and some businesses. Meanwhile, others still operate on the “divine knowledge” paradigm—Syed highlights health care and the courts.

In Parts 2 and 3, we’ll examine how he compares and contrasts these two institutional styles. (Shortform note: We’ll use “institution” to refer to an area of society, and “organization” to mean a specific agent within that institution—for example, we have the institution of public health care, and organizations like Griffin Hospital. When we say that an institution neglects to learn, we mean that the organizations within that institution don’t learn, as Syed argues.)

1. Learning-Oriented Institutions: The organizations within a learning-oriented institution (Syed’s “open loop systems”), such as lean startups in the business world, use failures to improve their operations and innovate solutions to tough problems.

For example, a successful football team will hold post-game retrospectives so coaches can reflect with players on what went well and what went poorly. This clarifies what to improve on, using failures as fuel for success.

(Shortform note: In The Fifth Discipline, Peter Senge describes a “learning organization” as an organization where people embrace open-minded thinking, aspire to their creative potentials, and continually learn together. He outlines five disciplines to achieve this: 1) Create a shared vision to align leadership and staff, 2) Use systems thinking to understand your organization holistically, 3) Clarify your existing mental models and stay open to adapting them, 4) Create teams that learn together and embrace honest mistakes, and 5) Help employees develop personal mastery by training the subconscious mind to handle complexity.)

2. Failure-Averse Institutions: The organizations within a perfection-oriented institution—such as hospitals in the health care system or certain political institutions—neglect to learn from failure. This happens because their cultures preclude the acknowledgment and assessment of mistakes. In other words, to admit a mistake was made would damage their reputation, so they bury it. As Syed says, this attitude reflects the “divine knowledge” paradigm and perpetuates harmful, preventable errors.

For example, Donald Trump’s administration routinely expelled dissenters, and Trump denied his own fallibility. Because he pushed away anyone who could’ve helped correct his administration’s missteps, it was unable to operate effectively.

(Shortform note: Mary Trump, the niece of Donald Trump, explains in Too Much and Never Enough that Donald Trump has been enabled since a young age. Through analysis of Trump family history and her own experiences, she argues that he’s behaved poorly for decades and never faced repercussions. Because he’s never been punished for mistakes or wrongdoings, he can’t conceive that he could be wrong about anything, and so he assumes that someone else is to blame. Any institution with a leader who’s so apt to cast blame becomes failure-averse, since any perceived weakness could get you fired.)

Two Types of Mistakes

Syed defines two types of mistakes and distinguishes how each type of institution employs or responds to these mistakes.

Type #1: Inadvertent mistakes. These occur when you’re aware of the correct answer but, through some accident or misstep, still make the error. For example, a football quarterback can mistime his pass and get sacked, even though he knows to avoid that.

Syed explains that inadvertent mistakes help us find and improve the flaws in existing systems. Consider how an inexperienced manager can inadvertently help develop a better team because his mistakes expose opportunities to improve.

(Shortform note: While these “human errors” are often understandable, it’s possible to reduce them and improve your operations. Often, we make more mistakes when we’re under a lot of stress or have to focus for long periods of time. After a while, our performance slips because it’s too hard to keep up an optimal effort. To overcome this, work in a rhythm of focused periods and rest periods. Josh Waitzkin recommends in The Art of Learning that you learn to fully relax, explaining that it rejuvenates you for your next effort.)

Type #2: Intentional mistakes. These happen as part of an effort to develop solutions—if you don’t know the answer to a problem, Syed says, trial and error is a great way to find it. For example, car manufacturers have developed more efficient engines by iterating through their less efficient predecessors.

Intentional mistakes help us innovate creative solutions to tough problems. You’ll want to make these mistakes when you’re iterating a product idea, developing a theory, or navigating through thorny behavioral changes.

(Shortform note: The Harvard Business Review writes that “deliberate mistakes” can reveal assumptions in your models—for example, an advertising strategy you expect to fail might actually do well. By testing it (even though you anticipate failure and financial losses), you could discover a strategy that goes against your existing assumptions. This matters because we often seek only to confirm our existing ideas. By making deliberate mistakes, you prevent that confirmation bias.)

Although both types are productive mistakes—regardless of why they were made, they provide opportunities to learn and improve—failure-averse organizations mainly make inadvertent mistakes and fail to learn from them. Because of this, they remain stagnant and continue to make the same errors. For example, a local political campaign that neglects a key voting demographic, despite repeatedly losing without their votes, will continue to err and lose.

On the other hand, learning-oriented organizations make both types of mistakes and learn from them. This enables them to improve their systems over time—for example, the rival political campaign may have noticed where they weren’t getting enough votes, given those areas attention, and done better the next time around.

(Shortform note: The structure of a learning-oriented organization is fundamentally different from that of a failure-averse organization in that the latter is optimized for performance rather than learning. The paradox is that learning organizations often perform better than those stuck on set performance goals—because if you continually strive for static goals, instead of learning and shifting your measures as you go, you might never reach them. If you or your organization tends toward static measures of performance, try introducing this learning mindset within a project you can lead: Focus the team on learning from mistakes, rather than achieving set goals.)

Part 2: Learning From Failure Is Advantageous

In Part 1, we explained that learning from failure drives progress, and that human cultures have moved from “divine knowledge” worldviews to a scientific, rational perspective that is developed through errors.

Now we’ll look at why adapting through failure is so effective, and how learning-oriented institutions learn through bottom-up adaptation. Syed explains that these organizations often operate in private, performance-focused areas—aviation, athletics, and business—with the exception of the scientific establishment. Further, these organizations have two things in common: a cultural attitude that enables them to learn from failure, and systems that make use of those mistakes.

In Part 2, we’ll explore why a learning-oriented system must arise from a culture that embraces errors, and we’ll discuss how to learn from failure. Specifically, we’ll lay out the basic process and three additional techniques that help with complex challenges: randomized controlled trials, incremental development, and insight generation.

Element #1: The Culture of Learning-Oriented Institutions

Syed contends that to learn from failure, an organization’s culture must first view it as an opportunity to grow. This enables individuals and teams to find mistakes and learn from them. This mindset must exist on two levels:

On the individual level, you need the desire to improve: to work hard, fail, and try again. You need tenacity to keep going despite numerous mistakes. And you need humility to admit your mistakes and learn from them.

For example, think of the long work it takes to become a professional musician. A cellist spends thousands of hours training her craft by making mistakes, adjusting, and pressing onward.

(Shortform note: In Tribal Leadership, Dave Logan and coauthors argue that in large, corporate workplaces, employees lose motivation and productivity when they feel like cogs in a machine. To build an organizational culture that values strong individuals, Logan recommends helping each individual develop confidence in their abilities. To do this, point his strengths out and give him reasonably challenging tasks that play into those strengths. By completing them, he’ll better appreciate his abilities and may seek additional responsibilities.)

At the organizational level, you need an open and fair culture that incentivizes learning from failure. When people trust that honest mistakes will be treated as opportunities to learn rather than grounds for blame, they’ll willingly admit and analyze them.

Continuing the previous example, the cellist doesn’t fear messing up in practice because her culture embraces mistakes as part of learning. Her teachers, conductors, and fellow musicians are all masters of failure, so she too feels incentivized to work hard, make mistakes, and get better.

(Shortform note: Syed’s description of an effective organizational culture evokes the Stage 4 cultural stage in Tribal Leadership. Dave Logan explains that at Stage 4, teams focus on values, vision, and alignment more than top-down management and control. The tribe judges others’ actions according to those values, allowing for experimentation and failure so long as the mistakes are made in pursuit of their shared vision. This encourages productive mistakes—those which further the tribe’s goals.)

Element #2: The Systems of Learning-Oriented Institutions

With a failure-embracing mindset in place, you need a system that makes use of failure. Syed argues that this system is the bottom-up method we described in Part 1. In other words, scientists, lean startups, and athletic organizations all iterate to develop theories, products, and skills. Here’s how it works:

Step #1: Take action. These organizations take action based on their best knowledge so far. As they do so, they gather data. By measuring their operations, they gain feedback to learn from if things go wrong.

Syed explains that this data gathering is the role of black boxes in aviation. Each plane is equipped with two black boxes: One records details of the plane’s positioning, trajectory, and functions (including the landing gear and lights), and the second records human data—such as the conversation in the cockpit.

Step #2: Analyze failures. When an error occurs, these organizations analyze what happened. Using all the data available, they search for the root cause—reflecting on both what went wrong and why.

For this step, airlines rely on independent investigators who work through all the data from the black boxes. They analyze the circumstances of the crash until they know what went wrong, and they report that information back to the airline.

Step #3: Formulate fixes. Next, these organizations figure out practical, scalable solutions to the problems identified in Step 2. Then they test these solutions to iron out any flaws before implementing at scale.

For example, after identifying an issue with pilots losing focus after long hours flying, analysts recommended distributing the workload more evenly between the pilot and co-pilot. Airlines tested simple training programs to teach this and implemented them once they’d smoothed out any flaws.

Step #4: Repeat. Finally, these organizations integrate what they’ve learned into their systems. Each time they do this, they accumulate a better understanding of what doesn’t work and what does. Then, they return to step one and keep learning.

Act Fast and Learn Constantly

In The Obstacle Is The Way, Ryan Holiday outlines a similar method that emphasizes frequent, aggressive action toward your goals.

Holiday’s methods embody the scrappy, experimental mindset that’s critical to succeed with trial-and-error learning. Without that consistent drive to improve, it’s easy to fall into a rut and stagnate—so aim to develop this mindset alongside these learning methods.

Syed argues that this method works so well because it’s not just some human invention—it’s embedded in nature. In fact, it mirrors how evolution works: Variations arise, the environment tests them, and the best-adapted forms survive and spread while the rest fall away.

(Shortform note: While evolution-based learning is powerful, we can use it ineffectively. One study on science publishing found that because many academics depend on publication to advance their careers, researchers produce conventional work that doesn’t push boundaries or advance scientific methodology. Publishers reward research that doesn’t rock the boat, so these conventional research labs succeed and spread their methods to trainees. So while natural selection is at play, it selects for the most adaptive behaviors given what’s rewarded—not necessarily the optimal behaviors in an absolute sense.)

Succeeding with this strategy isn’t easy—you have to fail, fail, and fail some more. As we discussed above, it takes tenacity, humility, and ambition to slowly adapt your creation according to what actually works.

(Shortform note: Ryan Holiday argues in The Obstacle Is The Way that to make it in our complex, uncertain world, we need to cultivate a deep resilience to hardship. You can do so by resolving to achieve some goal no matter what, and sticking to that commitment at all costs. Repeatedly sticking to your commitments builds long-term resilience into your character. Similarly, Nassim Nicholas Taleb argues in Antifragile that we ought to become not resilient, but antifragile—rather than merely enduring stress, use it to grow ever stronger. Both individuals and organizations can take hardships as lessons to learn—similarly to how muscles grow through the minor injuries of exercise.)

Iterate Minimum Viable Products

Syed gives one key example of the iterative process in business: the minimum viable product (MVP). To navigate the unpredictable marketplace, lean startups test out their ideas using stripped-down versions of the potential end products.

  1. Test out your ideas by building an MVP, and see what happens. Imagine that a startup founder wants to create a new social platform that emphasizes empathetic conversation and tight-knit social groups. To test this idea, she could mock up a video that demonstrates how these features would work.
  2. Launch the MVP to get feedback—whether people are interested and, if so, what they do and don’t like about your product. After launching her video, the founder might find that only some social groups are interested, and that they have concerns about whether social media is right for deeper connection.
  3. Adjust to that feedback—the founder’s idea found little support, but she’s only sunk a little time and money into the MVP—as opposed to a great deal of time and money into a full-blown product. Now she can decide whether to explore other options. She’s learned what won’t work and can take her ideas in a new direction.

Test Internally to Validate Ideas

While jumping right to action can work, it’s often better to test ideas internally before seeking customer feedback via an MVP. In Inspired, Marty Cagan explains that a software product idea should pass four tests before your company invests in it:

While you can’t ensure 100% certainty about these factors, you can separate much of the wheat from the chaff. This way, you avoid investing in ideas that you haven’t validated and improve your chances of success. Further, be sure to tap the whole team’s expertise—a software engineer can speak to product feasibility, but the CEO will better understand whether an idea fits the company’s mission and strategy.

Beyond this basic iterative process, Syed offers three techniques that help to clarify feedback, deal with complexity, and innovate new solutions when marginal improvements no longer work.

Technique #1: Clarify Feedback

While iteration à la the MVP is often effective, feedback is sometimes misleading. For example, if you test the effect of coffee on college students but give coffee to every subject, you’ll have no baseline against which to interpret the results.

To combat this, Syed argues, you need to run a randomized controlled trial (RCT). By comparing a control group to the test group, you isolate the effect of what you’re testing.

(Shortform note: While randomized controlled trials didn’t come about until the 20th century, past doctors used similar testing methods. In 1537, the French surgeon Ambroise Pare inadvertently conducted a clinical trial on wounded soldiers he had treated. After running out of the boiling oil used to cauterize their wounds, he applied a mixture of egg yolks, rose oil, and turpentine. The next day, he discovered that the mixture worked far better than oil cauterization—though he didn’t go on to formalize this accidental testing procedure.)

Here’s how an RCT works: Experimenters randomly divide participants into two groups, subjecting one to the experimental variable and giving the other a placebo (or nothing). The difference between the groups’ results reveals the actual effect of the experimental variable.

For example, a company developing a nootropic (brain-boosting) supplement would give the real pill to one group and a placebo pill to the other. Then they’d compare the groups’ results to determine the pill’s actual efficacy.
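The logic of an RCT can be sketched in a few lines of code. This is a minimal simulation, not a real trial: the outcome model, group sizes, and effect size below are illustrative assumptions. The key point it demonstrates is that random assignment lets the difference in group means estimate the treatment’s true effect.

```python
import random
import statistics

def run_rct(participants, true_effect, noise=1.0, seed=0):
    """Randomly split participants into control and treatment groups,
    simulate outcomes, and return the observed difference in means.

    `true_effect` and the noise model are hypothetical, for illustration.
    """
    rng = random.Random(seed)
    pool = list(participants)
    rng.shuffle(pool)  # random assignment is what makes this an RCT
    half = len(pool) // 2
    control, treatment = pool[:half], pool[half:]

    # Everyone gets a baseline outcome plus individual variation;
    # the treatment group also gets the true effect.
    control_scores = [rng.gauss(50, noise) for _ in control]
    treatment_scores = [rng.gauss(50 + true_effect, noise) for _ in treatment]

    return statistics.mean(treatment_scores) - statistics.mean(control_scores)

# With enough participants, the observed difference approaches the true effect.
observed = run_rct(range(1000), true_effect=5.0, noise=2.0)
print(round(observed, 1))  # close to the true effect of 5.0
```

Without the control group, the baseline of 50 would be indistinguishable from the treatment’s contribution, which is exactly the coffee-study problem described above.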

(Shortform note: In addition to using a placebo, researchers should run the trial double-blind, meaning that neither the researchers nor the test subjects know who receives the treatment and who receives the placebo. This prevents bias from skewing results: If researchers don’t know who got the treatment, their beliefs about its effectiveness can’t influence how they interpret the experiment.)

After testing, experimenters take the feedback and adjust their theory or product accordingly. Then they’ll repeat, keeping a tight loop between the feedback and the next iteration, until they reach a well-tested result.

Continuing the example, the nootropic company could use the data from testing to improve their product or decide whether it’s ready for the market.

(Shortform note: RCTs aren’t always the right solution, for a few reasons. If your sample size is too small (20 or fewer), the test can’t reliably detect an effect, because outliers can skew the data. Sometimes, an RCT isn’t ethical—for example, a trial of an individualized tutoring program would favor some students while leaving others behind: Because assignment is random, there’s no way to ensure that the students with the greatest need receive help. Finally, RCTs only reveal simple causal effects: They can tell you that A → B. But real-world causation is often complex and nonlinear—for example, a pill may reduce your cholesterol while also throwing off your sleep patterns and causing hair loss.)

Technique #2: Break Down the Problem

Iteration and randomized controlled trials often work wonders, but some problems are too immense or complex, or lack a viable control group. If you’re the US president, for example, you can’t run tests on the whole country—nor do you have a second country to act as a control group.

To handle this, Syed explains, you need to break the problem down into smaller challenges and accumulate small wins. For example, to fix a struggling economy, you’d first identify the smaller issues: runaway inflation, irresponsible financial speculators, a population crushed with debt, and so on.

(Shortform note: Anthropologists from the University of Georgia explain that throughout history, humanity has tackled societywide issues with social organizations and effective institutions. When bottom-up civilian efforts to fix some issue (like lung cancer) combine with institutional power and connections, we often make progress. In contrast, top-down solutions typically fail, since centralizing power reduces the range of perspectives and skills that people can bring to bear on the issue.)

Continue to chunk down the problems until you have something small enough to take action on. Then, start testing ideas. Maybe you test a free financial literacy course in one town, with a similar town as a control, instead of testing on the whole country. If it works, you can try to scale it; if not, you can adjust and try again.

Over time, many little successes accumulate into big results. Maybe the course doesn’t work until the 15th iteration, but that little success can scale and help solve the larger problem. Then move on to the next small problem, and the next, and keep building toward that larger success.

(Shortform note: This technique is visible in the World Bank’s efforts to end extreme poverty, a UN Sustainable Development goal. In a 2018 update, the World Bank outlines the progress they’ve made by breaking down the problem into regions and countries, as well as on specific benchmarks of extreme and lesser poverty. They also break down nonmonetary dimensions of poverty—such as access to clean water, electricity, and education—as another way to handle one chunk of the problem at a time.)

This technique works even better if you gather more data. Measuring more variables helps you find additional opportunities for small gains and enables you to make more precise adjustments. For the above example, you could poll people at the course’s end or track participants’ long-term financial well-being after the course.

The Origin of the Black Box

In 1949, the British aircraft manufacturer de Havilland flew its new jet airliner, the Comet, for the first time. It was the first modern passenger jet—but just a few years later, Comets began crashing. By 1954, seven Comets had disintegrated or crashed.

In investigating the crashes, Australia’s Department of Civil Aviation faced a lack of meaningful data. They had no recordings of flight data—speed, positioning, fuel levels—nor any recordings of cockpit conversations, so they couldn’t deduce the cause of the crashes.

One Australian researcher, David Warren, realized that if they had such recordings, they could better understand what had happened. Over the next four years, he developed a prototype black box to record flight data and the interactions between crew members in the cockpit.

His invention caught the eye of British aviation authorities in 1958, and it sparked a movement to integrate black boxes into all civilian airliners. By 1962, Warren had developed a standard model, and his technology soon spread around the world. This step—from having no data on crashes to over two dozen measures of flight data and in-cockpit happenings—enabled significant improvements in airline safety.

Technique #3: Find Opportunities to Innovate

Up to a point, you can iterate, test, and break the problem down. But eventually, the effort each marginal improvement requires will outweigh the gains it delivers. This is because surface-level iteration improves on an existing paradigm without fundamentally changing it.

To move further, Syed explains that you need to go beyond the “local maximum,” the highest point of improvement for a given paradigm. For example, you can only shoot an arrow so far with a shortbow, so you’d need to develop a new tool to surpass that distance.
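The “local maximum” idea comes from optimization, and a toy sketch makes it concrete. The scoring landscape and step size below are hypothetical, purely for illustration: greedy iteration, which accepts a change only if it improves the score, climbs to the nearest peak and then stops, even when a far higher peak exists elsewhere.

```python
def hill_climb(f, x, step=1.0, iters=100):
    """Greedy marginal improvement: move to a neighboring value only if
    it scores higher. This gets stuck at the nearest local maximum of f."""
    for _ in range(iters):
        best = max([x - step, x, x + step], key=f)
        if best == x:
            break  # no neighboring step improves: a local maximum
        x = best
    return x

# A hypothetical landscape with a small peak at x=2 and a much higher
# one at x=8, separated by a valley.
def score(x):
    return -abs(x - 2) if x < 5 else 10 - abs(x - 8)

# Starting near the small peak, greedy iteration stops at x=2, even
# though x=8 is far better: improvement alone can't cross the valley.
print(hill_climb(score, 0.0))  # 2.0
```

Which peak you end up on depends entirely on where you start (beginning at 6.0, the same procedure reaches the higher peak at 8). That’s why escaping a local maximum takes a qualitatively new approach, like the crossbow below, rather than more iteration on the old one.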

(Shortform note: In Blue Ocean Strategy, W. Chan Kim argues that success comes from finding a “Blue Ocean,” or uncontested market. Syed’s “local maximum” corresponds to Kim’s “Red Ocean,” a heavily contested market. To move beyond that obvious, crowded arena, think beneath the surface-level function of products to discover the underlying function. For example, home gym equipment serves the same purpose as fitness classes or a gym membership: To help the user get fit and feel good. Knowing that, you could ask “Is there some untapped approach to home fitness that we could create?”, thus going beyond current conventional options.)

To go beyond the local maximum, you need to find and confront some unexamined problem in the current paradigm:

  1. To find an opportunity to innovate, consider the recurring problems with technologies you use. For example, a medieval archer might’ve noticed that human-powered shortbows couldn’t reach distant targets.
  2. Then confront that problem, asking how things could be better. Maybe the archer asked, “What could propel these arrows more forcefully than human muscle?” Now he’s begun to invent the crossbow, and he did it by looking for an opportunity to change things at a deeper level.

(Shortform note: Jordan Hall, public intellectual and co-founder of DivX, terms these oft-unexamined assumptions in our conventional paradigms the “deep code.” For example, the modern workplace has an underlying “code” of sorts that assumes certain features—including hierarchy, centralized leadership, and hourly wages or salaries. But our inherited systems aren’t the only option. When you delve into a paradigm and make implicit assumptions explicit, you reveal the deep code. For example, Decentralized Autonomous Organizations (DAOs) are one reimagining of how humans can coordinate and work together, with flatter relationship structures, nonmonetary incentives, and decentralized participation.)

Shortform Commentary: American Airlines Case Study

To illustrate the process of learning from failure, Syed describes how the airline industry uses black boxes to record, analyze, and improve systems after crashes. We’ll describe American Airlines Flight 587, a 2001 flight that crashed soon after takeoff from John F. Kennedy International Airport.

After hitting wake turbulence from a Boeing 747 that had taken off ahead of it, the pilot of Flight 587 (an Airbus A300) rapidly swung the plane’s rudder back and forth in an attempt to stabilize it. This created excessive aerodynamic stress that broke off the plane’s vertical stabilizer (the tail fin). The plane spun out of control, crashing into the Belle Harbor neighborhood of Queens.

Using data from the black box, crash site inspections, and eyewitness interviews, the National Transportation Safety Board conducted a thorough investigation. In the end, they determined that the pilot’s actions had stressed the vertical stabilizer well beyond its limits.

The stabilizer had withstood pressure well beyond its design specifications before breaking—but the pilot shouldn’t have used the rudder so aggressively. From the flight recordings of how he reacted, the NTSB concluded that American Airlines’ pilot training hadn’t properly prepared him for this situation.

American Airlines had used simulations that put pilots in unrealistically turbulent situations, thereby teaching them to overuse the rudder. To correct this, American Airlines modified its pilot training program to discourage such aggressive rudder use.

Exercise: Reflect on a Recent Failure

Reflecting on our own mistakes can help us to course-correct and better understand how to achieve our goals. Here, consider a recent mistake and what you might’ve done differently.

Part 3: Neglecting Failure Is Risky

So far, we’ve looked at how we develop through failure and examined the institutions that learn from their errors. Now, we’ll discuss the institutions that neglect to learn from failure.

In contrast to the learning-oriented cultures and systems in Part 2, failure-averse institutions have cultures and systems that prevent them from learning. (Shortform note: While Syed describes these as “closed loop” systems, we describe them as “failure-averse” institutions to reduce ambiguity. For example, a “closed loop” is often a good thing, such as a completed task or project whose feedback has been processed.)

In Part 3, we’ll explore these two barriers—cultural and systemic—and we’ll illustrate the problem using Syed’s description of failure-aversion in the US health care system.

Element #1: The Cultural Barrier

Syed explains that the cultures of failure-averse institutions stigmatize failure. In other words, they treat error as a terrible thing. In these cultures, you should feel deeply ashamed of mistakes—and you’ll be judged harshly by your peers.

This is because they’re cultures of expertise: The employees are well-educated professionals—such as doctors, nurses, lawyers, and judges—who believe that experts don’t make mistakes. As a result, any error calls your professional worth into question on two levels: the personal, and the social.

On the personal level, mistakes threaten your self-esteem. Say you’ve built your identity around your skill as a doctor, and then a surgery patient dies on your watch. Acknowledging this as your failure could decimate your identity and burden you with guilt.

(Shortform note: In Tribal Leadership, Dave Logan explains that this issue is common in a Stage 3 culture, where individuals often exhibit self-focused behaviors. Doctors, he says, often exemplify the Stage 3 “shark”—an accomplished, top-of-the-class individual who’s used to pursuing their own success. These individuals use ego-focused language (“I, me, my”), which helps to explain why they’ll cover up their mistakes: When everything revolves around him, the doctor unthinkingly puts patient concerns second.)

On the social level, mistakes damage your reputation. Since you’re working in a culture of expertise, any error you commit calls your competency into question. Make too large of a mistake, and you may lose all credibility overnight. For example, the above doctor might become known for his failure to save that patient in surgery.

(Shortform note: Individuals at Tribal Leadership’s Stage 3 culture also exhibit a particular relationship style that precludes collaborating to learn from failure. Dave Logan explains that they tend to view colleagues as less competent than them, and they are more interested in controlling the flow of information through one-on-one relationships. Because knowledge is power in a competitive environment, this helps them maintain their image and keep others from knowing too much about them. Unfortunately, this reputation management can come at the expense of the patient.)

Syed explains that these cultures believe reducing mistakes requires being tough on whoever commits them. In other words, they believe proper punishment enhances performance. But in reality, unjust punishment causes a number of other problems, from scapegoating to cover-ups.

We Instinctively Scapegoat

When something goes wrong, we instinctively scapegoat others. We find someone to throw under the bus—usually someone close to the failure—without properly examining the situation.

The problem is always more complicated than it seems. But scapegoating oversimplifies the story and exacerbates future problems by creating a climate of fear. In organizations, nobody wants to be the scapegoat—they could lose their reputation and their livelihood. So mistakes go unreported, unexamined, and unlearned from.

For example, the No Child Left Behind Act, an American educational policy signed into law by President George W. Bush in 2002, failed to achieve its goals on multiple counts. Casual observers might scapegoat Bush, saying that his policy forced teachers to teach to the test and crushed student creativity.

But the situation was more complicated: Policy makers tried to satisfy multiple parties—from civil rights groups to businesses to educators—and aimed to make American schools competitive in a globalizing world. But funding goals weren’t met; many states and districts ignored or partially implemented the policies; and legislative updates floundered in Congress. Knowing all this, it’s difficult to point to a single reason the act failed.

Cancel Culture Is Mass-Scale Scapegoating

Cancel culture is the recent internet phenomenon wherein individuals deemed problematic by public opinion are “canceled,” or boycotted and removed from their online platforms. While some see this as a positive force for removing genuinely harmful individuals from power, others call it censorship.

In light of Syed’s description of scapegoating, we can see that cancel culture can function as large-scale scapegoating: We throw someone under the bus without examining the full complexity of the situation.

Further, well-targeted cancellations may remove genuinely harmful individuals from power, like former producer Harvey Weinstein, but they don’t address the systemic problem. Creating effective discourse is one key step toward systemic reform, and the “Rule Omega” can help prevent scapegoating in discourse: In short, any perspective contains some signal (meaningful information) and some noise (meaningless information). When you’re listening to another person speak, withhold judgment when you disagree with them and aim to find the signal—whichever point feels meaningful to you. By finding that common ground, you can empathize with one another and engage in constructive discourse, even if you mostly disagree.

We Oversimplify the Story

Syed explains that when scapegoating causes us to gloss over the full complexity of mistakes, the “narrative fallacy” is at work. In short, we make sense of complex situations by telling simple, appealing stories. Continuing the above example on No Child Left Behind, observers might assume, “Big government wanted to crush student diversity!”

Simple stories help us feel like we understand, but they’re usually wrong. We emphasize the major “story beats,” like the signing of the act and the negative impact of standardized testing, but we discount the details. Because of this, we fail to grapple with the full complexity of the failure.

For example, it’s much easier to say “No Child Left Behind was callous and poorly thought-out” than to confront the complexity of the situation. Maybe lawmakers implemented ineffective policies but genuinely meant well. Maybe they acted on the best pedagogical knowledge of the time, with as much money as they could secure, and yet couldn’t get states to follow the policies faithfully enough for them to work.

(Shortform note: In Antifragile, Nassim Nicholas Taleb explains that thought leaders often engage (knowingly or unknowingly) in “narrative discipline,” or the process of creating stories that sound great and seem to fit the data, but which are false or incomplete. This involves cherry-picking data from statistical studies to fit the narrative, since statistics can seem objective and convincing without contextualization. To avoid this, Taleb suggests running controlled experiments, which can’t be cherry-picked as easily.)

We Commit Cognitive Distortions

When a story turns out to be more complex than we thought, cognitive dissonance comes into play: the discomfort of holding conflicting beliefs, which we often resolve by unconsciously distorting the facts.

As Syed explains, nobody likes to be wrong—it’s a threat to our egos. So to defend our versions of events, our brains distort the information to conform to our beliefs. And the more invested we are, the worse the distortions.

(Shortform note: The notion of cognitive dissonance originates from a 1957 theory described by Leon Festinger, who claimed that we have an innate drive to maintain harmony between our attitudes (thoughts, beliefs) and behaviors. When we face any decision, every option has pros and cons, so we’ll always lose out on something—for example, accepting a job with great pay but a boring office in an average town. This causes dissonance, which we tend to overcome by rationalizing: “It isn’t so bad, really.” While this may seem irrational, it creates cognitive consistency, which is rational in that it reduces the anxiety of dissonance and helps you get on with your life.)

Syed explains that we’re all prone to three ways of distorting the facts to resolve cognitive dissonance. We’ll illustrate each with real-world incidents surrounding Chogyam Trungpa Rinpoche, the spiritual leader of Shambhala International (a Buddhist organization), who faced controversy over substance use and sexual abuse:

  1. We reframe the evidence. In other words, we accept a piece of evidence but tell a story to conform it to our beliefs. For example, Rinpoche’s accusers cited stories of sexual abuse from ex-devotees, but devout followers answered that his sexual abuses were spiritual lessons, and his victims just weren’t ready to learn.
  2. We invent justifications. Here, we accept the evidence but find some way to excuse it. So a devout follower might justify Rinpoche’s “seven spiritual wives” by saying that he earned them through his spiritual dedication.
  3. We ignore the facts outright. Sometimes, cognitive dissonance causes point-blank denial of the facts. After Rinpoche died of liver poisoning from his drinking problem, followers propped his body into a sitting meditation position and proclaimed that he’d reached parinirvana (enlightenment after death), as if his failings had never happened.

How to Combat Cognitive Dissonance

Though cognitive dissonance often occurs automatically, we can learn to recognize it. According to psychologists, there are various signs of cognitive dissonance:

The best way to resolve cognitive dissonance is to honestly examine and adjust your beliefs. While this is difficult, our only other options are to rationalize or ignore the conflicting information, which doesn’t help. To examine your beliefs, try using mindfulness meditation to develop space between your reception and response to new information. This allows you to observe and change your actions, instead of simply reacting from habit.

Element #2: The Systemic Barrier

In addition to the above cultural barriers, failure-averse organizations also lack systems that enable them to learn from failure. Unlike learning-oriented organizations, they don’t gather data, conduct investigations, analyze what went wrong, or implement changes. Often, they don’t even record their mistakes.

No Feedback Means No Improvement

In short, these organizations lack a learning mechanism. Specifically, Syed explains that they lack a feedback loop that enables information to flow from failure to analysis to adjustment to implementation.

An individual or organization can’t improve without feedback, since feedback tells you what’s going wrong. If you don’t know where things aren’t working, you can’t improve.

For example, raising children involves ambiguous feedback that makes it difficult to know how you’re doing. With no clear benchmark for what a “good” job of raising a child looks like, and no obvious feedback, it’s difficult to improve in any measurable way.

(Shortform note: In The Art of Learning, Josh Waitzkin advocates for a particular kind of feedback: examining the theme of your errors. In short, gather data by noting the mistakes you repeatedly make. Then, examine your notes and identify any patterns. Look for both the emotional and technical aspects of your error, and work to correct them. For example, a parent who gets impatient with their toddler might work to change their mindset while also studying better childcare techniques. To maintain the feedback loop, continue to note and correct your patterns of error.)

By contrast, if you’re learning piano, it’s obvious when you play the wrong notes. With clear and immediate feedback, you can effectively learn.

In professions that lack readily available feedback, professionals do their best with the skills they’ve developed, but they often fail to improve over time. Emergency surgeons, for example, don’t gather specific data on how well they set a broken bone or implanted metal pins, or on whether they made the right choices under pressure. Doing so would require mechanisms, such as regular patient check-ins, for gathering feedback over the long term.

(Shortform note: In The Bullet Journal Method, Ryder Carroll suggests using a journal to conduct daily reviews. In your morning review, look back over previous journal entries to get a sense for where you were and where you are now. In the nightly review, look over the day’s tasks and ideas to determine how well you’ve done. This tangible feedback gives you direct information about how to improve—for example, you might learn that setting 11 tasks is too many, so you adjust your to-do list until you learn what you can handle.)

No Feedback Means the System Stagnates

Syed argues that the absence of a feedback mechanism is a systemic problem, not a cultural one. The employees are intelligent, capable professionals, but workers can only perform as well as the system enables them to. If an organization’s systems aren’t designed to learn from failures, they simply won’t get better—and neither will the employees.

Imagine a hospital that neglects to optimize its triage system for the emergency room’s busiest night. They know they’ll be overburdened, yet make no adjustments, allowing employees to continue struggling.

Compare this to a hospital that had faced the same problem, but took the opportunity to learn and implement changes. By learning from their overburdened ER, they remove the chance to repeat past mistakes. For example, say the problem was a disorganized waiting space. To fix this, the hospital restructures seating areas for each type of patient emergency.

Now staff can find patients at a glance—the system itself ensures better performance. Put another way, you can’t make mistakes that the system has already corrected for.

(Shortform note: In his 3-2-1 newsletter, James Clear explains that we don’t “rise to the level of our goals,” we “fall to the level of our systems.” He defines your personal system as the network of habits you’ve built, and we can extend this to organizations: An organization’s system is the network of habits and habit-enabling tools (like error reporting mechanisms) that all its people engage in. If part of the system is to report errors and openly admit fault, any new employees will default to that behavior, “falling” to the level of the system they’ve stepped into.)

Believing in False Dichotomies Prevents Change

Syed argues that failure-averse institutions believe in a false dichotomy that prevents improvement: either we do A and lose out on B, or do B and lose out on A. But in fact, it’s often possible to improve both:

(Shortform note: The above situations demonstrate the false dilemma fallacy, wherein reasoning from an either-or mindset obscures other possibilities. But there are nearly always other possibilities. In contrast to either-or thinking, look for opportunities to see when both-and is true. For example, rationality and empiricism each have unique strengths, so it’s better to think “both-and,” rather than fight for one or the other.)

Case Study: Failure Aversion in the Health Care System

Syed argues that in the health care system, hospitals have numerous opportunities to learn from failure but often disregard them. This is because they operate according to top-down systems that go unquestioned, and they have cultures that stigmatize failure.

Problem #1: Health Care Cultures Avoid Acknowledging Mistakes

Doctors, nurses, and other health care professionals train for years, so they expect perfection from themselves and their peers. Mistakes carry a strong stigma—if you mess up as a surgeon or nurse, you’ll be looked down upon—and the more senior your role, the stronger the blame.

Because of this, Syed argues, many health care workers fear reporting mistakes. Reporting their own errors leads to consequences, and reporting their superiors’ might provoke retaliation. As we explained above, cognitive dissonance compounds the problem: A doctor can unwittingly suppress his memory of a failure, justify the mistakes, or outright deny that it happened. By doing so, he preserves his reputation at the expense of continued patient harm.

(Shortform note: In a January 2022 article, emergency room nurse Sally Ersun details her gut-wrenching last day on the job, highlighting the systemic issues Syed refers to. Understaffed and overburdened, her department had too few supplies and could not provide blood for a dying man; another patient was left unattended and fell and hit her head—and there was no time to report the incident. Ersun explains that she’d been “threatened” by superiors and ultimately quit due to physical and emotional exhaustion. A nurse of 10 years, she argues that health care’s for-profit model “prioritizes finances over lives,” which strongly corroborates Syed’s analysis of health care.)

Problem #2: Health Care Systems Don’t Analyze Failures

Many hospitals also lack systems for reporting, investigating, and improving upon errors. Syed cites a report showing that fewer than 20 US states require error-reporting mechanisms in hospitals, and even among those states, few consistently investigate errors and enforce changes. Another study found that across 273 hospitalizations, hospitals “missed or ignored” 93% of preventable errors.

In health care, investigating errors simply isn’t the convention. Because of this, numerous mistakes—including hundreds of thousands of preventable deaths annually—go unexamined. Many deaths from surgery, medication error, and neglect are written off as “inevitable” or “one-off” tragedies.

(Shortform note: One key to effective investigations is to work with an independent investigator. In 2016, a National Health Service investigation demonstrated the need for this, showing that hospitals consistently treated family members of the deceased with little courtesy and often ignored or blocked their requests for information. The Care Quality Commission (CQC), a care watchdog based in England, is attempting to establish an independent review process to enforce accountability. Their first goal is to secure better treatment for the families, which the CQC determined hospital workers view as “antagonistic.”)

Since they don't learn from their mistakes, hospitals also lack "institutional memory"—a shared compendium of lessons learned. Without it, the few lessons that are learned take years, even decades, to percolate through the broader health care system.

According to Syed, the infrequency of autopsies exemplifies the problem: Autopsies could reveal exactly why patients died, yet hospitals rarely perform them—forfeiting the lessons those deaths could teach.

(Shortform note: As recently as the 1950s, US hospitals performed autopsies on around 50% of all deaths. Virtually all medical experts agree that autopsies are invaluable for determining the cause of a death, yet they’re expensive, time-consuming, and must be performed at the hospital’s expense. Modern physicians argue that advanced imaging technologies and budgetary concerns render autopsies unnecessary, though one study of autopsies performed to confirm clinical diagnoses found a median error rate of 23.5%, showing that they remain relevant for determining cause of death and for learning from what happened.)

Exercise: Practice Engaging With Complexity

One major barrier to learning from failure is oversimplifying the story. When we scapegoat, simplify, and distort the facts, we obscure what’s really going on and prevent learning. To overcome this, practice engaging with complexity.

Part 4: Why and How to Embrace Failure

Having discussed the power of learning from failure—and the risks of neglecting to do so—we’ll now look at why and how to create organizations that embrace and learn from failure.

In Part 4, we’ll first explain why learning from failure is morally imperative. Next, we’ll unpack Syed’s redefinition of failure as positive, and we’ll explain his suggestions for changing an organization’s culture and systems to learn from failure.

(Shortform note: Tribal Leadership, by Dave Logan et al., explores how to change organizational cultures in more detail. Logan argues that we should view an organization as a collection of tribes and strive to level up the culture of each tribe. The goal is to reach a “Stage 4” tribal culture, where individuals learn from failure, work together, and achieve great things. Throughout this section, we’ll supplement Syed’s work with ideas from Tribal Leadership.)

Failure-Averse Institutions Waste Lives and Resources

In Part 3, we looked in detail at how institutions neglect to learn from failure—but why does this matter?

In short, Syed argues that the institutional failure to improve is morally irresponsible. A stagnant institution is a missed opportunity to develop better systems that support life and improve our world. Each time a learning opportunity goes unexploited, progress slows.

(Shortform note: Steven Pinker, author of Enlightenment Now, argues that society has advanced in leaps and bounds since the European Enlightenment of the late 17th and 18th centuries. However, Pinker overlooks Jean-Jacques Rousseau's argument that if civilization progresses solely through reason, our morals will degrade. In his Discourse on the Sciences and Arts, Rousseau points to ancient Athens, where the flourishing sciences and arts led to a society focused on leisure, luxury, and decadence. Advanced scientific knowledge, he says, isn't intrinsically moral, so if we neglect virtue in favor of scientific progress, we'll end up with a materially advanced but morally defunct society.)

Flaws in our courts and hospitals also have life-and-death consequences: Wrongful convictions leave innocent people behind bars, and preventable medical errors kill hundreds of thousands of patients each year.

Failing isn’t the problem in and of itself. Rather, the problem is neglecting to learn from those failures. When organizations fail to report and learn from their mistakes, they perpetuate patterns of failure that put lives at risk.

In contrast to courts and hospitals, the airline industry has learned from the hundreds of lives lost in historical crashes: It analyzes failures, extracts takeaways, and implements changes. Because of this, flying is now incredibly safe.

(Shortform note: In The Art of Learning, tai chi world champion Josh Waitzkin argues that we improve only as quickly as we learn from our mistakes. He suggests that if you never repeated a mistake—learning from it after one instance—you'd grow incredibly quickly. While this is difficult in practice, Waitzkin advises that setting aside your ego can help. If we're too focused on appearing competent or we're embarrassed to admit mistakes, we'll never learn.)

We’ve progressed in traditionally scientific and technological areas, but modern society hasn’t thoroughly implemented rational, trial-and-error adaptation in the domains of social science—such as politics, education, law, and health care. In these domains, Syed says, we still operate from top-down paradigms that prevent improvement. For example, an economist might assert his theory as truth without testing it—but a molecular biologist wouldn’t do so without hard data to back up her claims.

(Shortform note: While Syed asserts that reason always drives progress, Daniel Schmachtenberger of the Neurohacker Collective says that our technological growth has outpaced our ability to act with wisdom and compassion. When we have technology that enables us to destroy the world (for example, nuclear weaponry), then we must become wise and sane enough to handle that technology. Soryu Forall, teacher at the Monastic Academy for the Preservation of Life on Earth, agrees: He argues that power without love and wisdom creates a world of confusion and, ultimately, self-destruction. As a solution, he suggests pursuing power, love, and wisdom through mindfulness practice and community building.)

Syed argues that the discrepancy between industry types is absurd because the social sciences are more complex than the hard sciences. For example, it’s much easier to model physical phenomena, like how a bridge will hold weight, than to model social phenomena, like how children learn and develop. Therefore, recognizing knowledge gaps and learning from failure is even more important in the social sciences.

Furthermore, Syed asserts that the top-down approach can no longer keep up with our changing world. For example, the proliferation of complex online social networks defies traditional top-down modeling, and understanding them requires swift adaptation. As long as we fail to do this, we’ll fail to understand our world and risk lives, resources, and potential.

(Shortform note: Syed seems to agree with the postmodernist perspective that a “metanarrative,” or complete integration of the disparate stories in some domain, is impossible. In other words, there’s no real way to create a top-down model of, say, American society, that fully coheres the myriad of perspectives, cultures, and ideas that compose the tapestry of America. So instead of pursuing totalizing models of knowledge, aim to develop practical understandings that draw from your own experience, learning to navigate what’s right in front of you instead of trying to understand the entirety of the complex, overlapping systems we live within.)

View Failure as Beneficial

According to Syed, creating individuals and organizations that learn from their mistakes starts with changing how we think about failure. By reframing failure as positive and beneficial, we can overcome the fear of erring and start learning from mistakes in our lives and our organizations.

To explain why changing your view of failure is the key, Syed refers to psychologist Carol Dweck's ideas of "growth mindset" versus "fixed mindset," which we unpack below.

(Shortform note: Whether you have a growth or fixed mindset isn’t all-or-nothing, and it can vary from domain to domain. For example, someone who becomes physically fit through trial-and-error might believe that the same approach wouldn’t work with mathematics or science. In this way, we often kneecap our growth in some areas of life even while growing in others. To avoid this, try taking your growth mindset from a strength of yours and applying it to weaker areas of your life—such as extending your skill in business thinking to emotional intelligence, or vice versa. Both are skills to be grown with the same underlying method: by learning from errors.)

Syed argues that a fixed mindset correlates with fragile self-esteem. People with a fixed mindset see failure as a reflection of their self-worth, so each mistake threatens to reveal their inadequacies—and they do everything they can to avoid failing for fear of looking foolish or incapable.

In contrast, having a growth mindset means understanding that errors are intrinsic to learning, so you regard difficulties—like a challenging work project or a struggling relationship—as chances to learn. People with a growth mindset expect failure and treat it as a tool.

(Shortform note: In Grit, Angela Duckworth argues that innate talent doesn’t predict success. Instead, “grit”—resilience and direction—better determines who sticks it out until they reach success, and who drops out early. Like a growth mindset, grit depends on the belief that we can learn from our mistakes. You can develop grit by identifying your natural interests, such as community building or permaculture, and then practicing them consistently, through all the obstacles that you’ll face.)

Syed cites research that measured mindset differences in terms of two brain signals: The "Error Related Negativity" signal (ERN) is the brain's automatic reaction to a mistake, while the "Error Positivity" signal (Pe) indicates conscious processing of the mistake. Participants with a growth mindset showed a markedly stronger Pe signal—they paid closer attention to their errors—and went on to improve their accuracy more than fixed-mindset participants did.

(Shortform note: A 2018 research paper suggests that intrinsic motivation is one key to the neuroscience of growth mindset. This is because individuals with growth mindsets act not to gain external rewards, but from an enjoyment of the learning process itself. In other words, intrinsic motivation drives their behavior. Further, the paper suggests that healthy dopamine functioning is the key neurological factor in intrinsic motivation, thus in growth mindset. When you experience a task as rewarding in and of itself, it means that your brain has associated a dopamine reward pathway with that activity. Growth mindset, then, might come from the coupling of dopamine reward pathways with the learning experience itself.)

Mindset Determines Organizational Culture

As with individuals, an organization can exhibit a growth mindset or a fixed mindset, and that mindset shapes its culture. Crucially, an organization’s systems can’t change until its culture does. Since mindset shapes culture, we first need to change our mindsets.

(Shortform note: In Tribal Leadership, Dave Logan argues that the best way to upgrade an organization is by upgrading its tribes. Each tribe within an organization exhibits a cultural stage, which is composed of particular patterns of language and behavior. The lower the stage, the less effective the tribe. To upgrade your organization, then, is to upgrade your tribe’s cultures. At a higher stage, the mindset is essentially, “work is better when we work together,” a teams-focused attitude that complements Syed’s focus on establishing an organizational growth mindset.)

Syed cites a study that surveyed employees at seven Fortune 1000 companies to determine the general mindset of each organization, and then compared the cultures of the fixed- versus growth-oriented companies.

Culture #1: At fixed mindset companies, employees reported a culture of fear. Workers feared retribution for any error, and they would often conceal failures to avoid blame. They noted the same about their colleagues, reporting that people often hid their mistakes, took shortcuts, and cheated to get ahead.

Because of this cultural style, the organizations couldn’t learn from their mistakes. As long as blame and shame followed mistakes, employees felt unable to take creative risks, innovate, and improve their companies.

(Shortform note: Dave Logan notes in Tribal Leadership that a certain kind of boss creates a workplace dominated by fear, apathy, and passive-aggression. Called “evil bosses” by Dilbert creator Scott Adams, they tend to treat lower-level workers as expendable and not worthy of respect or care. Because of this, those workers become disillusioned with work and they can’t be bothered to make productive, focused efforts. Overcoming this involves helping both the manager and the employees improve their language habits and relationship styles, shifting from a “life sucks” mindset to a “we’re great” mindset that promotes teamwork and respectful collaboration.)

Culture #2: At growth mindset companies, employees reported an open, honest, and collaborative culture. They trusted that the company would support them if they made mistakes, and they reported that higher-ups saw reasonable risks as “value added,” or as learning opportunities. In short, these companies embraced failure’s role in creativity and learning.

Further, Syed argues that these positive behaviors indicate that the growth mindset companies were well-positioned to adapt and succeed.

(Shortform note: Effective companies, Dave Logan argues, exist at a Stage 4 tribal culture. At Stage 4, a tribe aligns around their core values and a collectively crafted vision. They also build triadic relationships—three people who all support one another. The values and vision give the team a clear sense of how to make decisions: If a choice would infringe on those values, you wouldn’t make it. The triadic relationships create a stable social and work network that maintains itself—when a conflict arises, the third person can often help to mediate, and all three people have shared values to lean on in tough cases. Altogether, this allows the team to focus on valuable, innovative work.)

Handle Mistakes Fairly

To change an organization’s culture from a fixed mindset to a growth mindset, you need to handle mistakes fairly. This creates a culture where employees can trust the higher-ups—the people who decide what happens when mistakes occur.

Handling mistakes fairly prevents an environment of fear—as we explained in Part 3, scapegoating causes fear and incentivizes employees to conceal their mistakes. Since this prevents both genuine errors and real negligence from coming to light, the first step is to stop scapegoating.

When scapegoating stops and mistakes can come to light, treat them all justly. No matter the mistake, fair leadership conducts a thorough analysis of the error and doles out the appropriate consequence. If the mistake turns out to be genuine, like emailing a report to the wrong supervisor, the employee can learn from it. If the mistake turns out to be negligence, like a surgeon coming into work on cocaine, leadership should mete out an appropriately harsh punishment.

When leadership responsibly investigates errors, employees feel that they can make honest mistakes without fear of retribution. At the same time, punishing real negligence ensures that workers remain diligent and honest, since they know leadership will justly punish unethical behavior.

Make Difficult Decisions That Serve the Tribe

Oftentimes, leadership must make difficult decisions that aren’t great for every individual involved. Layoffs, budget cuts, and project cancellations often raise hackles, but leaders must make these sorts of decisions.

In Tribal Leadership, Dave Logan and coauthors suggest that a great leader must understand how to put the tribe first under these difficult circumstances. Acting from the tribe’s core values and shared vision, she might need to put big egos in their place, part ways with employees who no longer align with those values, or shut down stagnating projects.

The key is to listen thoroughly to the tribe, much like Syed suggests. By paying attention to the range of perspectives people bring to the table, and noticing whether they’re acting in the tribe’s best interests or their own, a leader learns to “sniff out” self-interested individuals who could threaten the tribe’s success. Acting in this way, she maintains the tribe’s trust by showing that she has its best interest at heart, even when she needs to make tough choices.

Create a Learning System

After changing an organization’s culture, the next step is to change its systems. (Shortform note: Syed offers loose ideas more than practical wisdom. To supplement his thinking, we’ve interspersed commentary from The Fifth Discipline by Peter Senge, which discusses learning organizations.) Syed offers the following four ideas:

1. Create better feedback mechanisms: As we explained in Part 3, feedback enables learning—you can't see your mistakes without it. Since many organizations lack a built-in error signal, a key step is to create a mechanism that captures that feedback. For example, local governments could hold weekly sessions with community members to learn how well they're serving the community.

The tighter the coupling between action and feedback, the faster you can learn, because more signals mean more to learn from. For example, a mayor who coasts through his term won't learn much—his inaction generates no feedback, and without feedback there's no learning.

(Shortform note: An easy way to gather feedback is to record video of what you’re doing. Josh Waitzkin writes in The Art of Learning that he recorded his tai chi practice sessions to identify mistakes and refine his form. Similarly, imagine recording video of a community meeting, then holding a retrospective with key players to analyze how things went and what you could do better. That way, you can tangibly identify mistakes—for example, maybe your dialogues are dominated by the most extroverted community members—and fix them. In this case, you might explore new conversational modalities that hold space for quieter individuals to express their perspectives.)

2. Prototype adaptive systems: To select for the best learning systems, Syed suggests creating numerous systems and pitting them against each other. This system of systems enables the best individual systems to rise to the top.

Syed observes that the market system mimics natural evolution: Individual businesses compete, the fittest survive, and the rest "die" (go bankrupt). The survivors are those that best fit the demands of the modern world. Today, business fitness means agility—lean startups (à la the MVP from Part 2) outcompete companies that cling to slower, older ways of doing business, making their style of system better adapted to the market environment.

(Shortform note: In 1968, geneticist Motoo Kimura proposed his "neutral theory" of gene variation, arguing that biodiversity comes not just from "survival of the fittest," but also from luck. Chance events can kill off some genes while others survive, such that luck partially determines which variants exist today. Similarly, random chance plays a role in business competition: While "fitness" (such as business acumen, agility, and experience) plays a role, any business can experience significant random events. For example, a key leader could come down sick during an important deal. Or a founder could reach out to a potential investor on a great day, catching that investor in a good mood that helps them work out a great collaboration—largely by chance.)

3. Use pilot programs: In politics, Syed suggests running pilot programs—small-scale tests of policy changes or business ideas that allow you to get feedback safely and inexpensively. Be sure to set up pilots that mimic real-world conditions: By staffing them with a mix of experienced and inexperienced employees, you'll get a sense of how an actual team would perform under your program. Avoid setting pilots up for success by staffing them with only your best people, tools, and facilities—if you guarantee success, you won't learn anything.

(Shortform note: In 2019, the US Congress passed the Foundations for Evidence-Based Policymaking Act, which stipulates that lawmakers and federal agencies must begin developing statistical evidence to support policies. While evidence-based pilots were once thought to be too expensive or time-consuming, the rise of agile methods in software engineering and in the broader business world has shifted that attitude. Now, numerous federal agencies, from the FDA to the Department of Labor, have begun running pilots, much like Syed suggests.)

4. Use “pre-mortems” to strategize: Before launching projects, Syed suggests running a “pre-mortem,” which improves outcomes by helping you discover the flaws in a plan before implementing it.

To perform a pre-mortem, ask your team to imagine that the plan has failed completely—your campaign crashed and burned, your product flopped, and so on. Then, work backward by asking what went wrong. Typically, discussing these failure modes enables the team to improve its plan and better avoid failure—in fact, research on this "prospective hindsight" technique finds that it increases the ability to correctly identify reasons for future outcomes by 30%.

(Shortform note: In The Obstacle Is the Way, Ryan Holiday suggests that pre-mortems are also a useful tool for personal growth. Hardships, he says, are inevitable. By expecting that bad things will happen sooner or later, we better prepare ourselves for them. And by actively anticipating what could go wrong (such as with your workday, a date, or a difficult conversation), you improve your chances of success. Specifically, he suggests thinking up as many possibilities for failure as you can, and then creating a contingency plan for each.)

Case Study: Virginia Mason Becomes a Top-Ranked Hospital

In Part 3, we explained that many hospitals lack the right cultures and systems to learn from failure. But Syed points out that failure-averse organizations can change, as demonstrated by the Virginia Mason hospital in Seattle, Washington.

Like many other hospitals, Virginia Mason had cultural and systemic flaws that caused undue patient harm. When Gary Kaplan took over as CEO, he decided to implement an error reporting process, reasoning that it would reduce mistakes.

But the new process fell short—Kaplan had tried to change the system before addressing the culture. Virginia Mason’s culture stigmatized failure, so few employees used the new reporting system. No one wanted to expose their own mistakes, and junior workers didn’t dare report their seniors.

(Shortform note: Dave Logan explains in Tribal Leadership that when employees exist at a low-stage tribal culture—similar to that at Virginia Mason—they’re often unmotivated by top-down change. Typically, they view it as a ham-fisted attempt to change “the way things are,” or as management intruding upon their norms. The solution is to change the culture: Learn to recognize your employees’ cultural stage, and then listen, empathize, and build relationships with them and between them. Gradually, you’ll help them feel valued, and in this way you can lift the organization up from the bottom.)

To overcome the rejection of his reporting system, Kaplan needed to change the culture. When a woman died in surgery from an avoidable mistake—she received a toxic substance that was mistakenly left next to her medicine in an identical container—Kaplan owned up to the failure. He issued a full explanation and made amends with the woman’s family.

In leading by example, Kaplan demonstrated to employees that they could own up to mistakes without fear of retribution. This catalyzed a cultural shift in the hospital, and employees began using the error reporting system.

(Shortform note: Logan also explains that the tribe follows the leader, as demonstrated above by Kaplan. The more a tribal leader builds strong relationships and improves trust between himself and his team, the more they'll naturally follow his lead. This is because the tribe respects and values a leader who engages with them as people, not just as workers. Much like Virginia Mason, executives at Griffin Hospital in Connecticut led their organization to success by catalyzing the culture to improve their systems.)

Reports began to reveal flaws in Virginia Mason's prescription system and the wristband system used to categorize patients. Now knowing where the errors lay, Kaplan's team developed and implemented simple changes—reducing illegible prescriptions and adding descriptive text to patient wristbands to distinguish patients' conditions.

Working with thousands of error reports, Virginia Mason gained the data it needed to reduce errors, and it now ranks as a top US hospital. Its insurance liability has fallen, and it has received awards for clinical excellence and outstanding patient experience.

(Shortform note: An organization’s culture must remain strong for it to remain effective. Executives at Griffin Hospital also made systemic changes, using surveys and focus groups to gather data, yet they never ceased to fully involve the lower-level staff. One powerful method they used was to directly engage the staff about their own vision for the hospital’s success. In asking employees—not just telling them—they created a culture that was invested in their shared vision.)

Exercise: Practice Viewing Failure as Positive

When we view failure negatively, it hinders our ability to learn. Practice seeing your mistakes in a positive light.