Ever wonder why some people just keep hitting the same wall over and over? It’s frustrating. You see it in offices, in relationships, and definitely in high-stakes industries like aviation or medicine. Matthew Syed’s book, Black Box Thinking, basically rips the lid off why we’re so bad at failing. Honestly, most of us are programmed to hide our screw-ups because they feel like a bruise to the ego. But Syed argues that if we don't treat our failures like a flight data recorder—a "black box"—we’re doomed to repeat the same disaster.
Failure sucks. Nobody likes it. But there is a massive difference between a "closed loop" and an "open loop" when things go wrong.
In a closed loop, failure doesn't lead to progress because the information is filtered out or ignored. In an open loop, the failure is actually the fuel for the next success. Syed uses the contrast between the airline industry and the healthcare system to show exactly how this plays out in the real world. One saves lives by obsessing over errors; the other, historically, has struggled because of a culture of "perfect" expertise that doesn't allow for doubt.
The Massive Gap Between Aviation and Healthcare
Think about the last time you flew. You probably didn't think twice about the plane crashing. That’s because aviation has a legendary safety record. Why? Because they treat every single mistake as a data point. When a plane goes down, or even when there's a "near miss," the industry doesn't just blame the pilot and move on. They find the black box. They analyze the flight data. They change the entire industry’s protocols based on that one error. It’s a systemic approach to learning.
Compare that to the story of Elaine Bromiley, which Syed details early in Black Box Thinking.
Elaine went in for a routine sinus surgery. It should have been simple. But things went sideways with the anesthesia. Her airway collapsed. Instead of following the established protocol for a "can't intubate, can't ventilate" scenario, the doctors—who were highly skilled and well-meaning—succumbed to "fixation error." They kept trying the same failed tactic until it was too late. Elaine suffered catastrophic brain damage and died days later.
The kicker? The nurses in the room actually knew what was happening. They brought the emergency equipment to the bedside. But the hierarchy was so rigid that they didn't feel they could challenge the senior surgeons. That is a closed loop. The mistake was buried under the guise of "complications" rather than being dissected to prevent the next one. This isn't about bad people; it's about a bad system that protects the ego over the patient.
Why Your Brain Hates Being Wrong
It’s called cognitive dissonance.
When we're confronted with evidence that we made a mistake, it creates mental discomfort. To get rid of that feeling, we often ignore the evidence or spin a story to justify our actions. We’ve all done it. "It wasn't my fault, the market shifted," or "They didn't give me the right tools."
Syed points out that the more "expert" we become, the harder it is to admit failure. This is the paradox of professional success. If you’ve spent twenty years becoming a top-tier surgeon or CEO, admitting a basic error feels like your whole identity is under attack. So, you rationalize. You ignore the "black box" data in your own life.
The Power of Marginal Gains
You’ve probably heard of David Brailsford. He took over Team Sky (the British professional cycling team) and became obsessed with "marginal gains." The idea is simple: break a process into its component parts, improve each one by just 1%, and those small gains compound into a significant overall improvement.
They didn't just look at better bikes. They looked at:
- The pillows that gave riders the best sleep.
- The most effective way to wash hands to avoid infection.
- The massage gels that helped muscles recover fastest.
This is Black Box Thinking in action. It’s the willingness to look at tiny, boring details and test them. If something didn't work, they ditched it. They didn't care about "tradition" or "how it's always been done." They cared about what the data said.
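To make the "cumulative effect" concrete, here's a minimal sketch of how small gains compound. The component count and the 1% figure are just illustrative assumptions, not numbers from the book:

```python
# Illustrative only: how hypothetical 1% gains compound across many
# small components of a process. The component count and gain size
# are made-up examples, not figures from Black Box Thinking.

def compound_gain(per_component_gain: float, components: int) -> float:
    """Return the overall multiplier after stacking small gains."""
    return (1 + per_component_gain) ** components

# A 1% improvement in 50 separate details...
print(compound_gain(0.01, 50))   # ~1.64 -> roughly 64% better overall

# ...while a 1% decline in each of those details compounds the other way.
print(compound_gain(-0.01, 50))  # ~0.61 -> roughly 39% worse overall
```

The point isn't the exact numbers. It's that the same math punishes you when tiny details quietly get worse, which is why the team tested everything.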
Complexity and the Narrative Fallacy
The world is messy. Most of the time, we try to simplify it into neat stories where A led to B. But Syed argues that in complex systems, we can't predict outcomes perfectly. We have to use a process called "trial and error," but on steroids.
Take the development of the Unilever detergent nozzle.
Mathematicians tried to design the "perfect" nozzle using complex fluid dynamics. It failed. It kept clogging. So, what did the scientists do? They took a nozzle that was "okay" and made ten slightly different versions of it. They tested them all. They took the best one of those ten and made ten more variations. After 45 generations of this, they had a nozzle that was incredibly efficient—even though they couldn't mathematically explain why it worked better than the original.
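The loop Syed describes is essentially variation and selection: generate a handful of slightly different versions, test them all, keep the winner, repeat. Here's a hedged sketch of that idea, where the "nozzle" is just a list of parameters and the scoring function is a stand-in for a physical spray test (both are assumptions for illustration, not Unilever's actual process):

```python
import random

# Sketch of the variation-and-selection loop described above.
# score() is a hypothetical stand-in for a real-world performance test.

def score(nozzle: list[float]) -> float:
    # Pretend efficiency peaks when parameters are near some unknown target.
    target = [0.3, 0.7, 0.5, 0.9]
    return -sum((p - t) ** 2 for p, t in zip(nozzle, target))

def mutate(nozzle: list[float]) -> list[float]:
    # Make a slightly different copy: small random tweaks to each parameter.
    return [p + random.gauss(0, 0.05) for p in nozzle]

best = [0.5, 0.5, 0.5, 0.5]          # the "okay" starting design
for generation in range(45):         # 45 rounds, as in Syed's account
    variants = [mutate(best) for _ in range(10)]
    # Keep whichever design performs best, including the current one.
    best = max(variants + [best], key=score)

print(f"best design after 45 generations: {best}")
```

Notice that nothing in the loop explains *why* the winning design works. It only needs a reliable test and the willingness to discard what loses.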
This flies in the face of the "Great Man" theory of history. We like to think geniuses sit in a room and have a "Eureka" moment. Usually, they just failed more times than you, faster than you, and actually kept track of why they failed.
Creating a Growth Culture
So, how do you actually apply this? It starts with psychological safety. If you’re a leader and you punish people for making mistakes, they will start hiding them. Guaranteed.
In Black Box Thinking, the emphasis is on separating "culpable negligence" from "honest mistakes." If someone is reckless, that’s one thing. But if someone makes a mistake in a complex environment, that is a learning opportunity for the whole organization.
- Ditch the Blame: When something goes wrong, ask "What happened?" instead of "Who did it?"
- Test Your Assumptions: Don't just trust your gut. Run small experiments.
- Redefine Failure: Treat it as a necessary step in the discovery process, not an indictment of your intelligence.
The Problem with "Expert" Intuition
We put a lot of weight on "gut feelings." But Syed is skeptical. Intuition is really only valuable in environments that are consistent and provide immediate feedback—like chess or firefighting. In the world of business or medicine, where feedback loops are often delayed or muffled, "expert intuition" is frequently just overconfidence.
Evidence-based practice is the only way out. This means looking at the cold, hard numbers even when they tell you that your favorite project is a dud. It's about being willing to be wrong today so you can be right tomorrow.
Moving Toward an Open Loop Life
The reality is that most of us are living in a closed loop. We make excuses. We blame the "system." We protect our egos.
Breaking out of that requires a fundamental shift in how you view yourself. You aren't your successes, and you certainly aren't your failures. You are the process of learning. If you can adopt the mindset found in Black Box Thinking, you stop seeing setbacks as disasters. They become signals.
Start small. Look at a recent project that didn't go as planned. Don't look for someone to blame. Don't tell yourself a story about why it "wasn't that bad." Actually sit down and find the data. What was the specific point where things deviated from the plan? What was the "black box" recording saying while you were busy telling yourself everything was fine?
Real-World Action Steps
- Audit your failures. Literally keep a "failure log." It sounds depressing, but it’s the only way to see patterns. If you don't write it down, your brain will "edit" the memory to make you look better.
- Encourage dissent. If you lead a team, reward the person who finds a flaw in your plan. You need people to feel safe telling you the "plane" is heading for a mountain.
- Run Pre-mortems. Before starting a project, imagine it has already failed. Ask everyone, "Why did this fail?" This bypasses the optimism bias and helps you find the black boxes before the "crash" even happens.
- Focus on the process, not just the goal. If the process is sound and incorporates feedback, the results will eventually come. If the process is a closed loop, even a "success" is just a fluke you won't be able to repeat.
Stop hiding the black box. Open it up, look at the wreckage, and use it to build something better. It’s the only way to actually get where you’re trying to go.