Hello Reader,

My wife Taylor hosts a podcast called Doomed to Fail with her co-host Farz. The premise: take history’s most notorious disasters and epic failures, analyze the red flags, and ask the uncomfortable questions: How did things go so wrong? Could this have been avoided?

They invited me to talk about a mental model that answers both questions. It’s called second-order thinking. And I wanted to share the core ideas with you here, because this pattern is everywhere.

The Scene That Explains Everything

September 27, 1986. Cleveland, Ohio. 1:50 PM.

A structure the size of a city block sits in Public Square, draped in mesh netting. Beneath it, 2,500 volunteers have spent hours filling balloons with helium. The goal: two million. They stopped at 1.5 million when the weather started turning.

The net lifts. The balloons erupt upward, wrapping around Terminal Tower like a multicolored second skin. Over 100,000 people are watching. Cameras flash. It’s the largest balloon release in history, shattering Disneyland’s record. A world record for charity.

For thirty seconds, it’s magic.

Then the cold front arrives. The balloons don’t rise and disperse as planned. They collide with cool air and rain and plunge back toward earth, still inflated, clogging streets, blanketing Lake Erie, shutting down the airport runway for half an hour.

And on Lake Erie, where two fishermen (Raymond Broderick and Bernard Sulzer) had capsized the day before, Coast Guard helicopters are searching. They can’t distinguish the men’s bodies from thousands of rubber balloons floating on the surface. Both bodies wash ashore days later.

The United Way of Cleveland wanted to inspire a city. They planned for months. They meant well. Raymond Broderick’s wife sued for $3.2 million. They settled for an undisclosed amount.

Two men are dead because of balloons.

The Mental Model: Second-Order Thinking

There’s a tool that’s supposed to prevent exactly this. It’s called second-order thinking. Before you act, ask “and then what?” Trace the consequences. Don’t just see the first move; see the second, the third, the cascade.

The organizers of Balloonfest had access to this tool. They were smart. They planned for months. They built a structure the size of a city block. They coordinated 2,500 volunteers. They just never asked: What happens after the balloons go up?

Not because they forgot. Because they were certain. This was for charity. This was going to save Cleveland’s reputation. What could possibly go wrong with balloons?

That certainty is what disarms the safety mechanism. When you know you’re doing good, you stop looking for evidence you might be wrong. Second-order thinking requires doubt. And doubt feels like disloyalty to the cause.

Why Use It

Good intentions don’t prevent consequences. They accelerate them.

The worst disasters in history weren’t caused by villains. They weren’t caused by negligence. They were caused by people who were sure they were helping.

The Titanic wasn’t sunk by someone who didn’t care about passenger safety. It was sunk by people so confident in their engineering that they didn’t pack enough lifeboats. Chernobyl didn’t melt down because the operators were reckless. It melted down because they were running a safety test.

Same pattern. Every time. The road to catastrophe isn’t paved with bad intentions. It’s paved with unexamined good ones.
The Three Traps

In the podcast, I walk through three stories that reveal three distinct ways certainty disarms second-order thinking.

The Permission Trap (Balloonfest ’86)

The story they told themselves: “This is for charity. What could possibly go wrong with balloons?”
The permission: A good cause is exempt from scrutiny.

First-order thinking asks: What happens when we release the balloons? Answer: They go up. People cheer. We set a record.

Second-order thinking asks: And then what? Where do they come down? What else is happening that day? Who else might be affected?

The organizers weren’t dumb. They were thorough. They just stopped one step too soon.

The Closed System Trap (Biosphere 2)

Eight people seal themselves inside a $150 million glass structure in the Arizona desert. Three acres of engineered ecosystem. The mission: prove humans can live in a closed, self-sustaining environment. They had everything modeled. Rainforest. Ocean. Savanna. Two years of complete self-sufficiency.

Sixteen months in, oxygen levels had dropped from 20.9% to around 14%, equivalent to living at 15,000 feet elevation. The crew could barely function. Tank trucks started driving up the access road, pumping in liquid oxygen.

The cause? The enriched soil fed bacteria that consumed oxygen faster than plants could produce it. And the concrete was absorbing CO2, locking oxygen away as calcium carbonate.

The story they told themselves: “We’ve modeled it. The ecosystem will balance itself.”
The permission: Expertise equals certainty. Complex systems can be engineered like machines.

They treated a living ecosystem like a watch. They built a watch. They got a wilderness. And the wilderness ate their air.

The Map Trap (WHO in India)

1970s. The WHO arrives in India to eradicate smallpox, a disease killing one in four who contracted it. They had the vaccine. They had the plan. Ring vaccination had worked in Brazil, Indonesia, and across Africa.

And they ran into resistance they never anticipated. In some regions, the harder the campaign pushed, the more communities evaded the teams. People hid their sick. Trust eroded.

What changed? Someone finally asked: Why are some people resisting? They learned about Shitala Mata, the goddess associated with smallpox. They learned about caste dynamics and purdah customs. They learned that male vaccinators couldn’t reach women in conservative households.

The story they told themselves: “We have the vaccine. Resistance is a problem to be overcome.”
The permission: Our map is accurate. Reality should comply.

The map was medically correct. The territory was socially, religiously, and historically layered in ways the map didn’t capture. Success came when they adapted by working with local health workers, female vaccinators, and community leadership. Same vaccine. Same goal. Different implementation. Different outcome.

When to Use It

Before any decision where you feel certain, especially one that feels obviously right. The feeling of completeness is the trap. You’ve answered the question. You’ve solved the problem. The balloons go up. What more is there to consider? That’s exactly when you need to ask: And then what?

How to Use It

Before your next decision, ask: If this works exactly as planned, what happens next? And what happens after that? Who might be affected that I haven’t considered?

Or even simpler: If this works perfectly, what breaks?

Before you approve any plan (especially one for a “good cause”), write down three ways it could succeed and cause harm.
Three ways the first-order win creates a second-order loss. If you can’t think of any, you haven’t thought hard enough. The inability to imagine failure is the first sign you’re in the danger zone.

Run a pre-mortem. Assume the plan worked and caused damage anyway. What went wrong?

Next Step

This week, take one decision you’re confident about. Write down what happens if it works perfectly, then write down what breaks as a result. That single question (what breaks?) is the difference between planning with confidence and planning with rigor.

Where It Came From

Second-order thinking traces back to systems theory and the study of unintended consequences. The term “Cobra Effect” was coined by economist Horst Siebert in 2001 to describe interventions that make problems worse, such as the bounty for dead cobras in British India, which allegedly led to cobra breeding. The principle appears across domains: in ecology (invasive species introduced to solve one problem creating new ones) and in economics (Prohibition creating organized crime). Howard Marks popularized the concept in investing, distinguishing first-level thinkers (“This is a good company, let’s buy”) from second-level thinkers (“Everyone thinks it’s good, so it’s overpriced”).

You can listen to the full episode of Doomed to Fail wherever you get your podcasts.

Until next time, keep questioning. Your mind is the last territory you truly control.

Think Independently,
JC

Share or Join 👉
Re:Mind is a weekly newsletter exploring mental models and frameworks that help you think clearly and make better decisions. Each week, I share practical insights and tools that transform complex ideas into wisdom you can apply immediately. Join me in making better decisions, together.