Book Notes

Notes and ratings on books I've read

Challenger

by Adam Higginbotham

8/10
Read: September 2025
Technology · History · Science

Higginbotham, an authority on space history, argues that the failures of NASA's Space Shuttle program stemmed from compounding trade-offs and cultural blind spots rather than any single decision. The book explores acceptable risk as a design choice, the power of prototyping culture, the role of feedback loops in complex systems, and the influence of organizational culture on decision-making.

Overview

Adam Higginbotham traces the full arc of America's Space Shuttle program, from the scrappy ingenuity of early NASA engineering through the organizational failures that led to the Challenger and Columbia disasters. The book is a masterclass in how complex systems fail: not because of a single bad decision, but because of compounding trade-offs, cultural blind spots, and the slow normalization of risk. If you care about how organizations design for safety (or don't), this is essential reading.


Key Takeaways

  • "Acceptable risk" is a design decision. NASA knowingly chose a pure-oxygen atmosphere despite documented flammability risks. The language of "acceptable risk" masked what was really happening: a trade-off between engineering safety and schedule pressure. In healthcare, we see this pattern constantly. The question isn't whether risk exists; it's whether the people making the trade-off are the same ones bearing the consequences.
  • Prototyping culture drives innovation. Max Faget, the engineer behind the Shuttle's aerodynamic concept, tested ideas by throwing balsa wood models and paper plates across rooms. His willingness to prototype cheaply and iterate fast produced breakthroughs that formal processes alone never would have. This is the core of design thinking: make it tangible, test it early, learn before the stakes are high.
  • Systems fail when feedback loops break. NASA's simulation teams never designed scenarios with no escape. "All-up testing" skipped component-level validation entirely. These choices eliminated the very feedback mechanisms that could have surfaced problems early. In any complex system (spaceflight, emergency medicine, health IT), resilience depends on designing honest feedback into every layer.
  • Culture shapes what people are willing to say. From John Glenn's dismissal of women in spaceflight to the organizational dynamics that silenced dissent before both shuttle disasters, the book shows how social norms determine which information reaches decision-makers. Facilitating conditions aren't just about tools and training; they're about whether people feel safe raising concerns.

Connections

  • Human-centered design in high-stakes environments. The book reinforces that "user error" is almost always a system design failure. Connects directly to Don Norman's work on affordances and the gulfs of execution and evaluation.
  • UTAUT and technology adoption. NASA's adoption of the Shuttle as an operational vehicle (before it was truly ready) mirrors patterns in health IT adoption: performance expectancy was high, but facilitating conditions and social influence distorted honest risk assessment.
  • Normalization of deviance. Diane Vaughan's framework (developed from studying the Challenger disaster itself) is the bridge between this book and implementation science. When workarounds become routine, the system loses its ability to detect drift toward failure.
  • AI scribe parallels. "All-up testing" has echoes in how AI tools get deployed in clinical settings: ship the full system, skip the component-level validation, hope the integration holds. Worth revisiting when writing about adoption barriers.

Quotable

"In the complex web of engineering compromises necessary to build a capsule light and practical enough to carry three men to the moon, on an almost impossible deadline, but without unduly endangering the lives of the crew, this was deemed an 'acceptable risk.'" "The supervisors never devised scenarios from which they knew there was no escape, or would inevitably end in what NASA engineers referred to as 'loss of vehicle, mission and crew': certain death."


Recommended for: Anyone working in complex systems, patient safety, implementation science, or organizational design. Essential for designers and clinicians who want to understand how catastrophic failures emerge from reasonable-seeming decisions.
