Elvis Chidera

How Complex Systems Fail — Paper Summary

summary, paper, complexity, system · 1 min read

Today's summary is of a paper written by Richard I. Cook, MD, in 2002.

The paper already conveys its key ideas succinctly; they are reproduced below.


  1. Complex systems are intrinsically hazardous systems.
  2. Complex systems are heavily and successfully defended (technically and operationally) against failure.
  3. Catastrophe requires multiple failures – single-point failures are not enough.
  4. Complex systems contain changing mixtures of failures latent within them.
  5. Complex systems run in degraded mode: a corollary of (4) is that complex systems run as broken systems.
  6. Catastrophe is always just around the corner.
  7. Post-accident attribution of an accident to a ‘root cause’ is fundamentally wrong: There are multiple contributors to accidents. Each of these is necessary but insufficient in itself to create an accident.

    The evaluations based on such reasoning as ‘root cause’ do not reflect a technical understanding of the nature of failure but rather the social and cultural need to blame specific, localized forces or events for outcomes.

  8. Hindsight biases post-accident assessments of human performance: Knowledge of the outcome makes it seem that events leading to the outcome should have appeared more salient to practitioners at the time than was actually the case.
  9. Human operators have dual roles: as producers & as defenders against failure: The system practitioners operate the system in order to produce its desired product and also work to forestall accidents.
  10. All practitioner actions are gambles: that is, acts that take place in the face of uncertain outcomes.
  11. Actions by practitioners at the sharp end resolve all organizational ambiguity.
  12. Human practitioners are the adaptable element of complex systems.
  13. Human expertise in complex systems is constantly changing as technology changes and people move.
  14. Change introduces new forms of failure: When new technologies are used to eliminate well-understood system failures or to gain high-precision performance, they often introduce new pathways to large-scale, catastrophic failures. Frequent, low-consequence failures are traded for rare, high-consequence ones.
  15. Views of ‘cause’ limit the effectiveness of defenses against future events: Instead of increasing safety, post-accident remedies usually increase the coupling and complexity of the system. Past failures are over-indexed even though they are unlikely to recur in the same form.
  16. Safety is a characteristic of systems and not of their components.
  17. People continuously create safety.
  18. Failure-free operations require experience with failure.