Normal Accidents analyzes the social side of technological risk. Charles Perrow argues that the conventional engineering approach to ensuring safety--building in more warnings and safeguards--fails because system complexity makes failures inevitable. He asserts that typical precautions, by adding to complexity, may help create new categories of accidents. (At Chernobyl, tests of a new safety system helped produce the meltdown and subsequent fire.) By recognizing two dimensions of risk--complex versus linear interactions, and tight versus loose coupling--this book provides a powerful framework for analyzing risks and the organizations that insist we run them.
The first edition fulfilled one reviewer's prediction that it "may mark the beginning of accident research." In the new afterword to this edition, Perrow reviews the extensive work on the major accidents of the last fifteen years, including Bhopal, Chernobyl, and the Challenger disaster. The new postscript probes what the author considers to be the "quintessential 'Normal Accident'" of our time: the Y2K computer problem.
These are matters of common sense, applied to simple questions of cause and effect. But what happens, asks systems-behavior expert Charles Perrow, when common sense runs up against the complex electrical and mechanical systems with which we have surrounded ourselves? Plenty of mayhem can ensue, he replies. The Chernobyl nuclear accident, to name one disaster, was partially brought about by the failure of a safety system as it was being brought on line, a failure that touched off an unforeseeable and irreversible chain of disruptions. The less severe but still frightening accident at Three Mile Island, similarly, came about through small errors that were insignificant in themselves but snowballed to near-catastrophic results.
Only through such failures, Perrow suggests, can designers improve the safety of complex systems. But, he adds, those very improvements may introduce new opportunities for disaster. Looking at an array of real and potential technological mishaps--including the Bhopal chemical-plant accident of 1984, the Challenger explosion of 1986, and the possible disruptions of Y2K and genetic engineering--Perrow concludes that as our technologies become more complex, the odds of tragic results increase. His treatise makes for sobering and provocative reading. --Gregory McNamee