Friday, November 17, 2023

"When Systems Fracture: On the tendency of advanced technology to promote self-deception"

From Harvard Magazine, November-December 2001:

As the enormity of September 11 sank in, a fertilizer factory near Toulouse, France, exploded, killing 29 people and hospitalizing at least 780. The first event was terrorism; French authorities say the other was almost certainly an accident. But think of the technological vulnerabilities the assailants exploited: hundred-story towers with single exits, jumbo jets loaded with fuel. Like potentially explosive chemical plants, these engineering landmarks have become part of the fabric of advanced industrial society.

The independent writer James R. Chiles '77 completed Inviting Disaster: Lessons from the Edge of Technology after the happy outcome of the Year 2000 crisis had calmed technological anxiety. Security against attack is not his major concern. But his book reminds us that even without lethal fanaticism, the human-made world is more dangerous than ever. Technological risk has not vanished, and indeed the number of disasters and fatalities has multiplied. There are more potentially hostile states with intercontinental ballistic missiles--meaning more risk of rapid launches and false warnings like those that indicated Soviet attacks in 1979 and 1980. Chemical plants and electric-generating systems operate at unprecedented pressures and temperatures. Superjumbo aircraft and 10,000-container ships are on the way. Freak accidents can cripple the most sophisticated technology. Last year, an 18-inch strip of titanium on a Paris runway triggered a chain reaction of failures in an Air France Concorde, leading to flames, loss of control, and 113 deaths.

Chiles sees these catastrophes as "system fractures," comparing technical defects to the tiny cracks that appear in newly milled aluminum. Harmless at first, they can grow and propagate, especially if the surface is allowed to corrode or is cut the wrong way, as was demonstrated when square windows, originally a stunning design innovation, inadvertently promoted a series of tragic in-flight breakups that doomed the de Havilland Comet, and with it the British lead in commercial jet aviation. The safety of complex systems demands a series of technical and human measures to keep the cracks from spreading. Inviting Disaster is not just an anatomy of failure but a study of successful "crack-stopping." Thus, while acknowledging the contributions of the sociologist Charles Perrow, who has argued that some technologies make disasters almost inevitable, Chiles turns more to the classic study of an aircraft carrier as a "self-designing high-reliability organization" by the political scientists Gene Rochlin and Todd LaPorte and the psychologist K.H. Roberts.
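
The mechanics behind the Comet story reward a line of algebra. A standard fracture-mechanics result, the Inglis stress-concentration relation (a textbook fact, not something from the review itself), gives the peak stress at a notch of half-length $a$ and tip radius $\rho$ in a plate under remote stress $\sigma$:

\[
\sigma_{\max} = \sigma\left(1 + 2\sqrt{a/\rho}\right)
\]

As $\rho \to 0$--the nearly sharp corner of a square window cutout--the local stress grows without bound, so every pressurization cycle worked the metal at the Comet's window corners far harder than the nominal cabin loads suggested. Rounded windows keep $\rho$ large, which is one of the simplest forms of crack-stopping there is.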

Inviting Disaster takes a fresh approach to familiar tragedy. We think we know all about the Challenger disaster, but Chiles shows how many of the same problems, especially the cost and deadline pressures that left known technical problems unresolved, helped doom a British airship, the R.101, more than 50 years earlier: pathologies of national prestige technology.

Even without political demands for results, failure also awaits organizations that neglect testing. Chiles tells the stunning story of the Newport Torpedo Station in Rhode Island and its miracle weapon, a proximity fuse triggered for maximum damage by an enemy ship's interaction with the earth's magnetic field. Never tested under battle conditions during the low-budget interwar years, the Mark 14 torpedo failed to explode when finally used in World War II. Its contact detonator, too, failed on impact. As the Hubble Space Telescope later showed, only rigorous testing can check the tendency of advanced technology to promote self-deception. Nor can we count on even highly trained men and women to react correctly under stress; hypervigilance and fixation on assumed solutions can easily defeat common sense. And both machine performance and human judgment can degrade disastrously and suddenly when certain thresholds--in the machines' case, marked by red lines on gauges--are exceeded.

Some organizations have been better able than others to deal with perils like these. Chiles calls them "crack-stopper companies." They encourage employees to admit and report their mistakes without fear of reprisals. They assign teams of specialists to comb assembled aircraft for loose parts, debris, and tools that could later prove fatal. In case of doubt, they assume the worst, tearing down and reconstructing systems as Admiral Hyman Rickover, one of the author's heroes, did when nuclear submarines were under construction and the tubing's compliance with specifications could not be verified....

....MUCH MORE