
Editor's Note: Accidents

Officials found the nuclear warhead in a ditch in Arkansas. The warhead had detached from its Titan II intercontinental ballistic missile (ICBM) when the rocket exploded, and it lay harmless in a field, 100 feet from the silo in which it had been stored.

Approximately 18 hours earlier, a technician performing routine maintenance on the ICBM had dropped a wrench that punctured one of the missile’s fuel tanks, leading to a frantic effort to stop the fuel from mixing with the oxidizer and igniting the rocket.

The effort failed and the rocket exploded, killing one person and injuring 21 others. But the worst-case scenario did not happen. The nine-megaton warhead, which had three times the explosive force of all the bombs dropped during World War II, including the atomic bombs, did not detonate.

In retrospect, the events leading up to the explosion should have served as a warning. An overworked crew was required to perform last-minute maintenance, a lack of parts caused delays, and new protocols required different tools: a torque wrench instead of a ratchet.

The incident and its causes are explored in depth in a recent documentary film based on the book Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety by Eric Schlosser.

“Again and again...you find an effort to blame the person who dropped the wrench, who used the wrong tool at a Minuteman site...there is an instinct to blame the operator, blame the little guy. If the system worked properly, somebody dropping a tool couldn’t send a nuclear warhead into a field,” said Schlosser.

The explosion of the Titan II in 1980 was a “normal accident,” Schlosser argues in his book. The term, devised by sociologist Charles Perrow, refers to incidents that are inevitable in complex systems. 

In complex systems, multiple failures interact with each other, failures often relate to organizations rather than technology, and big accidents almost always have small, foreseeable beginnings. “No one dreamed that when X failed, Y would also be out of order,” Perrow writes in his 1984 book Normal Accidents: Living with High-Risk Technologies. “And the two failures would interact so as to both start a fire and silence the fire alarm.”

Perrow, in turn, never dreamed of the intricacy that would arise once these complex organizational systems came to be governed in cyberspace. The processes controlling financial systems, healthcare services, and even weapons systems are now equally complex cyber systems.

In this month’s cover story, Associate Editor Megan Gates explores security’s attempts to lessen these cyberthreats, especially those posed by trusted insiders. As security experts try to harden cyber systems, they have learned that the source of the threat is not nearly as important as the potential for destruction.
