Dangerous systems usually required standardized procedures and some form of centralized control to prevent mistakes. That sort of management was likely to work well during routine operations. But during an accident, Perrow argued, “those closest to the system, the operators, have to be able to take independent and sometimes quite creative action.” Few bureaucracies were flexible enough to allow both centralized and decentralized decision making, especially in a crisis that could threaten hundreds or thousands of lives. And the large bureaucracies necessary to run high-risk systems usually resented criticism, feeling threatened by any challenge to their authority. “Time and time again, warnings are ignored, unnecessary risks taken, sloppy work done, deception and downright lying practiced,” Perrow found. The instinct to blame the people at the bottom not only protected those at the top, it also obscured an underlying truth. The fallibility of human beings guarantees that no technological system will ever be infallible.
• • •
AFTER SERVING AS A CONSULTANT to the Joint Chiefs of Staff on strategic nuclear policy, Scott D. Sagan applied “normal accident” theory to the workings of the American command-and-control system during the Cuban Missile Crisis. According to Sagan, now a professor of political science at Stanford University, the crisis was the most severe test of that system during the Cold War, “the highest state of readiness for nuclear war that U.S. military forces have ever attained and the longest period of time (thirty days) that they have maintained an alert.” Most historians attributed the peaceful resolution of the crisis to decisions made by John F. Kennedy and Nikita Khrushchev — to the rational behavior of leaders controlling their military forces. But that sense of control may have been illusory, Sagan argued in The Limits of Safety, and the Cuban Missile Crisis could have ended with a nuclear war, despite the wishes of Khrushchev and Kennedy.
With hundreds of bombers, missiles, and naval vessels prepared to strike, the risk of accidents and misunderstandings was ever present. At the height of the confrontation, while Kennedy and his advisers were preoccupied with the Soviet missiles in Cuba, an Atlas long-range missile was test-launched at Vandenberg Air Force Base, without the president’s knowledge or approval. Other missiles at Vandenberg had already been placed on alert with nuclear warheads — and the Soviet Union could have viewed the Atlas launch as the beginning of an attack. The Jupiter missiles in Turkey were an issue of great concern to Secretary of Defense Robert McNamara throughout the crisis. McNamara ordered American troops to sabotage the missiles if Turkey seemed ready to launch them. But he was apparently unaware that nuclear weapons had been loaded onto fighter planes in Turkey. The control of those weapons was “so loose, it jars your imagination,” Lieutenant Colonel Robert B. Melgard, the commander of the NATO squadron, told Sagan. “In retrospect,” Melgard said, “there were some guys you wouldn’t trust with a .22 rifle, much less a thermonuclear bomb.”
During one of the most dangerous incidents, Major Charles Maultsby, the pilot of an American U-2 spy plane, got lost and inadvertently strayed into Soviet airspace. His mistake occurred on October 27, 1962 — the same day as the Atlas missile launch and the shooting down of a U-2 over Cuba. Maultsby was supposed to collect air samples above the North Pole, seeking radioactive evidence of a Soviet nuclear test. But the flight path was new, the aurora borealis interfered with his attempt at celestial navigation, and Maultsby soon found himself flying over Siberia, pursued by Soviet fighter planes. The U-2 ran out of fuel, and American fighters took off to escort Maultsby back to Alaska. Under the DEFCON 3 rules of engagement, the American fighter pilots had the authority to fire their atomic antiaircraft missiles and shoot down the Soviet planes. A dogfight between the two air forces was somehow avoided, the U-2 landed safely — and McNamara immediately halted the air sampling program. Nobody at the Pentagon had considered the possibility that these routine U-2 flights could lead to the use of nuclear weapons.
America’s command-and-control system operated safely during the crisis, Sagan found, and yet “numerous dangerous incidents… occurred despite all the efforts of senior authorities to prevent them.” He’d long believed that the risk of nuclear weapon accidents was remote, that nuclear weapons had been “a stabilizing force” in international relations, reducing the risk of war between the United States and the Soviet Union. “Nuclear weapons may well have made deliberate war less likely,” Sagan now thought, “but the complex and tightly coupled nuclear arsenal we have constructed has simultaneously made accidental war more likely.” Researching The Limits of Safety left him feeling pessimistic about our ability to control high-risk technologies. The fact that a catastrophic accident with a nuclear weapon has never occurred, Sagan wrote, can be explained less by “good design than good fortune.”
• • •
THE TITAN II EXPLOSION at Damascus was a normal accident, set in motion by a trivial event (the dropped socket) and caused by a tightly coupled, interactive system (the fuel leak that raised the temperature in the silo, making an oxidizer leak more likely). That system was also overly complex (the officers and technicians in the control center couldn’t determine what was happening inside the silo). Warnings had been ignored, unnecessary risks taken, sloppy work done. And crucial decisions were made by a commanding officer, more than five hundred miles from the scene, who had little firsthand knowledge of the system. The missile might have exploded no matter what was done after its stage 1 fuel tank began to leak. But to blame the socket, or the person who dropped it, for that explosion is to misunderstand how the Titan II missile system really worked. Oxidizer leaks and other close calls plagued the Titan II until the last one was removed from a silo, northwest of Judsonia, Arkansas, in June 1987. None of those leaks and accidents led to a nuclear disaster. But if one had, the disaster wouldn’t have been inexplicable or hard to comprehend. It would have made perfect sense.
The nuclear weapon systems that Bob Peurifoy, Bill Stevens, and Stan Spray struggled to make safer were also tightly coupled, interactive, and complex. They were prone to “common-mode failures” — one problem could swiftly lead to many others. The steady application of high temperature to the surface of a Mark 28 bomb could disable its safety mechanisms, arm it, and then set it off. “Fixes, including safety devices, sometimes create new accidents,” Charles Perrow warned, “and quite often merely allow those in charge to run the system faster, or in worse weather, or with bigger explosives.” Perrow was not referring to the use of sealed-pit weapons during SAC’s airborne alerts. But he might as well have been. Promoted as being much safer than the weapons they replaced, the early sealed-pit bombs posed a grave risk of accidental detonation and plutonium scattering. Normal accident theory isn’t a condemnation of modern technological systems. But it calls for more humility in how we design, build, and operate them.
The title of an influential essay on the role of technology in society asked the question: “Do Artifacts Have Politics?” According to its author, Langdon Winner, the answer is yes — the things that we produce are not only shaped by social forces, they also help to mold the political life of a society. Some technologies are flexible and can thrive equally well in democratic or totalitarian countries. But Winner pointed to one invention that could never be managed with a completely open, democratic spirit: the atomic bomb. “As long as it exists at all, its lethal properties demand that it be controlled by a centralized, rigidly hierarchical chain of command closed to all influences that might make its workings unpredictable,” Winner wrote. “The internal social system of the bomb must be authoritarian; there is no other way.”