One of the main themes of this book is the difficulty of controlling complex, high-risk technologies. I’ve never had much patience for theories of historical inevitability — and in recent years a number of scholars have applied a healthy skepticism to the traditional view that scientific inventions are somehow the logical, necessary result of previous developments. They have challenged a simplistic technological determinism, suggesting that every man-made artifact is created within a specific social context. Donald MacKenzie, a professor of sociology at Edinburgh University, greatly influenced my thinking about how and why new inventions are made. MacKenzie has edited, with Judy Wajcman, a fine collection that explores some of these ideas: The Social Shaping of Technology, 2nd ed. (Buckingham, UK: Open University Press, 1999). MacKenzie has also written a brilliant, thought-provoking book on how social and institutional forces shaped the pursuit of accuracy in nuclear missile guidance — Inventing Accuracy: A Historical Sociology of Nuclear Missile Guidance (Cambridge, MA: MIT Press, 1993). His views on the process of scientific and technological change resonate strongly with one of my own long-standing beliefs: if things aren’t inevitable, then things don’t have to be the way they are. Without being utopian or overly optimistic, MacKenzie and Graham Spinardi applied that sort of thinking to weapons of mass destruction, after interviewing dozens of scientists at Los Alamos and Lawrence Livermore, in their essay “Tacit Knowledge and the Uninvention of Nuclear Weapons.” It can be found in MacKenzie’s book Knowing Machines: Essays on Technical Change (Cambridge, MA: MIT Press, 1998).
Many of the declassified documents cited in this book were found online. Two of the best sites for historical material are the Pentagon’s Defense Technical Information Center, “Provider of DoD Technical Information to Support the WarFighter,” and the U.S. Department of Energy’s OpenNet. L. Douglas Keeney — the author of 15 Minutes: General Curtis LeMay and the Countdown to Nuclear Annihilation (New York: St. Martin’s Press, 2011) — has posted a few Strategic Air Command official histories online that I found quite useful. A Web site called the Black Vault also features a wide variety of declassified documents. And the Federation of American Scientists is an excellent online source for information about nuclear weapons.
I am especially grateful for the work of the National Security Archive, based at George Washington University, which for almost three decades has been obtaining documents through the Freedom of Information Act and suing federal agencies when its requests are denied — not only to reveal what the government has done but also to hold it accountable for that behavior. The archive is a national treasure. Its digital collection proved invaluable to my research. William Burr, the director of its nuclear project, has done an extraordinary job of uncovering and explaining some of the more significant documents. With the head of the archive, Thomas S. Blanton, and Stephen I. Schwartz, Burr wrote a fine essay that explains why freedom of information is so essential: “The Costs and Consequences of Nuclear Secrecy,” in Atomic Audit, pages 433–483. Throughout my bibliography and endnotes I have used the acronym NSA to identify documents originally obtained by the National Security Archive.
Prior to the publication of this book, I gave a rough draft of it to a nuclear weapons expert who is not employed by the U.S. government and yet possesses a high-level clearance. I wanted to feel confident that nothing disclosed in these pages would pose any threat to national security. My unpaid but much-appreciated reader found nothing that even remotely does so — and I agree with him. A far greater threat has been posed, for the past sixty years, by official secrecy and misinformation about America’s nuclear arsenal. The suppression of the truth has allowed a small and elite group of policy makers to wield tremendous, largely unchecked power. There are few issues more important than what nuclear weapons can do, where they are aimed, why they might be used, and who has the ability to order their use. I hope my book contributes, in some small way, to restoring a semblance of democracy to the command and control of the deadliest, most dangerous machines that mankind has ever invented.
Senior Airman David F. Powell and Airman Jeffrey L. Plumb: I spoke to Plumb and Powell about the accident. Plumb’s statement before the Missile Accident Investigation Board can be found at Tab U-71 and Powell’s at Tab U-73, “Report of Missile Accident Investigation: Major Missile Accident, 18–19 September 1980, Titan II Complex 374-7, Assigned to 308th Strategic Missile Wing, Little Rock Air Force Base, Arkansas,” conducted at Little Rock Air Force Base, Arkansas, and Barksdale Air Force Base, Louisiana, December 14–19, 1980.
10 feet in diameter and 103 feet tall: According to the Titan II historian David K. Stumpf, the height of the missile was often erroneously described as “anywhere from 108 feet to 114 feet.” The actual height was 103.4 feet. See “Table 3.2, Titan II ICBM Final Design Specifications,” in David K. Stumpf, Titan II: A History of a Cold War Missile Program (Fayetteville: University of Arkansas Press, 2000), p. 49.
a yield of 9 megatons: The yields of American nuclear weapons remain classified, except for those of the bombs that destroyed Hiroshima and Nagasaki. But for decades government officials have discussed those yields, off the record, with journalists. Throughout this book, I cite the weapon yields published by a pair of reliable defense analysts. For some reason, the megatonnage of the warheads carried by the Titan I and Titan II missiles was disclosed in a document obtained by the National Security Archive through the Freedom of Information Act. For the yields of the W-38 warhead atop the Titan I and the W-53 atop the Titan II, see “Missile Procurement, Air Force,” U.S. Congress, House Committee on Appropriations, Subcommittee on Defense, May 16, 1961 (SECRET/declassified), NSA, p. 523. For the yields of other American weapons, see Norman Polmar and Robert S. Norris, The U.S. Nuclear Arsenal: A History of Weapons and Delivery Systems Since 1945 (Annapolis, MD: Naval Institute Press, 2009), pp. 1–70.
about three times the explosive force of all the bombs: Although estimates vary, the American physicist Richard L. Garwin and the Russian physicist Andrei Sakharov both noted that the explosive force of all the bombs used during the Second World War was about 3 megatons. The United States was responsible for most of it. According to Senator Stuart Symington, who’d served as the first secretary of the Air Force after the war, the bombs dropped by the United States had a cumulative force of 2.1 megatons. Two-thirds of that amount was employed against Germany, the rest against Japan. The enormous power of the Titan II’s warhead seems hard to comprehend. Nine megatons is the equivalent of eighteen billion pounds of TNT — about four pounds of high explosives for every person alive in September 1980. Symington’s estimates can be found in “Military Applications of Nuclear Technology,” Hearing Before the Subcommittee on Atomic Energy, 93rd Cong., April 16, 1973, pt. 1, pp. 3–4. For the other estimates, see Richard L. Garwin, “New Weapons/Old Doctrines: Strategic Warfare in the 1980s,” Proceedings of the American Philosophical Society, vol. 124, no. 4 (1980), p. 262; and Andrei Sakharov, “The Danger of Thermonuclear War,” Foreign Affairs, Summer 1983, p. 1002.
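For readers who want to check that comparison, the arithmetic is simple. A quick sketch, assuming a world population of roughly 4.4 billion in September 1980 (my figure, not one given in the note):

\[
9\ \text{Mt} = 9\times10^{6}\ \text{tons of TNT} = 9\times10^{6} \times 2{,}000\ \text{lb} = 1.8\times10^{10}\ \text{lb}
\]
\[
\frac{1.8\times10^{10}\ \text{lb}}{4.4\times10^{9}\ \text{people}} \approx 4.1\ \text{lb per person}
\]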