75 years after the first nuclear test, explosive testing is now outlawed, but sophisticated virtual testing allows American physicists to understand these weapons better than ever.
BY: DANIEL OBERHAUS | wired.com
JUST BEFORE SUNRISE on July 16, 1945—75 years ago today—a patch of New Mexican desert was incinerated during the first trial of the most destructive weapon ever created. The plutonium bomb used for the Trinity test left a 5-foot-deep crater in the ground and turned the surrounding desert floor into radioactive green glass. The blast bathed the peaks of the nearby Oscura Mountains in a blinding white light, and dozens of scientific observers watching from 20 miles away reported feeling an immense heat wash over them. As the light from the explosion faded, one of the architects of the bomb, Kenneth Bainbridge, gave a pithy appraisal of the event to J. Robert Oppenheimer, the project’s lead scientist: “Now we are all sons of bitches.”
And he was right. Less than a month later, the United States dropped the same type of bomb on Nagasaki, Japan, just three days after detonating a smaller nuclear weapon over Hiroshima. The two bombings effectively ended World War II, but at the price of well over 100,000 civilian lives and the enduring suffering of those who survived.
The bombing of Nagasaki was the second and final time a country has deployed a nuclear weapon in combat. But it wasn’t the last nuclear explosion. Despite a lifetime of activism by Bainbridge and many of his colleagues, nuclear tests didn’t end with the war. By the time the US signed the United Nations Comprehensive Nuclear-Test-Ban Treaty in 1996 and agreed to stop blowing up nukes, American physicists and engineers had conducted more than 1,000 tests. They blew up nuclear weapons in the ocean. They blew them up on land. They blew them up in space. They dropped them from planes. They launched them on rockets. They buried them underground. A small army of US weapons scientists blew up a nuclear weapon every chance they got, and at the height of the nation’s testing program they were averaging one detonation per week.
The test-ban treaty was meant to end all that. Atmospheric nuclear tests have been internationally banned since the early 1960s due to health concerns about radioactive fallout and other hazards. These weren’t baseless fears. In the 1950s, US physicists drastically miscalculated the explosive yield of a thermonuclear bomb during a test in the Pacific Ocean, and the ashy radioactive fallout was detected as far away as India. Exposure to the fallout caused radiation sickness in the inhabitants of the islands around the test site, and a group of Japanese fishers suffered severe radiation burns when the fallout landed on their boat. Accidents and miscalculations of this sort were distressingly common at the time. Only a few years later, a bomber accidentally dropped a nuclear weapon on Kirtland Air Force Base on the outskirts of Albuquerque, New Mexico. (Fortunately, the plutonium pits needed to kick off a nuclear chain reaction had not yet been loaded into the bomb.)
The US signed the Partial Nuclear Test Ban Treaty, an agreement with the Soviet Union and the United Kingdom to halt all but underground tests, in 1963. But nuclear testing only accelerated once it was pushed underground. The US nuclear arsenal peaked in 1967 with 31,255 warheads, and the country detonated as many nukes in the seven years after the partial test ban as it had in the previous 18 years. “With nuclear testing you were under constant pressure to design a new weapon, engineer it, put it down a hole, blow it up, and then move on to the next one,” says Hugh Gusterson, an anthropologist at the University of British Columbia and an expert on the human factors in nuclear weapons research. “The scientists didn’t have a chance to pause and catch a breath.”