For a brief fraction of a second on an early March morning in 1954, the United States summoned a second sun into existence above Bikini Atoll.
As the four-mile-wide fireball bathed the Pacific seascape in its angry, white-red light, onlookers recognized something nearly divine—and unquestionably ominous. “It was a religious experience, a personal view of the apocalypse or transfiguration,” said one observer. Another remembered feeling “like you stepped into a blast furnace,” even though he was over thirty miles away.
This was the Castle Bravo thermonuclear test, one of the sixty-seven nuclear detonations the United States carried out in the Marshall Islands during the Cold War. At 15 million tons of TNT—one thousand times more powerful than the bomb that destroyed Hiroshima—it was the largest explosion ever set off by Americans.
It was also the dirtiest, as a new study published this month shows. Researchers from Columbia University, analyzing soil samples from several Marshall Islands atolls, found widespread radioactive contamination. Bikini Island itself was declared unsafe for human habitation, while the three other atolls had significant radionuclide concentrations—mainly americium, cesium, and plutonium. In some cases, the level of radioactivity—more than sixty years after the last mushroom cloud loomed over Bikini’s azure lagoon—exceeded that found at Chernobyl or Fukushima.
The process that led to this long-standing radioactivity is relatively simple, even if it wasn’t fully understood as the Cold War heated up. As the Castle Bravo fireball ascended into the sky, it carried with it tons of vaporized coral, rock, and dirt. This debris intermingled with radioactive isotopes before settling back down to the ground as deadly fallout. In one case, ignorant of its lethal effects, children on a neighboring atoll played in the falling powder, believing it was snow.
But American nuclear testing didn’t just occur in the middle of the Pacific. Throughout the Cold War, the United States detonated hundreds of atomic bombs in Nevada at a test site just northwest of Las Vegas. Many of these tests were above-ground, exposing the continental United States to the same radioactive fallout that fell over those remote atolls.
As with the Marshall Islands, the radiological effects of this testing were widespread—and immense. A 2017 study from the University of Arizona suggested that fallout from the Nevada nuclear explosions exposed millions of Americans to dangerous levels of radiation.
The exposure mechanism wasn’t always direct, either. Once caught in high-altitude winds, fallout from these tests would travel for hundreds or even thousands of miles before settling back down over the vast fields of the American heartland. Unsuspecting cattle would graze on grass freshly laced with this fallout, including iodine-131, a highly potent radionuclide that spews beta and gamma radiation.
The cows concentrated this iodine in their milk, which would then be quickly consumed by the local population through the dairy industry. Because radioactive and ordinary iodine are chemically identical, the human body cannot tell them apart, and so it deposits both in the thyroid gland. Once securely lodged there, iodine-131 bombards the surrounding tissue at the cellular level, damaging DNA strands and eventually causing cancer.
This was just one of many exposure pathways. Accounting for all of them, the 2017 study suggested that fallout from the Nevada nuclear testing could have led to between 340,000 and 460,000 premature deaths, most of them Americans, mainly from cancer.
If that death toll seems unreal, consider the scale of the radiation involved. Using a figure of 81 million curies of radioactive material released at Chernobyl as a baseline, one estimate held that the Nevada-based nuclear testing emitted 12 billion curies into the atmosphere between 1951 and 1963. That’s the equivalent of nearly 150 Chernobyl disasters—or one a month for more than a decade. If you added in the Marshall Islands nuclear tests over this same period, that figure would be even higher.
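For readers who want to check the comparison, the arithmetic behind it takes only a few lines. This is a back-of-envelope sketch using only the estimates quoted above (the curie totals are the figures cited in the text, not independent measurements):

```python
# Back-of-envelope check of the Chernobyl comparison, using only
# the estimates quoted in the text above.
chernobyl_curies = 81e6      # cited baseline for Chernobyl's release
nevada_curies = 12e9         # cited estimate for Nevada testing, 1951-1963

equivalents = nevada_curies / chernobyl_curies
print(round(equivalents))    # -> 148, i.e. "nearly 150 Chernobyls"

months = (1963 - 1951) * 12  # roughly 144 months in the testing window
print(round(equivalents / months, 2))  # -> 1.03, about one per month
```

The two printed values match the article's framing: roughly 148 Chernobyl-scale releases spread over about 144 months, or one a month for more than a decade.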
As we mark the seventy-fourth anniversary of the Hiroshima and Nagasaki bombings in a handful of days, we will rightly remember the horrors of nuclear war. But we should also recognize the deadly tests that followed. Because any nuclear explosion—even a “peacetime test” in a Pacific paradise or the dry desert of Nevada—can put human lives at risk.
Just ask the children who played in the snow.
Zack Brown is the policy associate and special assistant to the president at Ploughshares Fund, a global security foundation that supports initiatives to reduce and eventually eliminate the dangers posed by nuclear weapons. Alex Spire is a research assistant at Ploughshares Fund.