Column: The nuclear threat the U.S. unleashed on the world 75 years ago is still every bit as terrifying

The bombs dropped by the United States on the cities of Hiroshima and Nagasaki 75 years ago, in the final days of World War II, incinerated some 200,000 people, most of them civilians.

And they did much more than that. They also transformed the nature of war, raising the specter of Armageddon and ushering in the bizarre and terrifying nuclear age that defined the Cold War for decades.

After two back-to-back global wars, the world was used to death and destruction, but the devastation unleashed by the atom bomb in those early days of August 1945 was categorically different.

“A rain of ruin from the air, the like of which has never been seen on this earth” was how President Truman described the new U.S. war capability just hours after the Hiroshima bombing on Aug. 6, 1945. The co-pilot of the Enola Gay, which dropped the bomb that day from 31,500 feet, wrote in his personal log: “My God, what have we done?”

It took the Soviet Union only four years to develop an atomic bomb of its own, launching an unprecedented, ever-escalating arms race that was at the heart of the new Cold War. Both countries quickly graduated to hydrogen bombs, ultimately developing nuclear arsenals of tens of thousands of weapons, many of them with 1,000 times the power — or more — of the bomb dropped on Hiroshima.

But here was the strange paradox of nuclear weapons: Even as we built them, the overriding goal was to ensure they were never used.

After all, these were weapons of unprecedented power that could destroy cities and even countries, killing tens of thousands in a moment. We’d seen it happen in Hiroshima and Nagasaki, and no one wanted to see it again.

“Thus far the chief purpose of our military establishment has been to win wars,” wrote Bernard Brodie, an early nuclear strategist, in 1946. “From now on its chief purpose must be to avert them.”

But rather than stop building the weapons or destroy the ones they had, the superpowers decided, bizarrely, that the safest approach was to build ever-bigger nuclear arsenals.

That was counterintuitive, to say the least. But the official policy through the Cold War was one of “deterrence,” based on the idea that nuclear conflicts could best be averted if both sides were convinced that the consequences of attacking would be too horrendous to bear. You had to make it clear to your adversary that you could withstand his nuclear “first strike” — and that you could respond with a retaliatory second strike so devastating that it would be irrational for him to attack you in the first place.

Deterrence made a certain amount of sense on paper — and yet it was utterly absurd and enormously risky. Not only did it require an ever-growing stockpile of costly and hyper-destructive weapons that were not to be used, but it relied heavily on the rationality and restraint of world leaders. It presumed no misjudgments or misunderstandings.

No wonder the theory was known as Mutually Assured Destruction — or MAD.

Despite the deterrence talk, the U.S. never adopted a no-first-use policy, and there were plenty of generals and policymakers who believed a nuclear war could be fought and won. Over the years, they made secret plans for preemptive attacks. Tactical nuclear weapons were developed for use in limited wars.

As the Cold War dragged on, the nuclear “balance of terror” became part of the culture, as Americans (and Russians) adjusted to the knowledge that annihilation was an ever-present possibility. The Pentagon urged homeowners to build fallout shelters; schoolchildren were taught to “duck and cover” beneath their desks. In the late 1950s, more than 60% of American children said they’d had nightmares about nuclear war.

Movies like “Fail Safe” and “Dr. Strangelove” depicted the awful things that could go wrong; Bob Dylan wrote “Talkin’ World War III Blues.” The nuclear launch codes were carried in a briefcase by a military aide at the side of the U.S. president.

If Americans were frightened, they were only being reasonable. During the 1962 Cuban Missile Crisis, even President Kennedy believed the chance of a nuclear war with Russia was “between one-in-three and even.”

And yet today, 75 years after Hiroshima and Nagasaki, we are still here. Nuclear weapons have never again been used.

Beginning in the 1970s, treaties were negotiated by the superpowers to limit weapons growth. In the 1980s, Ronald Reagan and Mikhail Gorbachev negotiated the first pact requiring that existing nuclear weapons be destroyed. In the 1990s, the Soviet Union collapsed.

Today, the total number of warheads in the U.S. nuclear stockpile is approximately 3,800 — down from a peak of 31,255 in 1967, according to the Federation of American Scientists. Russia now has about 4,310 warheads in its stockpile.

But a multipolar world brings its own dangers. Nine countries now have nuclear weapons, including India and Pakistan, which are perpetually in conflict. Whether we can trust Kim Jong Un to behave rationally in a nuclear crisis is unclear. China, increasingly at odds with the U.S., is also, of course, a nuclear power.

The possibility remains of a nuclear accident, or of a terrorist group obtaining a weapon. The landscape of war itself is evolving, with increasing focus on cyberwar and artificial intelligence, each of which will have consequences for nuclear strategy.

Today there are other things to worry about, including the existential threat of climate change. But don’t kid yourself: The dangerous legacy of Hiroshima and Nagasaki lingers on 75 years after the dawn of the nuclear age.

@Nick_Goldberg