A Brief History
On July 16, 1945, Manhattan Project scientists held their breath as the clock ticked down to the first man-made nuclear blast in history.
Over a period of almost 6 years from its feeble first steps (3 years as a project in earnest), employing 130,000 people and $2 billion of taxpayer money, the finest scientists in the world had developed methods of enriching uranium to a state where its nuclei could be split, and of creating plutonium, the 2 materials needed for the 2 different types of atomic weapons being considered.
The uranium device would be a tube in which 2 chunks of enriched uranium would be driven into each other at high speed by conventional explosives, forming a critical mass in the blink of an eye and triggering a nuclear blast. This type of weapon is often referred to as the “gun” type.
The plutonium device would be a hollow ball of plutonium surrounded by precision explosives meticulously timed to detonate simultaneously, imploding the hollow sphere, creating a critical mass in a fraction of a second, and producing the desired nuclear blast. This is known as an “implosion” device.
(Note: Obviously, the descriptions of how nuclear bombs work are greatly simplified and the preceding paragraph is paraphrased in common vernacular.)
President Roosevelt had been warned by Albert Einstein that Germany (and maybe Japan) might be working on developing nuclear weapons, and that if the US and its Allies did not want to be blown off the map, they had better develop such weapons first.
At 5:30 am on July 16, 1945, the entire point of the Manhattan Project was on the line as a plutonium implosion device suspended 100 feet above the desert was detonated. Although the nuclear physicists on the project were reasonably confident of their calculations, no one knew for sure how big the blast would be, or whether the atmosphere itself would join the chain reaction and end mankind. When the brilliant fireball and mighty blast went off, the equivalent of 20,000 tons of TNT, it left a crater 250 feet wide in the desert (with sand fused into glass), raised a mushroom cloud 7 ½ miles high, and produced a blast that could be felt 100 miles away. People as far away as El Paso could hear the explosion.
The scientists and budget planners were right; a practical bomb could be made, and it would be a city destroyer. Now the question was how, and whether, to use it. Despite some opposition, and some sentiment toward giving the world a demonstration over an unoccupied target, President Truman and his advisers decided Japan must have a city destroyed by an atom bomb to convince the Japanese to surrender. The debate over whether this was necessary still rages today, with critics claiming the Japanese were on the brink of surrender anyway, and proponents countering that the terrible price paid to conquer Okinawa showed that an invasion of Japan would cost tens of thousands of American lives, probably hundreds of thousands. Besides, the Soviets were poised to grab as much Japanese territory as possible, and US planners may well have intended to impress and intimidate the Soviets as much as the Japanese.
Less than a month after Trinity, 2 Japanese cities lay in smoking ruins; over 100,000 Japanese were dead, and more were dying.
Was that the right thing to do? Give us your opinion. (In other articles we have discussed the morality of using atomic bombs on Japanese cities, including “August 9, 1945: Second Atomic Bomb Dropped on Nagasaki, but Was It Necessary?”; “October 22, 1943: What is Firebombing? Tactic or Terrorism? Allied Firestorm Bombing in World War II”; and “July 23, 2014: 10 Things History Got Wrong!”, among other times we have examined the morality of bombing cities and the justification/rationalization for doing so.)
For more information, please see…
You can also watch a video version of this article on YouTube.