Just like Three Mile Island, the disaster at Fukushima marks a new era in the atomic age.

On Wednesday, March 28, 1979, an accident occurred at the Three Mile Island nuclear facility outside of Harrisburg, Pennsylvania. A valve that was supposed to close remained open, permitting large amounts of water—normally used to cool the plant’s core—to escape. For several hours, operators did not realize that the valve was open, and, as the reactor lost coolant, both temperatures and radiation levels rose. The plant began leaking radiation into the surrounding air and water, and, by Friday, a hydrogen bubble had developed in the top of the core’s container, making it difficult for workers to bring down the core’s temperature and stoking fears of an explosion. That evening, CBS news anchor Walter Cronkite told viewers that the nation faced “the considerable uncertainties and dangers of the worst nuclear power plant accident of the atomic age. And the horror tonight is that it could get much worse.”

The accident did not get worse. No explosion occurred. The crisis at the Fukushima Daiichi nuclear plant is already far more severe, and—as of this writing—could lead to a nuclear meltdown. But there are important resemblances between the two accidents—not in their severity, but in the deep-seated fears that they immediately spawned. And there is a reason for this resemblance. In 1945, the atomic age was born when the United States dropped two nuclear bombs on Japan; in both countries, the fear of nuclear war endured for many decades. At the same time, proponents of civilian nuclear energy attempted to dissociate nuclear reactors from weapons. But the accidents at Three Mile Island and at Fukushima undermined industry efforts to sever the link between power plants and bombs, re-awakened longstanding fears of the nuclear threat, and fueled anxieties about the dangers of radiation.

In the United States, the fear of a nuclear bomb did not end in 1945; it deepened with the nuclear arms race against the Soviet Union and the acceleration of nuclear testing in the 1950s. The arms race—and growing tensions between the two countries—awakened fears that the nation could suffer a nuclear attack. The civil defense programs of the 1950s made air-raid drills, duck-and-cover films, and radio-alert systems a familiar—if also unnerving—feature of everyday life.

Nuclear testing sowed fear that radiation could seep into the atmosphere and harm human and animal health. In 1953, stockmen in Utah blamed nuclear testing in neighboring Nevada for the deaths of over 1,000 ewes and lambs. The following year, a test series spread radioactive ash over 7,000 square miles of the Pacific Ocean, exposing over 250 people on the Marshall Islands to radiation. A Japanese fishing boat was also in the plume’s path, and 23 fishermen suffered radiation illness; one of them died. In 1955, radioactive rain fell in Chicago, and, four years later, strontium-90 (a long-lasting isotope that can lodge in the bones) was detected in wheat and milk. Throughout the late 1950s, scientists warned that radiation could cause leukemia, bone cancer, and genetic damage, citizen groups conducted local studies on radiation exposure, and The Saturday Evening Post described radioactive fallout as “the silent killer.”

Yet, as this was happening, the government was promoting the civilian application of nuclear power. In 1953, Eisenhower launched the international “Atoms for Peace” program, and the first nuclear power plant opened in the United States in 1957. The construction of nuclear power plants sped up throughout the 1960s, and, by 1973, there were 37 domestic nuclear power plants in operation, with more in the pipeline. Through an ambitious public relations campaign, the government sought to transform the “destructive atom” into a “peaceful” and “benevolent” one that could provide energy that would be, in the industry’s words, “too cheap to meter.”

At the heart of this transformation was the claim that radiation was a part of—rather than a threat to—the natural world. Drawing on a turn-of-the-century fascination with radium as an elixir and healer, proponents of nuclear power attempted to sever atomic energy from the wartime destruction that it had wrought. Simultaneously, the industry expressed extraordinary faith in plant design, declaring that the installation of redundant safety systems made an accident impossible. As one regulator recalled after the Three Mile Island accident, “no one really thought you could have a core meltdown. It was more a Titanic sort of mentality. This plant was so well designed that you couldn’t possibly have serious core damage.” One 1975 study sponsored by the Nuclear Regulatory Commission concluded that a citizen was more likely to be killed by a meteor than by a reactor accident. The belief that nuclear power could be completely disassociated from its violent origins, combined with the industry’s remarkable overconfidence about plant safety, created the illusion that dangerous levels of radiation could emanate from bombs, but not from nuclear power plants.

The Three Mile Island accident changed that. The dual threat of a meltdown and an explosion unleashed the fear that the industry had tried to suppress. Power plants could behave like bombs. They could explode; they could poison people and animals; they could contaminate the land. The accident transformed the Susquehanna River Valley into a scene out of a science fiction film: Officials handed out Geiger counters and gas masks, portable body detectors were brought in for the full-body counting of local residents, and the state stockpiled potassium iodide, which can block the thyroid’s absorption of iodine-131 (a radioactive isotope). Americans watched the crisis unfold on the evening news, and polling conducted at the time suggested that, despite the industry’s efforts to convince them otherwise, the public believed that a plant explosion would mimic a bomb attack.

Local radio stations were deluged with phone calls from residents: If people evacuated, would they ever be able to come home? What were the symptoms of radiation sickness? How long would food and water supplies be contaminated? Was it true that a meltdown would render the area uninhabitable for 100 years? Evoking the aftermath of the bombings of Hiroshima and Nagasaki, local people expressed the fear that the accident would leave the landscape barren and stripped of all life. Many feared that they would leave home and never be able to come back. And, just as we are now seeing in Japan, that fear was often combined with a mounting distrust toward both the utility company that operated the plant and the federal officials who regulated it. As one local woman put it, Friday, March 30 was “the last day in my life I’ll ever trust the utility or our government to do the right thing for me.”

The same thing is happening now in the midst of the crisis at Fukushima Daiichi. Japan, the site of the only nuclear bomb attack in world history, is today among the top consumers of nuclear energy, and its experience with atomic energy parallels that of the United States. The 1945 bombings of Hiroshima and Nagasaki haunted postwar Japan but did not deter it from pursuing nuclear power as an alternative to oil and coal. Resource-poor, Japan established its domestic nuclear power program in 1954, and its first power plant went on line in 1966. Like their American counterparts, industry proponents sought to distance plants from bombs. The country’s 1955 Atomic Energy Basic Law stipulated that nuclear power could only be used for peaceful purposes. While the accidents at Three Mile Island and Chernobyl halted new plant construction in other countries, they had little effect on the industry in Japan. Even reports about industry accidents, cover-ups, and scandals since the mid-1990s have not deterred the building of new plants. Before the earthquake and tsunami, 54 reactors provided 30 percent of the country’s electricity.

Like the accident at Three Mile Island, the accident at Fukushima Daiichi—whatever its final outcome—will mark the end of an era in the history of civilian nuclear energy. Images of Japanese children and babies being tested for radiation, men and women evacuating the region around the plant, and the possibility of long-term contamination of air and water have conjured memories of the 1945 bombings and reminded the public of what the industry has tried so hard to make them forget: that nuclear power, like nuclear weapons, can pose a grave threat. Part of the anguish of watching this particular accident from afar is that it brings the story full circle: The most serious nuclear accident of the new century is taking place on the very island where the atomic age was born.

Natasha Zaretsky is an associate professor of history at Southern Illinois University at Carbondale. She is the author of No Direction Home: The American Family and the Fear of National Decline, 1968-1980 and is currently writing a book about the 1979 accident at Three Mile Island.