By Hugh Gusterson | 16 March 2011
from the Bulletin of the Atomic Scientists

As an anthropologist, I am always interested in what humans learn from their mistakes. Can humans change their behavior, thereby improving their chances of survival, not just through natural selection, but also through cultural learning? Or are we hardwired to repeat our mistakes over and over, like humanoid lemmings?

More to the point, what lessons will we learn from the nuclear accident at Fukushima, an accident thought to be impossible just two weeks ago?

Some people, many of them presumably already ill-disposed toward nuclear energy, have concluded that the lesson of Fukushima is that nuclear energy is inherently dangerous. Thus, Eugene Robinson wrote in the Washington Post: “We can engineer nuclear power plants so that the chance of a Chernobyl-style disaster is almost nil. But we can’t eliminate it completely — nor can we envision every other kind of potential disaster. And where fission reactors are concerned, the worst-case scenario is so dreadful as to be unthinkable.” His colleague Anne Applebaum wrote on the same op-ed page: “If the competent and technologically brilliant Japanese can’t build a completely safe reactor, who can? … I … hope that a near-miss prompts people around the world to think twice about the true ‘price’ of nuclear energy, and that it stops the nuclear renaissance dead in its tracks.” (The nuclear renaissance comprises plans around the world to build as many as 350 new nuclear reactors, partly as a way of inhibiting climate change.)

But others have concluded that the lesson of Fukushima is not that nuclear energy technology is inherently unsafe but that this was an event unique to the Japanese context or that the industry just needs a little more oversight. Thus, Republican Sen. Mitch McConnell of Kentucky, in a comment that I confess to finding bizarre, said: “My thought about it is, we ought not to make American and domestic policy based upon an event that happened in Japan.” (Why not? The United States has two dozen reactors of the same GE design as the dangerously damaged ones in Fukushima, it has built reactors on earthquake faults, and Japanese earthquakes behave no differently than American earthquakes.) An Indian newspaper quoted Srikumar Banerjee, head of India’s Atomic Energy Commission, downplaying the Fukushima disaster as “purely a chemical reaction and not a nuclear emergency,” and saying that Indian nuclear power plants are on higher ground where tsunamis could not hurt them.

The middle ground was occupied by Democratic Rep. Ed Markey of Massachusetts, who was paraphrased in the New York Times as saying that “regulators should consider a moratorium on locating nuclear plants in seismically active areas, require stronger containment vessels in earthquake-prone regions and thoroughly review the 31 plants in the United States that use similar technology to the crippled Japanese reactors.”

We have now had four grave nuclear reactor accidents: Windscale in Britain in 1957 (the one that is never mentioned), Three Mile Island in the United States in 1979, Chernobyl in the Soviet Union in 1986, and now Fukushima. Each accident was unique, and each was supposed to be impossible. Nuclear engineers have learned from each accident how to improve reactor design so as to diminish the likelihood of that particular accident repeating itself but, as Donald Rumsfeld famously reminded us, there are always “unknown unknowns,” and so each accident has been succeeded by another, unfolding in a way that was not foreseen. The designers of the reactors at Fukushima did not anticipate that the tsunami generated by an earthquake would disable the backup systems that were supposed to stabilize the reactor after the earthquake.

And presumably there are other complicated technological scenarios that we have not foreseen, earthquake faults that are undetected or underestimated, and terrorists hatching plans for mayhem as yet unknown. Not to mention regulators who place too much trust in those they regulate.

Thus it is hard to resist the conclusion reached by sociologist Charles Perrow in his book Normal Accidents: Living with High-Risk Technologies: Nuclear reactors are such inherently complex, tightly coupled systems that, in rare, emergency situations, cascading interactions will unfold very rapidly in such a way that human operators will be unable to predict and master them. To this anthropologist, then, the lesson of Fukushima is not that we now know what we need to know to design the perfectly safe reactor, but that the perfectly safe reactor is always just around the corner. It is technoscientific hubris to think otherwise.

This leaves us with a choice between walking back from a technology that we decide is too dangerous or normalizing the risks of nuclear energy and accepting that an occasional Fukushima is the price we have to pay for a world with less carbon dioxide. It is wishful thinking to believe there is a third choice of nuclear energy without nuclear accidents.

It is unlikely that all countries will make the same choice here. We are probably moving toward a post-Fukushima world in which some countries will abjure nuclear energy while others expand it. Countries with other energy options, strong democratic structures, and powerful environmental movements will probably de-emphasize, and maybe eventually renounce, nuclear energy. Switzerland has already suspended plans to build new reactors, and Germany’s Angela Merkel, responding to large antinuclear protests, announced plans to close seven reactors pending further evaluation of their safety and to reconsider plans to extend the lives of Germany’s oldest reactors.

In the meantime, countries with weak environmental movements and weak regulatory norms seem to be proceeding as if nothing has happened. As the Fukushima nuclear disaster unfolded, Turkey announced plans to go ahead with two reactors, and we can surely expect China, Russia, and India to do the same.

And what of the United States? Will it be like Germany and Switzerland, or like Turkey and China? A good way to think through this question is to look at how the United States responded to its last meltdown — the meltdown of its banking system in 2008. To prevent a future recurrence of this disaster, the US government should have broken up banks that were “too big to fail,” restored the Glass-Steagall Act’s prohibitions on the commingling of investment and depository banks, and moved aggressively to regulate credit default swaps and financial derivatives. It did none of these things because the banks did not want it to, and the banks now run the show.

The US government, including its regulatory agencies, has been largely captured by the corporate sector, which, by means of campaign donations, is able to secure compliant politicians and regulators. (In this context it is not entirely irrelevant that employees of the nuclear operator Exelon Corporation have been among Barack Obama’s biggest campaign donors, and that Obama appointed Exelon’s CEO to his Blue Ribbon Commission on America’s Nuclear Energy Future.)

We have examples from the not-so-distant American past of the government learning important lessons from big mistakes. After the Great Crash, the government reformed the banking system. After the near disaster of the Cuban Missile Crisis, US and Soviet leaders began signing arms control agreements. After the discovery of the Love Canal environmental contamination, Congress passed Superfund legislation.

But we now have a government captured by special interests, paralyzed by partisanship, and confused by astroturfing political groups and phony scientific experts for sale to the highest bidder. Our democracy and our regulatory agencies are husks of what they once were. It is unclear that such a system is capable of learning any lessons or indeed of doing anything much beyond generating speeches and passing the responsibility for failure back and forth like a Ping-Pong ball between our two yapping political parties. While we are distracted by the theater of Congress and the White House, our fate lies in other hands.