Occam's Nuclear Plant

Three Mile Island (TMI) was the worst accident in US commercial nuclear power history. At about 4 a.m. on March 28, 1979, the reactor began to lose large amounts of coolant through a valve that was stuck open. A few hours later, radioactive gases and iodine were released into the area around Harrisburg, Pennsylvania. No one was killed, but damages totaled $2.4 billion USD.

Normal Accidents

This disaster was the main case study in Charles Perrow’s book Normal Accidents. He sets out a theory that in certain systems, accidents are unavoidable – normal. These systems have three defining characteristics:

  • Complexity

  • Tight coupling

  • Catastrophic potential

In other words, accidents are normal when a system has many parts that interact automatically and in unexpected ways, and when the potential consequences are severe enough that everyone notices when something goes wrong. The reactor at Three Mile Island, for example, had hundreds of valves, switches, personnel, computers, fans, alarms, and instruction manuals.

Aaron Wildavsky points out a few ways in which these many tightly coupled parts interacted negatively and unexpectedly, making the Three Mile Island accident worse. For one, employees followed the wrong instruction manuals while trying to stop the meltdown. And expert personnel were delayed in entering the plant by strict visitor screening protocols.

In isolation, instruction manuals and visitor screening protocols are both reasonable safety precautions. But in a disaster scenario, these same precautions made things less safe. The moral of both Normal Accidents and Searching for Safety is twofold. First, more safety measures don’t necessarily mean more safety, because each measure adds complexity. Second, system flexibility – i.e. ‘loose coupling’ – is an important defense against disaster.

Le Paradoxe

So, why do systems continue to get more complex and tightly coupled? It might be the fault of the everyday person, rather than the nuclear operators or regulators. René Amalberti, a French safety researcher, put it well:

"In fact the level of safety has the surprising property that it is never adequate and it actually generates societal demand which increases in parallel with the progress that is made."

For some reason, we continually demand that regulators prove we’re safe. But safety is not a thing in itself – it is an absence of things – so it’s difficult to prove. As Amalberti notes, there is an “inability to get credit for improving safety…”.

So, regulators add inspections, standards, requirements, etc. to the mandate of operators. They add parts to the system, which makes the system more complex. Operators seek to make the system more efficient by grouping or automating these new parts. As a result, the “search for safety” begins to backfire.

Atomic Renaissance

With the nuclear industry having a renaissance moment, it’s worth keeping in mind that two things are simultaneously true: i) all accidents are preventable, but ii) we should not try to prevent all accidents. Three Mile Island, after all, resulted in no deaths. We faced a similar lose-lose dilemma with coal, back in the 1700s.

“Coal’s pollution may have been killing them slowly, but a lack of heat would have killed them quickly.”

Then, the choice was between freezing to death and pollution – an obvious choice at the time. Today, the choice might be between CO2 emissions and the occasional nuclear reactor scare. We’re safer than we seem, especially by historical standards. It’s easy to take that for granted, but it feels good once you stop.

References

  1. Three Mile Island accident, Wikipedia

  2. Searching for Safety by Aaron Wildavsky

  3. Navigating Safety by René Amalberti

  4. Coal: A Human History by Barbara Freese
