© Springer Japan 2016
Jun Shigemura and Rethy Kieth Chhem (eds.), Mental Health and Social Issues Following a Nuclear Accident, DOI 10.1007/978-4-431-55699-2_5


5. Thinking Across Disaster



Kim Fortun and Alli Morgan (1)


(1) Department of STS, Russell Sage Laboratory 5114, Rensselaer Polytechnic Institute, 110 8th Street, Troy, NY 12180, USA

 



 




Abstract

This chapter describes aspects of the Fukushima disaster that were foreshadowed by other disasters, demonstrating the potential of comparative disaster studies. While acknowledging that disasters are always unique, emerging from complex drivers that produce cascades of interlaced effects, the chapter highlights recurrent patterns across disaster. It encourages the development of comparative disaster literacy alongside logistical plans for disaster – so that those involved are able to “read” patterns in disaster as they unfold.

Disasters are always unique, resulting from multiple failures (organizational, technological, educational, etc.) and producing cascades of effects (ecological, biological, emotive, conceptual, etc.) – all forcefully shaped by context. Yet there are also patterns across disaster that, when recognized, can orient disaster mitigation, preparedness, immediate response, and long-term recovery. The Fukushima nuclear plant disaster is no exception. Its tragic unfolding has been both unique and illustrative of patterns that recur across disaster. Building on other chapters in this volume, this chapter identifies dimensions of the Fukushima disaster that were foreshadowed by other disasters, pointing to structural similarities. It concludes with a call for disaster education that exposes students and professionals across disciplines to case studies illustrating structural similarities across disasters, enhancing their capacity to anticipate, analyze, and respond to disaster.


Keywords
Fukushima, Disaster studies, Disaster education



5.1 Anticipating Disaster


Very early on December 3, 1984, while people still slept, the introduction of water into a large storage tank of methyl isocyanate – a chemical used to make the pesticide Sevin – at the Union Carbide plant in Bhopal, India, led to the loss of the tank’s entire contents to the atmosphere, in what is now termed a “worst-case scenario.” The toxic gas cloud moved over the sleeping city, passing first over the very poorest communities living adjacent to the plant. People awoke thinking their neighbors were burning chili peppers. There was no evacuation plan; most ran into the night – into the plume of gas. At least 4000 died; many claim that the immediate death toll was 10,000 or more. Over half a million people were categorized as exposed. Many still have health problems today that they attribute to gas exposure or to continuing pollution from the old plant grounds.

The Bhopal disaster continues to be referred to as “the world’s worst industrial disaster.” But it is not beyond comparison. The Bhopal disaster had a unique constellation of causes and a unique cascade of impacts. Yet there are also patterns that we see again in Fukushima and other disasters. And Bhopal, like Fukushima and other disasters, was shaped by structural conditions.

Across disaster, for example, confidence in a technical resource – pesticides or nuclear power, say – often undercuts recognition of associated risks. Disaster is insufficiently anticipated. In India, the promise of economic development and food self-sufficiency through the chemical-intensive agriculture of the “Green Revolution” laid the ground for the Bhopal disaster, undercutting recognition of the risks of producing and storing pesticides. This promissory logic continues to undercut disaster response: in an effort to retain and encourage foreign investment in India – by companies like Union Carbide – the Indian Supreme Court in 1989 accepted an out-of-court settlement of the Bhopal case that many argue took better care of the company than of gas leak survivors.

Similarly, in Japan, it is clear that the promise of nuclear power as an energy source undercut recognition of associated risks. Hospitals and medical staff in Fukushima prefecture report having been almost completely unprepared for disaster, for example; it was assumed that there would never be problems with the nuclear power plants in the region. Deeply institutionalized overconfidence in the safety of nuclear power in Japan was carried and exacerbated by Japan’s media ecology. Public relations researcher Patricia Swann describes how reporting on the nuclear aspects of the disaster had a homogeneous tone and lacked substance, due both to close ties between major Japanese media outlets and TEPCO (Tokyo Electric Power Company) and to the exclusion of foreign correspondents, freelance journalists, and other “outsiders” not belonging to one of Japan’s “reporters’ clubs” from official press conferences. Swann argues that close ties among industry, government, and media eliminated incentives for reporters to ask challenging questions for fear of losing access to government officials. TEPCO’s annual advertising budget of 20 billion yen, which benefits the very newspapers that employ these journalists, compounded the problem [1].

Tight relations between government and potentially hazardous industries also bias risk assessment in many disaster contexts. Historian Jeff Kingston has written extensively on this in Japan, describing a “nuclear village” that has bound together elected officials, bureaucrats, academics, journalists, and people in the nuclear industry. “This is a village without boundaries or residence cards,” Kingston writes, “an imagined collective bound by solidarity over promoting nuclear energy. If it had a coat of arms, the motto would be ‘Safe, Cheap, and Reliable’.” [2].

Japan’s nuclear village has been sustained by a revolving door between industry and government, referred to with the Japanese term amakudari, or “descent from heaven,” in which senior government officials migrate to highly paid positions in the private sector, including the nuclear industry [3]. Even after the Fukushima disaster, these ties have continued to bind. Kingston reports, for example, that the Nuclear Regulation Authority, an administrative body established in September 2012 to ensure nuclear safety in Japan, was chaired by a vocal pronuclear expert and included several former members of the now-defunct Nuclear and Industrial Safety Agency (NISA), which had come under scrutiny after allegations that it influenced public hearings on the use of nuclear energy [2].

The problems of a revolving door between companies and the government agencies responsible for regulating them were also a flash point in the 2010 BP Deepwater Horizon disaster, resulting in the disbanding of the US Minerals Management Service because of conflicts of interest and associated regulatory failures [4]. After the Fukushima disaster, Princeton University physicist and policy expert Frank von Hippel published an editorial in the New York Times arguing that “it could happen here.” Von Hippel wrote that “nuclear power [in the United States] is a textbook example of the problem of ‘regulatory capture’ – in which an industry gains control of an agency meant to regulate it” [5].

Disaster warning systems also complicate the anticipation of disaster, through the false security that comes with the assumption that these systems will work without fail. Despite having the world’s densest concentration of seismographic monitoring and a population familiar with earthquake drills, Japan did not anticipate the scope of the March 2011 earthquake. The Japan Meteorological Agency’s early warning system functioned properly, for example, but in doing so alerted only those close to the epicenter of the quake, mirroring some of the communication failures seen in the 2010 Chilean tsunami warning system [6]. The system was further challenged by frequent aftershocks [7]. Misplaced faith in Japan’s newly completed tsunami barriers also undercut anticipation of the devastating flooding that would occur – echoing the misplaced confidence in the levees that failed in New Orleans during Hurricane Katrina in 2005.

Another early warning system, the System for Prediction of Environmental Emergency Dose Information (SPEEDI), also failed in the wake of the Fukushima Daiichi meltdown. Designed to predict the spread and intensity of a radiation release, the system was put into emergency mode upon TEPCO’s declaration of a nuclear emergency, ready to inform evacuation decisions. What was not anticipated was the loss of power at Fukushima Daiichi, which rendered the on-site measuring devices useless. The SPEEDI output – which, without power, consisted only of meteorological data predicting which way a hypothetical radiation release would travel – was not released for several days out of fear of “public confusion” [8]. As in many other disasters, plans for disaster failed to anticipate the multiple kinds of system failures that would almost certainly occur.

The complexity of the technical systems implicated in many disasters also makes anticipation of disaster difficult. Sociologist Charles Perrow pointed to this long ago in his seminal book Normal Accidents (1984). The industrial systems Perrow describes – nuclear power plants, chemical processing plants, and air transport networks – are made up of tangles of technical systems so tightly coupled that it must be considered “normal” for incidents to run away, exceeding what experts can understand, much less control [9]. The problem is exacerbated when technical systems are shaken by environmental conditions, as happened in both Hurricane Katrina and the Fukushima disaster. During Hurricane Katrina, one dimension of the disaster was the flooding of petrochemical processing facilities, which resulted in massive contamination of water and soil [10].

It is also possible to be prepared for the wrong disaster. Anthropologist David Bond provides a compelling example of this in his study of the BP Deepwater Horizon disaster in the Gulf of Mexico in the summer of 2010. Bond reports that after the Exxon Valdez disaster in 1989, the US National Oceanic and Atmospheric Administration (NOAA) committed to being better prepared for ocean oil spills. And the agency was better prepared in the summer of 2010, when British Petroleum’s Deepwater Horizon platform blew up and began gushing oil into the deep sea – prepared, however, for a surface oil slick like that of the Exxon Valdez, not for a spill gushing up, out of control, from a mile beneath the sea [11]. The operational and conceptual demands were markedly different.

NOAA’s misdirected preparation for the BP Deepwater Horizon disaster points to a recurrent paradox in disaster contexts: the way planning can actually undermine effective response. Immediate response to disaster always involves a multitude of actors with varying fields and levels of expertise. While disaster management plans may establish roles and responsibilities for these actors to assume, disaster rarely unfolds as these plans anticipate. And the very existence of such plans can undermine attunement to actual conditions at hand.
