Japan is planning to soon release a million tons of radioactive water from the Fukushima power plant. Since the 2011 Fukushima nuclear disaster, water used to cool the crippled reactors has become contaminated and has been kept in huge storage tanks. Advanced water-treatment techniques have removed many of the radioactive substances from this stored water, but one pollutant, radioactive tritium, remains especially tricky to get rid of. Because tritium is a radioactive isotope of hydrogen – a key component of water itself – it cannot be filtered out and remains in the treated water.
Currently, tritium-contaminated water is filling Japan’s tanks to the brim, and the government maintains it has no choice but to release this water into the sea. This decision is fueling numerous controversies over the potential dangers of releasing nuclear wastewater into the ocean. It notably faces fierce opposition from Japan’s fishing industry, which has been scrambling to recover ever since the 2011 nuclear meltdowns.
As a social anthropologist working on this disaster, I am less concerned with the scientific debates over safety versus danger, and more interested in another battle that surrounds this decision: a linguistic one. For instance, when fishermen discuss their concerns, they at times use a specific narrative that accuses the authorities of treating the sea as a garbage dump. By contrast, state authorities and nuclear organizations like the International Atomic Energy Agency (IAEA) rarely talk about “dumping” wastewater in the sea. Instead, they use words like “release,” “disposal,” or even “dilution.” Words like “garbage,” “pollutants,” “contaminants,” or “waste” are utterly absent from these expert organizations’ vocabulary. When discussing tritium-contaminated water, for example, the IAEA prefers highly technical terms like “Advanced Liquid Processing System-treated water.”
These words are not random choices. They reflect distinctive ways of governing environmental risks in the aftermath of nuclear disasters. In particular, they echo what scholar Carol Cohn famously called “technostrategic language”: terminology that screens out particular realities of risk while preventing the expression of certain values. Cohn first described technostrategic language in the context of nuclear defense intellectuals, arguing that their specialized vocabulary allowed them to reject the idea that they, too, could become victims of the wrath of nuclear weapons.
Similarly, words like “discharge,” “dilution,” or “treated water” are part of governance techniques that have powerful symbolic functions. This language imbues post-disaster narratives with specific values, while shutting out alternatives. Let us examine some of the consequences of this technostrategic language.
First, technical words lend an aura of expertise, legitimacy, and control over things that cannot be governed, such as the slow accumulation of tons of contaminated water. There is nothing particularly new about such discursive tactics. For instance, in his book “Rule of Experts,” political scientist Timothy Mitchell argues that much of 20th-century politics rested on asserting human agency over nature, uncertainty, and crises. However, disasters like Fukushima, which experts had ironically described as “unimaginable, unforeseeable, or beyond expectation” (sōteigai in Japanese), show the world that a politics of total control is but an illusion.
Second, much like the phenomenon of radioactive decay – a process in which unstable atomic elements gradually transform into wholly different elements – moving discussions of contamination into the technical sphere transmutes the narrative of “waste dumping” into what appears to be a sound policy of “treated-water management.” In this way, unpopular images of nuclear wastewater being “dumped” into the Pacific Ocean are replaced by a trope of scientific scrutiny and vigilance, fostering trust in the authorities and public reassurance. While these terms help create an illusion of control by managing risk perception, they also obscure polluting practices that affect not just Japan but the whole world.
Third, the use of scientific jargon creates powerful hierarchical divisions between laypeople and experts. For instance, Japanese fishermen worry that the release of radioactive water will affect their livelihoods. Yet they can rarely compete against the technical lingo of reified expertise. When they voice concerns, they run a real risk of being depicted – and dismissed – as people who don’t understand the science behind this policy. Unfortunately, such a hierarchy is a recurring pattern after Fukushima. Sociologist Aya Kimura has shown that state authorities historically delegitimized citizens’ worries about radiation risks, dismissing them as the anxieties of anti-science individuals.
Following the 2011 nuclear catastrophe, numerous scholars have called for learning from this disaster, often in the hope of improving nuclear safety. Such calls include proposals for better regulatory frameworks, improved technological designs, or the promotion of novel energy policies. Yet one call that rarely comes up is the need to rethink the very terms, words, and analogies used to make sense of disasters and governance practices. It is well established that words powerfully shape the popular imagination, affecting how reality is perceived and acted upon.
Speaking of “dilution” or “discharge” when referring to the release of contaminated water might appear harmless. However, these words convey the same values that led to a nuclear disaster in the first place: an over-reliance on technological optimism, the pretense of human mastery over the forces of nature, and the hierarchical arrogance of scientific bodies.
“The pen is mightier than the sword,” the old adage goes. In that regard, it is time to think long and hard about how this figure of speech applies to the governance of nuclear risks.