UDC 629.039.58; 007.51
Junichi Murata – Professor, Department of Philosophy, Faculty of Letters, Rissho University. Tokyo, Japan.
Japan 141-8602, Tokyo, Shinagawa-ku,
4-2-16 Ohsaki, 2nd Building, Room 1108.
Background: There exist at least three conceptions of technology. Firstly, technology is considered a closed system constituted only of technological factors in the narrow sense of the term. Secondly, a new approach in technology studies, the social constructivist approach, makes it possible to look inside technology, which is no longer treated as a black box. Finally, the so-called actor network theory examines the socio-technical network as constituted equally of human beings and of various other factors, such as artifacts and natural beings. Such a multidimensional approach offers a genuine opportunity to understand the nature of modern technology and the reasons for disastrous accidents such as the one at Fukushima Daiichi (No. 1).
Results: There is an interactive relationship between technology and its environment. When transferred to another social or natural environment, technology inevitably begins to work in a different way. The great complexity of this interdependent system (technology and its environment) naturally gives rise to accidents that could not be predicted beforehand; thus so-called ‘normal accidents’ occur. In the case of Fukushima, the probability of such accidents increased as the so-called ‘safety myth’, that is, the belief in the safety of nuclear power plants, became widespread. This ‘safety myth’ concerning complex technical systems had a great influence on those who built and operated the Fukushima Daiichi (No. 1) nuclear power plant.
Research implications: The social determination and imperfect predictability of the functioning of complex technical systems manifest themselves in different spheres of human activity. These manifestations are clearly seen in the analysis of the largest-scale accidents, such as those at chemical factories (the Union Carbide plant in Bhopal, India), space shuttle accidents (Challenger and Columbia), and nuclear power plant accidents (Three Mile Island, Chernobyl, and Fukushima).
Conclusions: Technology cannot be understood as something that can function as a closed system independent of various environmental factors. It always functions as an open system interacting with various factors, which include unknown elements. Philosophy must intervene to break the spell of the ‘safety myth’ and clarify what we can and should learn from Fukushima in order to reduce the probability of such catastrophes in the future.
Keywords: Fukushima nuclear power plant, technology and its environment, ‘normal accident’, actor network theory, ‘safety myth’, philosophy and technology.
About two and a half years ago, at 2:46 p.m. on March 11, 2011, a massive earthquake, measured at magnitude 9.0, hit eastern Japan. It was followed by a giant tsunami, which swept away people, cars, houses, and even whole communities over a vast stretch of the east coast of the Tohoku region.
But what made the Great East Japan Earthquake truly historic were the accidents at the Fukushima Daiichi (No. 1) Nuclear Power Plant. Japanese officials afterwards assigned the crisis there a rating of level 7 on the International Nuclear Event Scale, making it the only nuclear crisis since the 1986 Chernobyl disaster to be assessed so severely.
About two and a half years have already passed since the accident. However, more than 200,000 people are still compelled to remain outside the evacuation area and cannot return to their homes. Because of the high level of radioactivity, nobody can enter the area near the site of the accident, and many issues concerning the causes of the accident remain unclear. Last year, four reports on the results of investigations into the causes of the accident were made public by four organizations, including the Investigation Committee of the Government and the Investigation Commission of the National Diet of Japan. However, concerning one of the most important points, that is, the question of whether the earthquake contributed to the disastrous accident before the tsunami struck, the reports arrive at different conclusions.
Given this situation, there remain many things that we can and must discuss. However, in my talk I would like to concentrate on one aspect of the accident at the nuclear power plant, and clarify what we can learn from the case of Fukushima from the viewpoint of the philosophy of technology.
What I would like to emphasize is that technology should not be considered a closed system constituted only of technological factors in the narrow sense of the term, but must be considered an open system related to and constituted of various factors, including social, cultural, and natural environmental ones. In this sense, technology is inherently multidimensional.
In my talk I would first like to show that the case of Fukushima teaches us in a negative way how important it is to take seriously the multidimensional character of technology and, secondly, that many causes of the accident, which have since been pointed out, originate in a lack of understanding of this characteristic of technology.
1. Technology and the Environment
Technology is sometimes understood with the image of a machine, which can function properly everywhere independently of the environment where it is used. Given this image, people tend to understand technology as something that stands in contrast to a social, cultural, or natural environment, and has some power that influences and sometimes determines the situation of a social, cultural, or natural environment. Technological determinism, as understood in the broad sense of the term, is one of the most popular ways of understanding technology in philosophy, as well as in everyday life.
The Baconian idea of technology as the power of domination and the use of nature for the benefit of human beings is a popular one, but many conceptions proposed by the great philosophers, such as M. Heidegger’s “Gestell” or M. Horkheimer’s “domination of instrumental reason” can be interpreted in this way. Even in the field of environmental ethics, this way of understanding is dominant, as demonstrated in discussions on anthropocentrism and anti-anthropocentrism. Technology, understood as the power of human beings, is sharply contrasted here with the natural environment.
This way of understanding technology began to change in the 1970s, when a new approach in technology studies, called the social constructivist approach, appeared in the sociology and history of technology and concentrated on concrete micro-level analysis of the developmental process of technology [1]. The influence of this new approach was not restricted to the fields of sociology and history, but extended to discussions in the philosophy of technology.
On the basis of this social constructivist approach, philosophers are now able to look inside technology, which had long been closed off and considered a black box, and find there not only technological factors but also various social factors relating to economics, politics, culture, and values. As the social constructivists’ analysis of the developmental process of bicycles impressively shows, the developmental process from technological design and production to the use and diffusion of technological artifacts is not determined by a single factor of technological rationality or efficiency, but is open to various factors originating in various fields. For example, in the first phase of the development of bicycles at the end of the 19th century, the design underwent a great change from the first popular model, the penny-farthing, which was preferred mostly by young men who enjoyed high speeds, to the present model, which everyone, including women, uses in everyday life. In this way, the meaning of a bicycle is constituted by multiple factors, and in this sense technological artifacts must be considered as having interpretative flexibility.
On the other hand, we must be careful to note that social constructivists do not simply invert the claim of technological determinists: where determinists claim that technology determines society, constructivists do not claim that society determines technology. Rather, they emphasize that a society without technology is as impossible as technology without a society is unthinkable. Technology and society are two sides of the same coin and constitute a complex system that can be called a socio-technical system or a socio-technical network.
This characterization of technology can be found also in the so-called actor network theory, which illustrates this characteristic more clearly. According to this view, the socio-technical network is constituted equally both of human beings and of various other factors, such as artifacts and natural beings.
On the basis of this way of seeing the relationship between technology and various other factors, I would like to emphasize the interactive relationship between technology and its environment. While technology influences and constitutes the social, cultural, and natural environment, various environmental factors influence and constitute how technology is developed and realized. Technology and environment are interdependent and closely connected in multiple dimensions.
If we take this interactive relationship between technology and its environment seriously, we cannot simply say that a technology transferred from one environment to another remains the same. Indeed, as Lynn White, a historian of technology, explained, while in the late Medieval period of Europe windmills became important as power plants, “In Tibet windmills are used only thus, in the technology of prayers; in China they are applied solely to pumping or to hauling boats over lock-sides, not for grinding grain; in Afghanistan they are engaged chiefly in milling flour” [5, p. 86]. Neither can we simply say that nuclear power plants constructed on firm and stable ground and those constructed in an environment where earthquakes occur frequently are the same technology. Without serious work of translation, no technology can be successfully transferred from one environment to another.
With regard to this multidimensional character of technology, the process of introducing the power plant in Fukushima must be regarded as fundamentally problematic. In the mid-1960s, the first unit of the Fukushima Daiichi power plant, originally developed by General Electric (GE) in the US, was introduced mainly at the initiative of GE. During the introduction process, an earthquake-resistant design was added, in line with the earthquake standards in Japan at that time. But, as the Investigation Commission of the National Diet indicates, it is questionable whether this additional design was sufficient. In any case, although various measures were taken to improve resistance to earthquakes and tsunami, the basic framework of the original design was never questioned. That means people continued to think that the core technology of a nuclear plant could function independently of environmental factors.
In this respect, it is interesting that the Investigation Committee of the Government pointed out the lack of a complex disaster viewpoint as the fundamental cause of the accident.
If we take this complex character of a disaster seriously, in general, disasters cannot simply be differentiated as man-made disasters and natural disasters. Just as the concept of technology is multidimensional, the concept of an accident and a disaster must also be considered to be multidimensional.
2. Technology and the Accident
If a socio-technical network is multidimensional and exhibits complex behavior that is determined neither by technological factors nor by environmental factors alone, this network shows the characteristic of under-determination, because there is no guarantee that it constitutes and maintains a harmonious and stable unity under various circumstances. It is well known that technology always brings about unintended consequences during the process of its development and use. This characteristic becomes conspicuous in the process of technology transfer, but it can generally be seen everywhere technology is developed and applied. Edward Tenner expresses this unpredictable and unmanageable aspect of technology with the memorable phrase “technology bites back” [4]. While this character can sometimes be regarded as an origin of creativity, it is at the same time the origin of failures and accidents involving technology.
The clearest cases are various large-scale accidents involving high-risk technological systems, such as accidents at large-scale chemical factories (the Union Carbide plant in Bhopal, India), accidents involving space shuttles (Challenger and Columbia), and accidents at nuclear power plants (Three Mile Island, Chernobyl, and Fukushima). Charles Perrow, a sociologist, claims that complex technological systems, in which many factors are inseparably and closely connected, always raise the possibility that accidents will occur in an unpredictable, inevitable, and incomprehensible way, and calls such accidents “normal accidents” [3]. Nowadays, we are confronted with various problems related to accidents of this kind. It is already almost 30 years since Ulrich Beck proposed the provocative concept of a “risk society” to illustrate our present situation.
Until now, there have been various attempts, including Perrow’s normal accident theory, to explain and understand this kind of accident. However, we must be careful here, because to understand “normal accidents” means nothing other than to understand something that includes incomprehensible factors; and in this sense, an attempt to understand accidents of high-risk technologies implies something paradoxical. I think this is one of the reasons we should separate an investigation of the causes of accidents from the (legal) question of who is responsible for an accident.
If we are obliged to determine who is responsible, we must explain the process of occurrence with a precise causal relation, which must be understandable as if the accident had been predictable and avoidable. But this causal story has meaning only in hindsight. In contrast, people confronting an accident in real time must judge everything in an uncertain situation, without hindsight to help them. In this sense, the post hoc causal story cannot but neglect and eliminate incomprehensible factors; therefore, it is difficult to learn from it and acquire helpful hints for the future.
Understanding a “normal accident” is nothing less than understanding it as an occurrence that includes some incomprehensible factors. Perhaps you might think this characterization is dubious. But this paradoxical characteristic of technological accidents has been well known for a long time. At the beginning of his Dialogue Concerning Two New Sciences, Galileo Galilei showed impressively that events happen contrary to expectations; in particular, a precautionary measure can have a disastrous result [2, p. 5]. Recently, the Report of the Columbia Accident Investigation Board of NASA indicated that changing explicit rules and institutional structures is not sufficient to avoid future accidents, because such changes always have the potential to produce new risks. According to the Report, what is necessary is a fundamental change of attitude in the cultural dimension: “The [Space Shuttle] Program must also remain sensitive to the fact that despite its best intentions, managers, engineers, safety professionals, and other employees can, when confronted with extraordinary demands, act in counterproductive ways” (Report 2003: 181).
“Organizations that deal with high-risk operations must always have a healthy fear of failure – operations must be proved safe, rather than the other way around.” (Report 2003: 190)
These sentences suggest clearly where we should search for the resources to avoid future possible accidents. They are not found in ethics in the narrow sense of the term, because the “best intentions” people might have cannot contribute to preventing failures. Rather “sensitivity” to possible accidents and “a healthy fear of failure” must play a decisive role. This recommendation of the investigation board of the Columbia accident seems to show that it accepts the paradoxical character of “normal accidents” and finds a possible way to respond to this paradox. If unpredictable accidents are inevitable with technological systems, it is not sufficient to consider only explicit factors within the systems. Rather, we must take into consideration unknown factors that lie outside the explicit and rational understanding of the systems. The case of Fukushima shows how difficult it is to understand these circumstances. Confronted with the disastrous results of the severe earthquake, tsunami, and nuclear power plant accident, many experts used the phrase “beyond assumption” or “beyond prediction,” meaning that what happened could not have been predicted.
In this respect, the disastrous accident at the Fukushima Daiichi (No. 1) power plant is to be considered a typical case of a “normal accident.” It can be characterized as “normal” because only after the accident was it made clear that the people involved in the development and control of the power plant had continued to think that its multiplex protection system was sufficient to avoid a possible severe accident, and had not learned from the accidents at Three Mile Island and Chernobyl. Until the accident occurred, they continued to assume that severe accidents at nuclear power plants would never occur in Japan, and they missed the chance to improve safety measures when recommendations and criticisms were made from various fields.
To illustrate these circumstances, almost everyone began to use the term “safety myth,” that is, the belief in the safety of nuclear power plants that takes no serious account of the possibility of accidents, a belief which ensnared people connected with nuclear power plants for a long time, until the accident occurred.
The myth played an important role in the history of nuclear power plants in Japan. For example, in the mid-1990s, an unexpected accident occurred at the Monju fast-breeder reactor in Tsuruga, Fukui prefecture, which had a serious impact on the public at that time and brought about mistrust of nuclear technology. In the face of this situation, the Nuclear Safety Commission issued a White Paper, in which it emphasized the need to recover people’s trust in nuclear technology and to spread and establish anshin (a feeling of safety) among the public. The Commission did not think it more important to improve the safety of nuclear technology itself, because it reasoned that if safety measures were improved, people would tend to think that nuclear technology had not been sufficiently safe, increasing their mistrust of the technology. Not improving safety but promoting a feeling of safety among the public was considered the way to establish nuclear power plants in Japan. In fact, since then the word anshin has come to be used very often in public, not only in the nuclear technology field but also in other fields. This story shows one of the typical ways the “safety myth” played a social and cultural role in Japan.
What is important and surprising now is that this perverse structure related to the “safety myth” seems to have persisted after the accident. Even now, we still often hear people use the word anshin (feeling of safety) together with anzen (safety) in various fields, as if saying anshin were as important as anzen itself. I think this is exactly where philosophy must intervene to break the spell of this myth and clarify what we can and should learn from Fukushima.
I think all of this indicates that we need to change the way we understand the meaning of technology. Technology cannot be understood as something that can function as a closed system independent of various environmental factors; rather, it always functions as an open system interacting with various factors, which include unknown elements. Just as the question of how to respond to others is of prime importance in all of the ethics of human beings, responding to “the other” is the central task of technology.
1. Bijker W., Pinch T., Hughes T. The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology. Cambridge, Massachusetts, The MIT Press, 1987.
2. Galileo G. Dialogue Concerning Two New Sciences, translated by Crew H., de Salvio A. New York, Dover Publications, 1914.
3. Perrow C. Normal Accidents: Living with High-Risk Technologies. Princeton, Princeton University Press, 1999, 386 p.
4. Tenner E. Why Things Bite Back: Technology and the Revenge of Unintended Consequences. New York, Alfred A. Knopf, 1996, 346 p.
5. White L. Jr. Medieval Technology and Social Change. Oxford, Oxford University Press, 1962.
© J. Murata, 2013