The Whale and the Reactor

Langdon Winner. 1986. The Whale and the Reactor: A Search for Limits in an Age of High Technology. University of Chicago Press.


This book is an examination of technology from the perspective of social and political action. Winner frames it explicitly as a critique, purposefully pointing out that critiques of technology have not been understood as carrying the positive intent they are meant to carry, instead being labelled "anti-technology." His central question, in searching for the limits of technology, is where we will decide to draw the line.

  • Part I: "How can one look beyond the obvious facts of instrumentality to study the politics of technical objects? Which theoretical perspectives are the most helpful?" (pg. x).
  • Part II: Modern social movements that have centered a technology in both their hopes and fears, and the opportunities and pitfalls of centering technology in this way. An examination of the romanticization of technology as a democratizing power.
  • Part III: The politics of language as applied to technology, which he views as narrow and "conceptually impoverished" (pg. x). "What would it take to open up the conversation about technology to include a richer set of cares, categories, and criteria?" (pg. x).

Part I: A Philosophy of Technology

Chapter 1: Technologies as Forms of Life

Winner begins his critique by noting the lack of engagement with any philosophy of technology, or even a well-defined statement of technology's basic issues, despite booming interest in the subject. He writes that there is, as of 1986, no philosophy of technology. He states that "despite the fact that nobody would deny its importance to an adequate understanding of the human condition, technology has never joined epistemology, metaphysics, esthetics, law, science, and politics as a fully respectable topic for philosophical inquiry" (pg. 4). A philosophy of technology is meant to critically examine the nature and significance of technology to humans. He further criticizes engineers for seeming "unaware of any philosophical questions their work might entail" (pg. 4).

He posits the idea of "progress" as holding back a philosophy of technology, where society has adopted an "unflinching confidence" in technological development as serving human benefit. The general, taken-for-granted approach to technology has been a focus on use. Winner argues this is accompanied by "a range of moral context," such as tools being used well or poorly, for good or for ill (pg. 6). He argues that the "promiscuous utility" of technologies has led to them being regarded as fundamentally neutral. Yet human history has shown that technologies are not simply tools, but can reshape human society and behaviors. He describes how social science has tried to "awaken the sleeper" of "technological somnambulism" through technology assessment, but this has tended to view technology as the "cause" and everything else as the "effect" (pg. 10). He believes this ascribes a forceful adaptation onto humans, as if their destiny is to adapt to technological changes. Against the "empirical and moral shortcomings of cause-and-effect models," he sets the recognition that changes in human behavior take place /as/ technology is built. These are "new worlds being made" (pg. 11). He refers to technologies as "forms of life" - not simply tools, but cultures that become "woven into the texture of everyday existence" (pg. 12). In some cases, technology has "added fundamentally new activities to the range of things human beings do" and has "begun to alter the very conditions of life itself" (pg. 13). His perspective follows Marxist materialism, in which our interactions with and reshaping of material things also change us.

However, he critiques both Marx and Wittgenstein, whom he has referenced throughout this chapter, as providing only passive philosophies for examining technology. Wittgenstein accepts forms of life as "a given," paralleling Winner's critique of technological somnambulism. Marx and Engels rejected deliberately steering technological development, believing it to be inherently good and to be encouraged; for them, technological development exists alongside, and encourages, the inevitable shift from capitalism to socialism.

Winner questions: "As we 'make things work,' what kind of world are we making?" We must assess not only the tools, but "the production of psychological, social, and political conditions as a part of any significant technical change" (pg. 17).

Chapter 2: Do Artifacts Have Politics?

Winner opens the chapter discussing Lewis Mumford's two political technologies: the first authoritarian, system-centered, powerful but unstable; the second democratic, human-centered, weak but resourceful and durable (pg. 19). Interpreting technologies through political language is, despite the insistence that technologies are neutral, not new. They are often portrayed as great democratizers, enabling social justice and freedom. Even while technologies are interwoven into human life, including politics, the technologies themselves are viewed as apolitical, and viewing them as political is positioned as ignoring the true source of value: humans. Consistently, discourse suggesting that artifacts have politics is met with criticism, which insists that politics derive only from the social and/or economic system in which a technology is embedded. While Winner finds this wise in that it acknowledges the social systems engaged in technology rather than simple cause and effect, he criticizes this view for acting as if "technical things do not matter at all" (pg. 21).

Instead, he proposes a theory of technological politics that focuses on the characteristics of technologies and what they mean - as political phenomena. He argues this does not replace the social determinist model, which suggests all values come from society, but complements it. By politics, Winner is referring to relationships of power and authority.

He highlights two ways artifacts contain politics: (1) Instances where a technology is used to settle an issue within a specific community; and (2) technologies that are most compatible with specific types of political relationships.

1. In the first case, Winner discusses technologies built with a specific social intent in mind; such technologies often outlive their creators and continue shaping society with that intent. Their political consequences "precede the use of the things in question" (pg. 25). In other words, oppression, authority, and power are not simply effects of a neutrally designed technology; a technology can be specifically designed and built to produce a particular set of consequences even before it is used by a human being. Further, a technology may have a general use (e.g., Moses' bridges for automobile access) but serve other purposes beyond that immediate, general use (e.g., the bridges purposefully barring buses). Technologies also do not need to be designed with malicious intent to be harmful (e.g., technologies that are ignorantly designed to be inaccessible). Winner states that all technologies are designed to favor some social interests over others. He argues that technologies are similar to legislation and public frameworks in that flexibility vanishes once commitments are made to a specific design.

2. In the second case, Winner discusses inherently political technologies - the idea that some technologies could be, in their very nature, political. Certain technologies are so inflexible that choosing them is "to choose unalterably a particular form of political life" (pg. 29). Numerous theorists, from Engels to Plato, have discussed the ultimate authority demanded by certain technologies, like that of a ship. Running a ship requires authority; it is not a democratic process, and it necessitates that people act in specific ways. A ship is a technology that requires "central rule and decisive action," much like government (pg. 31). He highlights two ways technologies can be viewed as inherently political: (1) "The adoption of a given technical system actually requires the creation and maintenance of a particular set of social conditions as the operating environment for that system" (pg. 32) - the technology can only exist if certain political circumstances also exist. (2) A weaker version of (1): a given technology is compatible with, but does not require, specific political circumstances. Some technologies accommodate some politics more than others, but other types of political structures can still exist alongside them. Technologies that are inherently political require specific human behaviors: "the intractable properties of certain kinds of technology are strongly, perhaps unavoidably, linked to particular institutionalized patterns of power and authority" (pg. 38). The choices made in designing something and deploying it solidify the type of impact it will have, and interventions by alternative social systems cannot alter the technology's political effects.

He states that both 1 and 2 can overlap and intersect - a technology can be flexible in some ways but rigid in others. He feels it is important to closely examine the politics of artifacts given how willing people are to change their behaviors and lives to accommodate new technologies, despite resisting similar changes proposed on purely political grounds.

Chapter 3: Techne and Politeia

Here, Winner argues that all technologies "must be scrutinized to see whether they are friendly or unfriendly to the idea of a just society" (pg. 40). He lays out Plato's treatment of politics as technology: "techne," a kind of strategic craft, applied to "politeia," the order of human relationships within a state. In Plato's works, techne serves as a model for politics, but not vice versa; Winner states that, since then, techne has become politeia - technologies shaping society, and not solely the reverse. He writes how, during the Industrial Revolution in the U.S. and Europe, technical systems began competing with political institutions for power, authority, and loyalty. Historically, philosophers like Plato were wary of mixing civic virtue with materialism; technical innovation was to be approached with caution because it was seen as antithetical to civic virtue.

He contrasts this with the rise of republican political thought and capitalism, particularly in the United States, where self-interest and materialism began to be posited as the key to civic virtue. Abundance and freedom became equated values, and the drive to generate wealth welcomed new technologies as a blessing. At the time, material resources in the U.S. were viewed as so abundant that class conflict should not arise - there was enough to go around. New technologies were embraced as automatically contributing to a good society. The key was increasing efficiency, so that one society would not lag behind its global competitors.

He traces how industrial production led to specific institutional patterns that have largely placed power and social control in large corporations, bureaucracies, and the military. More people began to work in huge technology-based institutions than ever before. Given the desire to do things in the most efficient and productive manner, humans began to operate in a bureaucratic manner that established highly structured relationships of command. He calls the workplace "undisguisedly authoritarian" (pg. 48). He argues that as religious and traditional hierarchies crumbled, technology offered a way to restore hierarchy - a "godsend for inequality" (pg. 48). Large organizations now exercise power over the social and political interests that are meant to control them. A synthesis of social, political, and technical phenomena has driven these societal shifts.

He argues that political wisdom is, "by and large, missing in those who design, engineer, and promote vast systems" (pg. 50). What reigns instead is an obsession with the quest for profits, organizational control, and the simple enjoyment of innovation, with no interest in how this work might shape the structure of society or justice.

He lists five concerns for which the public actually considers limiting technology: (1) it threatens health or safety; (2) it threatens to exhaust a vital resource; (3) it degrades the environment; (4) it threatens natural species and wilderness; (5) it causes "exaggerated" social stresses and strains. While he feels these are worthwhile concerns, they "severely restrict the range of moral and political criteria that are permissible in public deliberations about technological change" (pg. 51). He views the question of how "society can establish forms and limits for technological change" based on a desire for "a positively articulated idea of what society ought to be" as a challenge of utmost importance (pg. 52).

He believes that if we establish technology as a political exercise, then we can identify each political commitment of that technology to see the choices we must make about social and political life. The age of high technology must also ask "What is the best form of political society?" (pg. 52). "Because technological innovation is inextricably linked to processes of social reconstruction, any society that hopes to control its own structural evolution must confront each significant set of technological possibilities with scrupulous care" (pg. 54).

Part II: Technology: Reform and Revolution

Chapter 4: Building the Better Mousetrap

The concepts of "appropriate," "intermediate," and "alternative" technologies were proposed in the 1960s as means of addressing the "issues" facing "Third World" countries. In the 1970s, such technologies were also proposed as solutions to the problems of industrialized societies: if the right kinds of technologies were used, society could be cured of its ills. In this chapter, Winner questions what makes a technology "appropriate" - for whom and for what? Particularly in the case of industrialized nations, it had seemed their culture and norms had led them to these social ills, so an appropriate technology would have to challenge those norms.

Winner points out that the rising number of institutes, businesses, governments, and researchers claiming to develop "appropriate" technology did not share a common ethos. Some viewed appropriate technology as compatible with the capitalist system; others embraced a Marxist perspective. An ideology arose around "soft" technologies (e.g., small energy inputs, ecologically sound, democratic) versus "hard" ones (e.g., high pollution rates, large energy inputs, destructive to cultures and species). Winner argues these typologies were destined to fail because the desired characteristics were not necessarily compatible within one system.

The enthusiasm for New Age appropriate technology ended in 1980 with the election of Reagan, as non-military economic concerns moved back to the private sector.

Chapter 5: Decentralization Clarified

The notion of decentralization is to take enormous centralized institutions and break them into smaller parts that are more accessible to democratic control. Calling for decentralization has been a consistent theme among appropriate technologists and those disenchanted with the perceived destructiveness of sociotechnical organizations. Winner argues the term is vague and overused; this chapter is dedicated to clarifying what decentralization means.

He breaks down the word's linguistics, which seem contradictory: de- meaning undoing, -ization meaning becoming, central indicating some unspecified center - "undoing the process of becoming central" (pg. 86). To centralize or decentralize something, one first has to identify the activity or behavior at issue, to locate the center of the problem. One has to determine how many centers there are, where they are located, how much power they possess, and their cultural diversity. The number of centers could be one or many, and deciding how many to decenter depends on the issue. The center may be located in one geographic place, in many geographic places, or in a social place (e.g., an influential figure). Then there is the problem of whether those places are accessible or not. Who holds power shapes who and what has social, economic, and political control over de/centralization. And will decentralizing promote a diversity of cultures, or the vitality of a space?

Winner posits this as an interesting concept for technology, around issues of ownership, sources, conditions of production, distribution, and consumption, and use of goods and services. Technology companies, while they may not own or produce the majority of the world's or a country's specific technology or good, are still often highly centralized within themselves. What economic and efficiency tradeoffs are people willing to make for decentralized technology companies?

Winner also argues that technologies themselves are centralized. We do not know how to make, control, or repair even the most commonly used everyday objects. Many artifacts come from the same centers of production and distribution. Alongside technological "progress," people find themselves reliant on centralized systems they have little power to influence. He states that "to decentralize technology would mean redesigning and replacing much of our existing hardware and reforming the ways our technologies are managed" (pg. 96).

Chapter 6: Mythinformation

In this chapter, Winner questions the claims of an Information Revolution driven by the increasing utility of computers. He critiques, in particular, technocrats, journalists, and the public who care only about what's new and positive about computers, not what could go wrong with them. He explores the ignored questions of such a revolution, like how it might shift the locus of power, whether a computer revolution would be committed to specific social ideals, or whether it would promote class privilege. He feels the struggle among technologists to stay current and novel prevents them from thinking critically about the implications of their inventions. He laments the lack of any real, thorough analysis of the Industrial Revolution in comparison to the Information Revolution; instead, those touting a computerized revolution take an ahistorical perspective.

The promise of the computer is universal access to valuable information, the erosion of class, evaporating difference, and a democratic and egalitarian society. This is particularly fascinating to read now, in the age of mis/disinformation, class divide, filter bubbles, extremism, and conspiracy theories. He calls these utopian beliefs "mythinformation," the "almost religious conviction" that computers would produce a better world for all. He contradicts many of the utopian views, such as technocrats' views on computers altering social power, by stating they misrepresent the direction in which power will shift. Computers benefit large corporations, and the majority of growing jobs are menial service tasks with low wages. He writes: "Current developments in the information age suggest an increase in power by those who already had a great deal of power ... an augmentation of wealth by the already wealthy" (pg. 107). He predicted that the computer revolution would most likely have a conservative character. He posits that the only way to ensure a democratic, decentralized equity would be for society to carefully decide how to design systems toward that goal.

He feels that computer romantics wield four assumptions: (1) people lack information; (2) information is knowledge; (3) knowledge is power; (4) more access to information increases democracy and equality. He states "the formula information = knowledge = power = democracy lacks any real substance" when analyzing the current abundance of information but lack of knowledge, power, and civic participation in society (pg. 113). He calls the mythinformation an ideology, in terms of a belief system, which values information even above knowledge. Mass amounts of data are viewed as beneficial, despite the challenge of intelligibility. He questions the supposed "need" for information "in a social world filled with many pressing human needs" (pg. 114).

He then thinks through the potentially negative outcomes of all this information, not solely as a "misuse" of a system but as a feature of the system's design itself. The computer revolution "may achieve wonderful social conveniences at the cost of placing freedom, perhaps inadvertently, in a deep chill" (pg. 115-16). He also expresses concern over the lack of spatial and temporal barriers in computers and telecommunications, given that humans have traditionally found meaning in specific spaces and times. He discusses how technology corporations gain the power to move elsewhere, committing to no one locale, forcing those locales to negotiate with them to keep them in place.

Part III: Excess and Limit

Chapter 7: The State of Nature Revisited

In this chapter, Winner engages with the concept of nature - the idea that one way forward may be natural or unnatural, depending on the perspective. He states that no society has ever taken nature to mean "the totality of all things," instead "select[ing] particular features for emphasis, endowing them with esthetic, moral, and political significance" (pg. 121). Often, versions of nature are entirely contradictory to one another. He outlines in the majority of the chapter three ways Nature has been viewed: (1) Nature as a stock of economic goods; (2) Nature as an endangered ecosystem; (3) Nature as a source of intrinsic good. In the end, he agrees with Lukacs that nature is a social category - the totality of things interpreted with meaning.

Chapter 8: On Not Hitting the Tar-Baby

Risk assessment is the most common way of limiting technology. Winner believes that if we define risk broadly as "everything that could conceivably go wrong," then we could develop morals and norms to guide technical limits. However, "the promise of risk assessment is difficult to realize" (pg. 138). Methods for assessing risk can carry high stakes, as in assessing the potential risk of nuclear power. Determining the threshold of safety and the threshold of acceptable chance of harm are also decisions to be made. Further, there are social, political, and economic interests that may shape what risk and safety look like. Risk assessment, in practice, tends to lean toward the status quo.

He argues that framing dangers as risks redefines the importance of those dangers. It adds a layer of uncertainty to the danger, for example, between the cause and effect of pollution and cancer. How certain are we there is a link? What is the threshold for pollution to cause cancer? The issue becomes about relative size and certainty of harm. Winner states that risk is scientifically measurable, and is therefore subject to the constraints of methodologies and rigor. Then, when unable to find the risk relationship between a thing and its harms, "norms that regulate the acceptance or rejection of the findings of scientific research become, in effect, moral norms governing judgments about harm and responsibility" (pg. 143). When uncertainty about a risk exists, often things - inventions and their uses - carry on as normal until there are "better" research findings.

Alongside risk is often cost, a relationship perceived as the tradeoff between risk and cost. How much is worth spending to mitigate risk? Is the cost worth the benefit? Who pays for reducing risk? If the invention of technology is a benefit to the capitalist market, then risk is always framed as a cost. Harm is weighed against possible gain. He criticizes the weighing of two risks against one another - for example, the argument that it is irrational to fear the risks of nuclear energy when one drives a car, a technology with high injury and death risk. Such comparisons are used to argue against fears of new technologies, suggesting that "normal folk are able to overcome such phobias by reminding themselves of the incalculable good that modern technologies have brought" (pg. 146). Risk-taking is often appreciated and encouraged in a culture that values economic risk.

Winner writes: "the risk debate is one that certain kinds of social interests can expect to lose by the very act of entering" (pg. 148). In other words, how risk is currently framed disadvantages social interests; a market of production and consumption is valued above them. By choosing to frame concerns as risk, he argues, a culture "tacitly accept[s] assumptions they might otherwise wish to deny ... that the object or practice that worries them must be judged in light of some good it brings" (pg. 149).

He proposes some possible responses to the risk debate. One might be to take all of the issues involved in public health and safety and assess what standards, methods, models, and findings are most appropriate. Similarly, one might point out the shortcomings of cost/benefit analysis by questioning the validity of measuring cost and benefit at all. If social issues are misdefined by cost/risk analysis, then such analyses should be resisted.

Chapter 9: Brandy, Cigars, and Human Values

"We have come to the last resort. The search for resonable moral limits to guide technological civilization" (pg. 155). He writes that "nature" is too ambiguous for guidance, and "risk" associated with a gamble, so here he discusses "values." Yet, he acknowledges that values is "seen as a symptom of deep-seated confision, an inability to think and talk precisely about the most basic questions of human well-being and the future of our planet" (pg. 156). He differentiates values from "cares, commitments, responsibilities, preferences, tasks, religious convictions, personal aspirations, and so forth" (pg. 156). Value has been defined as the worth of something in early political economist theories, and the sum total of principles by philosophy. Empirical science and the decisions made in conducting it are also value-laden, meaning facts and values cannot be separated, even though value has become synonymous with a subjective phenomenon. He writes that "the shift in the use of this term from an objective to a subjective meanign is strongly linked to a change in how we view our situation" (pg. 158). It is less about thinking through the quality of things or societal conditions and more about assessing someone's internal, individual emotions and thoughts. He feels the most important consequence of this shift is no longer investing in shared reason for action. A cure for the vague and hollow term "values" would be to focus on something intentionally more specific, whether we really mean norms, preferences, motives, etc.