For the sake of argument, let’s assume that what the scientists are saying about global warming – that we are headed for all manner of catastrophic changes in the environment unless fossil fuel emissions are drastically reduced, immediately – is accurate.
Also for the sake of argument, let’s assume that the world’s political leaders and the citizens they represent are sane, and that, therefore, they would like to avoid those catastrophic changes in the environment.
Assuming both propositions to be true, it would seem reasonable to ask ourselves whether it’s possible to take the actions necessary to forestall those changes. In order to answer yes to that question, we will need to overcome a series of challenges that can collectively be described as technological autonomy.
Technological autonomy is a shorthand way of expressing the idea that our technologies and technological systems have become so ubiquitous, so intertwined, and so powerful that they are no longer in our control. This autonomy is due to the accumulated force of the technologies themselves and also to our utter dependence on them.
The philosopher of technology Langdon Winner refers to this dependence as the “technological imperative.” Advanced technologies require vast networks of supportive technologies in order to properly function. Our cars wouldn’t go far without roads, gasoline, traffic control systems, and the like. Electricity needs power lines, generators, distributors, light bulbs, and lamps, together with production, distribution, and administrative systems to put all those elements (profitably) into place. A “chain of reciprocal dependency” is established, Winner says, that requires “not only the means but also the entire set of means to the means.”
Winner, whose book Autonomous Technology is the seminal study of this issue, also points out that we usually become committed to these networks of technological systems gradually, not realizing how intractable our commitments will become. He calls this “technological drift.” As we invent and deploy powerful technologies for specific purposes, Winner adds, they create ripple effects that radiate unpredictably out into the culture. These influences generate a variety of unintended consequences, many of them virtually impossible to control.
In an earlier Cyborgology essay, PJ Rey pointed out that as citizens of a technological society we go about our daily business placing a significant degree of faith in the technological devices and systems we use. Faith is necessary because most of us don’t have the slightest idea how those devices and systems actually work, and we certainly wouldn’t know how to repair them if they failed. We trade certainty for convenience, Rey said. In the process we also surrender a substantial measure of control over those devices and systems.
The historian of technology Thomas P. Hughes has pointed out that our deepening commitment to existing systems is psychological as well as practical, and that it applies as much to the people who make technological systems as to the people who use them, if not more so. “Technological momentum” is the term he coined to describe this tendency to habituation. It’s a tendency that high-tech companies like Google try desperately to avoid, regularly pronouncing their determination to retain the flexibility of start-ups. The regularity with which those promises are made suggests the tenacity of the problem.
Two other dynamics that contribute substantially to technological autonomy should be mentioned. Technological convergence describes the merger of previously disparate technologies into new combinations. Technological diffusion describes the spread of existing technologies into novel, often unanticipated applications. The power of technological convergence can be seen in the joining of computers with everything from television and telephones to surgery and genetics. Technological diffusion can be seen in the spread of assembly line techniques from the manufacture of automobiles (inspired in part by the disassembly of animals in meat packing plants) to the manufacture of hamburgers. Franchising represents the extension of the assembly line concept to the manufacture of business empires.
Individually each of the dynamics I’ve named here would be difficult to restrain. Collectively they constitute a forward motion that is irreversible. I call the consequence of this collectivity “de facto technological autonomy.” By that I mean that although we can theoretically detach ourselves from the technological systems on which we’ve come to depend, in practice such a detachment is impossible because it would create unsupportable levels of disruption.
Japan’s response to the post-tsunami nuclear disasters last March is an example. According to The New Yorker magazine, anti-nuclear activists there were optimistic that the widespread plant shutdowns after the crisis would become permanent, but their optimism proved premature. It’s now assumed that most of the country’s reactors will reopen. To abandon nuclear power would be, in the words of Japan’s economics minister, “idealistic but very unrealistic.”
Global warming and other crises have caused many scientists and policy wonks to conclude that the only escape from the destructive effects of technological autonomy is more technology. Geoengineering envisions the application of techniques that seem borrowed from science fiction, among them fertilizing the ocean to boost the growth of CO2-absorbing phytoplankton and building artificial volcanoes that would fill the atmosphere with clouds of heat-blocking particles.
One looks for hope where one can find it, but the problem here is obvious: Even if they did work for the purposes intended, nobody knows what the unintended results of such radical measures might be. Technological autonomy is a process that proceeds without regard to original intention.
Comments
Helge Peters — January 8, 2012
Interesting read.
Somewhere in the book Winner states that the question of autonomous technology is ultimately the question of human autonomy held up to a different light.
In this context, it would be interesting to think about technology and climate change using Bruno Latour's sociology of objects, where neither humans nor technology possess autonomy but instead form hybrid actors. The question of climate change would then change from 'how do we modify our technology to this or that end' to 'how do we negotiate our relationships with natural objects and technical artefacts in order to cohabit this world'.
In order to facilitate this Latour proposes a 'parliament of things' where both persons and things would be represented. It temporarily convened during the exhibition 'Making Things Public' at Karlsruhe, Germany. You can watch a documentary about it here: http://www.youtube.com/watch?v=-5s264AjaXI
Doug Hill — January 8, 2012
Dear Helge Peters,
Thanks for your interesting comments. Regarding technological autonomy as human autonomy in a different light, I know Winner and other philosophers who endorse degrees of technological autonomy (including Jacques Ellul, a major influence on Winner, and Heidegger) would agree with that in principle. They would also insist, however, that technology can in fact inhibit human autonomy, for the various reasons I review. That humans theoretically can always take or resume control of their technologies is why I say we are in a condition of *de facto* technological autonomy.
I regret to say I am not very familiar with the thought of Bruno Latour (one of many gaps in my knowledge) but I'm suspicious of the idea of "hybrid actors" (a suspicion I realize will not be shared by many of those who read and contribute to this blog). Specifically I think it's important to consider humans/nature as separate orders of being from machines/technology. Such a separation preserves human dignity and avoids constructions that lump humans/nature together with machines/technology in a "parliament of things."
DH