The online magazine Slate recently ran an essay that asked the question, “Why Do We Love To Call New Technologies ‘Creepy’?” The article was written by Evan Selinger, an associate professor of philosophy at Rochester Institute of Technology. My initial reaction to that essay, posted on my blog, was critical, but Selinger suggested in a tweet that I’d missed his point, which he said was “‘creepy’ discourse + normative analysis.” He’s right, I’m sure, that I missed his point – to be honest I don’t know what “normative analysis” is. So, with apologies to Selinger, I’ve reworked the essay to ask, simply: What is it about some technologies that makes us feel creepy?
There’s an obvious correlation between creepiness and novelty. It’s not unusual to be suspicious of strangers, especially when they have the potential to effect some degree of change in our habitual sense of the world. With technologies as with people, a measure of trust has to evolve.
Selinger’s essay mentions that early railroad passengers sometimes developed a variety of symptoms that physicians came to recognize as manifestations of “train sickness.” He suggests these maladies were a reaction to the creepiness of unfamiliarity, a form of “mania” that simply disappeared with time. Without going into detail (or normative analysis), it’s worth noting that the experience of early train travel was considerably rougher and more dangerous than it would become as technologies of comfort and safety evolved.
Still, I don’t doubt that (to borrow Robert Hughes’ phrase) the “shock of the new” had something to do with passengers’ uneasiness. As a given technology weaves its way into our lives the creepiness factor usually fades, as does its “specialness” factor. We become acclimated to its presence, and then dependent on it. The miraculous and frightening become routine. I say the creepiness factor “usually” fades because it doesn’t always. Plenty of people still find flying on airplanes creepy, for example. I’m one of them.
There are two less obvious issues that help explain the creepiness we often feel in response to technology. One is that we’re intuitively aware that, in terms of brute strength, technological power outstrips human power. You don’t have to be a religious fundamentalist to see that technology is about achieving a degree of mastery over nature, other human beings, and ultimately death, realms once believed to be the exclusive purview of God. But we’re also aware that technological power cuts both ways, and thus is a source not only of security but also of fear.
Immediately after reading Selinger’s essay I happened to catch on TV a showing of the old Katharine Hepburn/Spencer Tracy movie, Desk Set. The plot, for those who don’t know it, revolves around a group of women who staff the research department of a major television network. Much of their working day is spent answering questions from the public – someone wants to know who had the highest career batting average in the history of baseball, someone else asks for the names of the reindeer in “The Night Before Christmas.” The women handle these calls with an impressive combination of dedication, good humor, and smarts.
The conflict in the picture is supplied by the installation in their department of a new, room-sized computer named EMERAC (a variation on the names of the early computers UNIVAC and ENIAC), which the researchers assume has been brought in to replace them. It turns out (spoiler alert here) that the movie isn’t really about the threat of automation – that’s just an excuse for a romantic comedy that revolves around Hepburn’s researcher falling in love with Tracy’s efficiency expert/computer engineer. It’s a formula that requires a happy ending, and indeed, in the end we learn that the researchers aren’t fired; EMERAC is only there to help them.
That the plot goes in this direction perhaps explains why there’s a note in the film’s opening credits thanking IBM for its assistance in the production. In any event, by the end, Hepburn’s character (who’s named, surely not by coincidence, Miss Watson) is cheerfully learning to use the computer and cooing affectionately as it spits out answers to questions. Human and machine learn to live in mutually supportive collaboration; mistrust and fear give way to admiration and gratitude; the researcher accepts the engineer’s proposal of marriage.
This seems a bit disingenuous. In truth, there’s good reason for the researchers of Desk Set to fear the arrival of EMERAC. Countless workers, from the onset of the Industrial Revolution to today, have been replaced by machines. Technology has the power to make us obsolete, and we know it. That’s creepy.
The second fundamental issue that the creepiness question raises is an existential one. It involves the alienation that exists between two separate orders of being: the organic and the mechanical. That’s what the uncanny valley is about, I think. We instinctively recognize that a machine is trying to sneak across that boundary, and it puts us on our guard. A similar discomfort may be at the root of the creepiness some people feel about flying: there’s just something unnatural about it.
Desk Set gets lots of mileage out of this tension. The computer and the female technician who’s brought in to attend it are both portrayed as cold, relentless intruders into a human community. Stanley Kubrick played brilliantly on that tension, too, defining the character of HAL in 2001 with two radically incongruous features: a heartless, staring eye and a voice that oozed creamy sincerity.
I realize not everyone agrees that a firm line exists between human beings and technology. There’s no reason, many believe (including many here at Cyborgology!), that we can’t share ontological space with one another. Certainly the transhumanist point of view is that nothing could be more natural than humans merging with their machines. “[I]t is our special character, as human beings, to be forever driven to create, co-opt, annex, and exploit nonbiological props and scaffoldings,” writes Andy Clark, author of Natural Born Cyborgs. “…Tools-R-Us, and always have been.”
I don’t buy it, or, more accurately, I don’t buy the implication that such adaptations are necessarily desirable. Human/machine intimacy is as likely to produce mutation as it is enhancement, in my opinion. This is a view that draws me to the work of artists like David Cronenberg and Philip K. Dick and philosophers like Jacques Ellul and Herbert Marcuse.
One of the more eloquent expositions of this perspective came from the theologian Paul Tillich. Like other existentialists, Tillich believed that uneasiness is endemic to the human condition. It’s weird being aware that we exist and weird knowing that we’re going to die. Our predicament leaves us with persistent feelings of, as Tillich put it, “uncanniness.”
We’ve come up with lots of ways to avoid those feelings, and technology is high on the list. On one level we find technology reassuring because we think we can control it. Even though we may not understand how it works, we believe it behaves by rational, logical, “calculable” rules. We can surround ourselves with it, cloak ourselves in it, and feel secure. Tillich cites the home as an example. Its “coziness,” he wrote, holds “the uncanniness of infinite space” at bay. What the house or apartment offers individuals, the city offers humans en masse.
Like so many palliatives, however, technology can turn on us. It may not be as safely in control as we’d hoped. The potential for unease grows as our technologies become more powerful, more complex, and more self-determined. On some level we’re aware that the relentless logic they’re following is their own. We know they’re not truly alive, but they seem to be. We wonder whose agenda is being followed. Creepiness ensues.
“As the technical structures develop an independent existence,” Tillich wrote, “a new element of uncanniness emerges in the midst of what is most well known. And this uncanny shadow of technology will grow to the same extent that the whole earth becomes the ‘technical city’ and the ‘technical house.’”
Tillich ends this passage with a pertinent, and creepy, question: “Who can still control it?”
This post is also available on Doug Hill’s personal blog: The Question Concerning Technology.