There has been a lot of talk about magic lately in critical, cultural and technological spaces: what it does, who it is for, and who gets to control or enact it. As a way of unpacking a few elements of this thinking, this essay follows on from the conversations that Tobias Revell and I, along with a whole host of great participants, had at Haunted Machines, a conference held as part of FutureEverything 2015 that examined the proliferation of magical narratives in technology. With our speakers we discussed where these stories and mythologies reveal our anxieties around technology, who is casting the spells, and where – if possible – these narratives can be applied in useful ways.
As an ex-literature student, I’m quite interested in ghost stories as analogy, because they can reveal, or at least offer a way of exploring, these anxieties: where the voices in the static are coming from, where the pipes are creaking, and what they tell us about what our technology is doing or can potentially do to us.
I’m going to use a load of slightly ham-fisted contemporary narratives to signpost the anxieties that come out of two personal and increasingly algorithmically mediated spaces: the social network and the home. Where does the role of narrative in magic, the supernatural, and the unknown allow us to get a better grasp of technology’s power over us? Where are the uncertain terrains of our technologies creating the capacity for hauntings, and where can techniques used to imagine future scenarios better equip us for the ghosts to come? When we think of a haunting, we think of unseen forces acting upon our domestic space, and when considering technology, a reappropriation of Clarke’s third law that Tobias Revell summoned with his work on Haunted Machines applies: any sufficiently advanced hacking is indistinguishable from a haunting. But where else are we haunted?
During the Victorian era, the belief in and practice of spiritualism reached a peak as a result of anxieties and fears about the technological advancements of the industrial revolution, where machines had taken over control previously held by humans. In spiritualist circles, the great beyond was supposedly contacted, and the practice therefore became ripe for fraud and the abuse of our most sensitive worries about the afterlife. One of the most famous mediums of the time was the Boston medium Mina ‘Margery’ Crandon who, through the use of sleight of hand and small, crudely technical devices, conned hundreds into thinking she was channelling the dead. Crandon’s trickery was eventually debunked by Harry Houdini as part of his great spiritualist hunt – in which he challenged spiritualists to prove they were contacting the dead – subjecting her tactics to sophisticated and rigorous testing. People watching believed (or wanted to believe) in her spiritual gifts, not the technological techniques she used, but once those techniques were revealed, the magic disappeared.
During this period, writers like Edgar Allan Poe published stories such as The Tell-Tale Heart, which explored these supernatural worries of the unseen and the unknown, allegories for the rapidly changing world around them. These narratives conjured up situations in which the potentially spiritually dangerous could be investigated and found out, an imagined resolution to that which they could not understand.
Fast forward a hundred years to the 1950s, when technology was marketed to us as an accelerated entry into a world of science fiction. Now jump briefly to the 1980s and 90s, when technology shifted into the realm of magic.
In one advertisement by Honeywell, a spectral presence escapes from the machines. Emails, you say? How do they work! Solution: Magic! Like the Sorcerer’s Apprentice, the hard things you don’t want to deal with, like emails, are passed off to magical processes.
Motifs of individual empowerment-through-magical-computers can still be found in Apple’s sloganeering. Their phrase ‘It just works’ angers me the most. “You don’t need to worry how it works,” Apple tells you, “just that it does.” The implication is that you don’t need to know what you’re agreeing to when you allow this device or software to work around you. You are positioned as the magician’s assistant, or rather, you’re not even that. You’re the nervous audience member dragged on stage to make the magician look better, or clever, or supernatural. The advertisement for their most recent public campaign, ‘You’re more powerful than you think’, is nothing but an insidious obfuscation technique, making you think that you are doing the magic when in fact you’re a component, an ingredient, in a much more complex set of darker magical happenings. When you choose to be part of the magic, accepting the terms and conditions of use, are you allowing yourself to be possessed?
Possession, in the magical sense, is the loss of control of your own personal faculties. In the case of algorithmically mediated culture, we can see where these problems of agency, control and intent – where our power is compromised – create the capacity for ghosts to take hold of us. That companies push this narrative forward as a source of empowerment, a way of making things easier, goes some way to explaining why pulling apart these technologies is so difficult for us.
I am by no means the only person to experience algorithmic gaslighting. After being the victim of rape, student Rehtaeh Parsons was bullied for months both online and off, and eventually took her own life. Her image was circulated across media channels, blogs, and social media sites, where it was eventually collected into the databanks of an image-scraping algorithm used by a third-party service. Months later, her photo appeared in a Canadian dating advert on Facebook. Her family, friends, and others who knew her story were rightly horrified. At first they blamed the algorithm, a natural reaction when you’re not quite sure how such an error happened. When things like this occur, we imagine there is something, or someone, to prevent this behaviour; we don’t anticipate that the decision was governed by an algorithm blindly gathering images of women from a specific demographic.
Algorithms do not know the context of a photograph. They don’t understand, or anticipate, the social consequences of their own function. The algorithm is blameless; it is we, its creators, who are essentially at fault, through our faulty application of it. This consistent failure to understand the wider systems at work means that the algorithm will continue to act against us unknowingly (as well as with us). It is becoming the benign ghost on the network, walking about without even knowing we are there, or of our fear of it.
Perhaps it was only a matter of time before movie studios tried to one-up social media companies in producing disturbing, digitally mediated moments. Unfriended, recently released in the US, is a horror film in which a dead woman supposedly ‘haunts’ her friends through various social media channels, which isn’t that different from how death actually operates in social media spaces. Algorithms have a way of animating the long-past choices of the recently departed.
In traditional ghost stories, or in people’s accounts of more conventional hauntings, we hear of spectres that follow you around and knock over the things in your house. But disrupting the environments we create online gives the digital poltergeist a brand new range of possibilities for emotional violence.
The problem here is that you don’t really see the ghost until you notice it, and once you notice it you don’t know what to do with it. You become powerless. What would an exorcist look like in this context?
Leaving the digital to look at a physical personal space: the retrofitted smart kitchen, once lovingly owned and upgraded but now vacated, becomes an algorithmically populated tomb of needs, wants and aspirations. Some paranormal investigators – those who use often highly technological instruments to investigate the existence of the spiritual – believe that the house is a stone tape. Proposed by Thomas Charles Lethbridge, the stone tape theory states that life can be stored in rock, brick, and other materials, and ‘replayed’ under certain conditions, such as emotional triggers or historic anniversaries and events. Of course, the idea of stone recording memory is under scrutiny, but if there are systems, software and cables – from Nest thermostats to instrumented materials – inside the ‘stone’ that are recording your movements, your shopping and temperature preferences, always listening and acting upon your day-to-day life, then this is no longer a mythology. While you’re living in this connected home, you are inviting these houseguests in.
Sometimes digital poltergeists enrol the living to enact their violence. We’ve all experienced this a little when a bank automatically freezes our account while we’re travelling because a security system has decided that this is irregular, and therefore ‘suspicious’, behaviour for a human to have. Then there are the more extreme cases: the systems that assemble your search history into a potential terrorist threat. The story of Michele Catalano and her husband, who became the targets of a visit by federal officials based on nothing more than their Google history, seems disturbingly close as you learn about the case. Search terms including ‘backpack’ and ‘pressure cookers’, together with news articles their son had looked up out of curiosity, had supposedly identified them as a target of counter-terrorism. In this case, the NSA denied using the data and search history of ‘average’ citizens. Must have been the ghosts.
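To see how little intelligence such a flag needs in order to misfire, consider a toy sketch – entirely invented for illustration, and nothing like any agency’s actual system – in which a handful of keywords co-occurring in a household’s search history is enough to mark it as a threat:

```python
# Toy keyword co-occurrence flagger. The terms, threshold and data
# below are invented for illustration; no real system is described.
SUSPICIOUS_TERMS = {"backpack", "pressure cooker"}

def flag_history(search_history, threshold=2):
    """Flag a household when enough 'suspicious' terms co-occur in its
    search history -- with no notion of context, curiosity or intent."""
    hits = {
        term
        for term in SUSPICIOUS_TERMS
        if any(term in query.lower() for query in search_history)
    }
    return len(hits) >= threshold

# A household's perfectly ordinary searches trip the flag:
household = ["best hiking backpack", "pressure cooker lentil recipes"]
print(flag_history(household))  # True -- a false positive
```

The falsely accused see only the knock at the door, never the two-line conditional that summoned it.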
And although you can hard-reset your kitchen, and delete your browser history, the profile you’ve built up could still exist in corporate clouds and government repositories, stacks of temporary files mounting up on each other like layers of sediment, each one a frozen profile.
Artist Wesley Goatley had firsthand experience of this phenomenon while programming his black boxes for Wireless Fidelity, an artwork that maps sounds to SSIDs in order to sonify a city’s wifi networks. On the last day, his boxes stopped working, because a file on the USB wifi adaptor had filled up with temporary data and metadata. Identifying the errant data, or even knowing where to look, requires specialist knowledge.
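To make the mechanism of such a sonification concrete, here is a minimal sketch – not Goatley’s actual code, which I haven’t seen – assuming a Linux machine with the iwlist tool and nothing beyond the Python standard library. Each nearby SSID is hashed to a stable pitch, so the same network always sounds the same note, and is rendered out as a short sine-tone WAV file:

```python
import hashlib
import math
import re
import struct
import subprocess
import wave

def scan_ssids(interface="wlan0"):
    """List nearby network names via iwlist (Linux; usually needs root)."""
    out = subprocess.run(
        ["iwlist", interface, "scan"],
        capture_output=True, text=True, check=True,
    ).stdout
    return re.findall(r'ESSID:"([^"]*)"', out)

def ssid_to_frequency(ssid, low=220.0, high=880.0):
    """Hash an SSID to a stable pitch between low and high Hz."""
    digest = int(hashlib.sha256(ssid.encode()).hexdigest(), 16)
    return low + (digest % 1000) / 1000.0 * (high - low)

def write_tone(ssid, seconds=1.0, rate=44100):
    """Render one SSID as a sine tone in a WAV file, named by a hash
    so odd characters in the SSID can't break the file path."""
    freq = ssid_to_frequency(ssid)
    frames = b"".join(
        struct.pack("<h", int(32767 * 0.5 * math.sin(2 * math.pi * freq * t / rate)))
        for t in range(int(seconds * rate))
    )
    with wave.open(hashlib.md5(ssid.encode()).hexdigest() + ".wav", "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(rate)
        f.writeframes(frames)

if __name__ == "__main__":
    for ssid in scan_ssids():
        write_tone(ssid)
```

Even in a transparent toy like this, the failure mode Goatley hit – a cache file silently filling a disk – lives somewhere below anything the box’s audible behaviour exposes.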
Such opacity suggests that algorithmically mediated systems, even when they need a human’s help, will often obscure their inner workings. It is already hard enough to find a leaky pipe in a wall. Imagine how difficult it will be to find, patch, and clean up a leaky datastream from your fridge. And just as a water leak can lead to black mould hiding under the plasterboard, so too could leaky data lead to dangerous but hard-to-detect consequences.
Thinking about the automated house as a haunted house can help us reassess imagined threats. We think it’s the hackers we have to worry about, rather than the companies and organisations that are using algorithms, building backdoors into our technology, and enabling biased, prejudiced modes of search in our homes. The NSA and GCHQ position themselves as friendly ghosts, house guardians watching out for us, but we know that this supposed benevolence isn’t benevolent at all.
It’s not too much of a stretch to imagine landlords and building managers gaslighting you, or giving you up. We may soon be faced with a Faustian bargain that gives us a shiny smart kitchen, but at the cost of home monitoring equipment: sensors that know when you smoke in the house or sublet a room on Airbnb.
So what now?
In many ghost narratives the ghost goes away, either exorcised or given closure (as in Ghost, where Patrick Swayze disappears beyond the veil), but what if you can’t do that? We are realising that algorithmic sorting, mediating and filtering, and their impacts, aren’t going to go away, because our contemporary networks are built on them. There are things we can do to obstruct and confuse these systems – at the cost of our ability to benefit from them – by flooding our personal profiles with false and excess data to throw them off the scent, as sketched below. But how do we understand the longevity, and the potential and plausible futures, of algorithmic mediation?
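Tools like TrackMeNot already automate this kind of noise generation for search profiles. A toy sketch of the idea – with an invented decoy vocabulary and a placeholder endpoint standing in for a real search engine – might look like this:

```python
import random
import time
import urllib.parse
import urllib.request

# Invented decoy vocabulary: innocuous, high-volume terms meant to
# dilute whatever signal a profiler extracts from your real searches.
DECOY_TERMS = [
    "weather tomorrow", "pasta recipes", "film times",
    "train timetable", "houseplant care", "football scores",
]

def send_decoy_query(term):
    """Issue a throwaway search so it lands in the same logs as your
    genuine queries. The endpoint is a placeholder, not a real engine."""
    url = "https://example.com/search?" + urllib.parse.urlencode({"q": term})
    try:
        urllib.request.urlopen(url, timeout=5).read()
    except OSError:
        pass  # noise generation is best-effort; failures don't matter

if __name__ == "__main__":
    # A small batch for illustration; a real tool would run indefinitely.
    for _ in range(10):
        send_decoy_query(random.choice(DECOY_TERMS))
        # Randomised pauses so the decoys don't form their own
        # machine-like signature in the logs.
        time.sleep(random.uniform(30, 300))
```

The cost is exactly the one named above: a profile too noisy for the ghosts to read is also too noisy to serve you.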
There are a few examples in contemporary popular narratives of this acceptance and enactment of control over systems we can’t banish. At the end of the recent Australian film The Babadook [EDITORS’ NOTE: Spoilers! Safe reading continues in the next paragraph.], the ghost doesn’t go away; instead it is kept, with caution, in the basement, where it is fed regularly and strictly, consistently understood as a potentially negative influence by keepers who know exactly how much rope to give it. The owners of the house have learned how to control it, because they understand it.
So is there a place for invoking hauntings, in order to understand your role in creating ghosts? Because, as Derrida reflects (and you can’t discuss hauntings in contemporary culture without dropping him in there somewhere), ‘The ghost remains that which gives one the most to think about – and to do.’
We can abstract these terms ‘ghost’ and ‘haunting’ to also mean ‘there were things before you, and things after you that will haunt others’. Technological ghosts are a constant reminder of the systems that came before, and particularly of those ghost futures we have cancelled and written over as their visions became outdated: a series of never-will-be moments. A better understanding of our capacity for haunting may in turn allow us to better imagine the technological futures we want.
Is there a case for how the narratives of haunting, and the ghosts that can potentially emerge, can be helpful? I’m pondering a form of code-based foresight – a means of exploring possible near-future worlds and scenarios – that allows cross-disciplinary conversations in engineering to happen, creating narrative near futures to learn where our technology could end up, and to invoke the ghosts that appear as we lose control. Speculative scenarios and foresight are not new, but are there specific methods that give engineers space to explore the impact of their work in non-technical, sociological ways? Computational systems are not neutral, as they are often presumed to be; they have political, cultural and social biases written into their selection and application. It’s not just about making the products that will use these algorithms, but about exploring the algorithms themselves as they are appropriated, applied, reused, misused. It is about exploring where our ghosts will exist, and how, and who, they will haunt.
Natalie Kane is a curator, writer and researcher working across culture, futures and technology. She works at FutureEverything in Manchester, futures research lab Changeist, and together with Tobias Revell is co-organiser of Haunted Machines, a project interrogating narratives of magic and technology.