
Murmuration (a month-long festival of drone culture) is in full swing, and while I still plan to write a retrospective post once it’s all done, common themes are already emerging across a number of the pieces. And like all powerful themes, they transcend the pieces themselves, speaking to wider technological and social issues, revealing what already exists while pointing the conversation forward. I think one of these themes in particular deserves attention, not only for what it says about drones but for what it says about war in general.

Both Olivia Rosane and Nathan Jurgenson – as well as many other people, in the festival and out of it – have observed that one of the primary features of much of our drone fiction is the removal of the human element, both the human operator and the human casualties (Olivia also makes the extremely important point that drone fiction can and should tear down this project). Our drone fiction denies the presence of human operators; it renders drones autonomous. The consequences of this are significant and significantly troubling.

When a drone is autonomous, there is no one to blame, no one to feel guilt, no culpability on our own part. The killing of civilians or the surveillance of citizens can be explained away as the act of an ineffable drone god. Even the power of the state behind the drone is erased, or at least subtly minimized – although, as Asher Kohn points out, a drone is also a perfect citizen, a perfect subject of state and corporate power. But for the rest of us, a drone is simply there, its power nebulous and yet intensely present, and without a human in the frame no one can be held responsible for what it does. As Nathan writes:

My worry is that the agency and humanization many grant the drone deflects the intentionality, and thus responsibility, away from those controlling it. Caught in the fascinating ways the drone is “autonomous”, we spend far too few words on the overwhelming degree to which the drone is no more autonomous than previous tools of surveillance and/or destruction.

So drones aren’t autonomous. The stories we tell about drones need to reflect that; the best stories are true stories, even when they’re fictional, and our drone fiction needs to tell the truth. But what we also shouldn’t forget is that just because drones – the actual vehicles themselves, and in particular here I’m talking about vehicles designed for use in combat – aren’t autonomous yet doesn’t mean they won’t be.

And in fact this is the true, terrifying goal of the drone, the reason it exists at all: the removal of any obvious humanity from the equation.

This isn’t science fiction. The Global Hawk surveillance UAV already operates with almost no human operator control except during takeoff and landing, and work on more fully autonomous drones continues.

The thing about this is that, as Nathan says above, none of it is new, at least not in aim. This has always been what we do when we wage war. One of the identifying features of “total war” is industrialization – the degree of technological sophistication brought to bear – and the goal of technology in war is almost always to kill more of the enemy while leaving more of one’s own forces intact.

The goal of technology in war is, in other words, to make killing more efficient. And humans are profoundly inefficient. It’s also extremely difficult – despite appearances – to get people to kill each other. Studies of soldiers in combat have revealed that a significant number of them intentionally fire over enemy heads rather than shooting to kill. But technological warfare as it’s practiced now tends to distance the killers from the killed (with notable exceptions). With the advent of aerial bombardment that distance became literal. When mass death is reduced to numbers – to physics and casualty counts – it becomes abstract, its various components as nebulous and therefore as blameless as a machine without a mind. Technological warfare obscures – it hides and removes as much as it purports to clarify.

Combat drones are the logical next step in this project. Our common fiction about them erases the people at both ends – the pilots and the casualties. But our fiction is grounded in something; it always is, and that’s what makes it so powerful. In this case we should be especially careful to understand exactly what that ground consists of.

On some level we want to remove human agency from drones, yes. That’s why we talk about what we talk about when we talk about drones, why we tell the stories we do. But the thing about fiction is that the line between it and fact is – as all lines are – extremely porous. We should watch what moves.

Sarah reveals and obscures in equal measure on Twitter – @dynamicsymmetry