Three articles came out this week that help me develop my concept of droning as a general type of surveillance that differs in important ways from the more traditional concept of “the gaze” or, more academically, “panopticism.” There’s Molly Crabapple’s post on Rhizome, the NYTimes article about consumer surveillance, and my colleague Gordon Hull’s post about the recent NSA legal rulings over on NewAPPS. Thinking with and through these three articles helps me clarify a few things about the difference between droning and gazing: (1) droning is more like visualization than like “the gaze”–that is, droning “watches” patterns and relationships among individual “gazes,” patterns that are emergent properties of algorithmic number-crunching; and (2) though the metaphor of “the gaze” works because the micro- and macro-levels are parallel/homologous, droning exists only at the macro-level; individual people can run droning processes, but only if they’re plugged into crowds (data streams or data sets aggregating multiple micro- or individual perspectives).
Google Glass is a great illustration of the way droning layers itself on top of the infrastructure (both technological and cultural, like behaviors and norms for social interaction) of the gaze. Google Glass takes the gaze–individual sight–and uses it as the medium for data generation and collection. It superimposes droning onto the panoptic visual gaze; in this way droning is “super-panoptic” (to use Jasbir Puar’s term).
Crabapple’s article makes this superimposition really clear. On the one hand, Google Glass broadcasts her individual gaze: “Google Glass lets the government see the world from my perspective. With Glass Gaze, I was giving the network the same opportunity.” But, on the other hand, “the network quantifies eyeballs. It can’t see what’s behind the eyes.” Her individual gaze is broadcast, but it’s just the medium for another type of cultural and economic production, in the same way that paint (pigment + emulsifier) is a medium for the production of a painting. The message is the medium, in a way–a medium for the production of datasets, which are then visualized (that is, algorithmically processed). We can’t see the message until the algorithms visualize it for us. We rely on drones–here, algorithms–to see in the first place. And those drones need data: “INPUT! More input!”
Over on NewAPPS, my UNC Charlotte colleague Gordon Hull has been writing about the recent court rulings on NSA surveillance programs. He argues that “data” is different from “information”: information is meaningful (it has semantic content) in and of itself, whereas “data” is meaningful only in relation to other data; and we can only see that relationship when we process the data through algorithms designed to pick these relationships out of the enormous haystacks of data we’re constantly collecting. As he explains:
This is the data/information distinction at work: the data by itself (or in a vacuum) is meaningless – and may even be meaningless forever – but you cannot even know whether it will rise to the level of information until after you run the analytics (hence my claim that privacy arrives too late). In this, I think, big data is charting new territory, insofar as older kinds of surveillance did not extensively collect material that was not obviously meaningful in some way or another.
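Hull’s data/information distinction can be made concrete with a toy sketch (the names and records below are entirely hypothetical, and the “analytics” is deliberately trivial): each record is meaningless in isolation, and nothing “rises to the level of information” until an analytic pass runs over the whole collection.

```python
from collections import Counter
from itertools import combinations

# Hypothetical "data": individual (person, place) sightings.
# No single record carries semantic content on its own.
sightings = [
    ("A", "cafe"), ("B", "cafe"),
    ("A", "park"), ("B", "park"),
    ("C", "gym"),
    ("A", "station"), ("B", "station"),
]

def run_analytics(records, threshold=2):
    """The analytic pass: group people by place, then count how many
    distinct places each pair of people shares. A pair only 'rises to
    the level of information' once it crosses the threshold."""
    by_place = {}
    for person, place in records:
        by_place.setdefault(place, set()).add(person)
    shared = Counter()
    for people in by_place.values():
        for pair in combinations(sorted(people), 2):
            shared[pair] += 1
    return {pair: n for pair, n in shared.items() if n >= threshold}

# No single record reveals the A/B association; only the aggregate does.
print(run_analytics(sightings))  # {('A', 'B'): 3}
```

The point of the sketch is that the relationship between A and B exists only as an emergent property of the whole dataset–which is why, as Hull says, you cannot know in advance which data will ever become information.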
So this new kind of surveillance isn’t inserting itself into our lives by invading our privacy and listening to or stealing our information. Rather, it compels and rewards us for generating data. We have to feed the algorithms. That’s one way to read Crabapple’s claim that “In a networked world, we’re all sharecroppers for Google.” (And, somewhat as an aside: if labor is historically less protected than privacy, perhaps that’s one of the insidious effects of this type of surveillance? It both eviscerates privacy, as Hull argues, and turns surveillance into labor performed by the surveilled.)
As I read Hull, he’s arguing that “data” is necessarily big–”all data needs to be available for collection, since we can never know what data is going to be meaningful.” So, not only does the government need to collect all the data that’s made, but it’s actively interested in getting us to produce all the data that could possibly exist.
II. “More eyes, different eyes”
One of the really interesting things Crabapple’s drawing performance does is perspectivally multiply the individual user’s “gaze.” She explains: “I was caught between focusing on the physical girl, the physical paper and the show that was being streamed through my eyes.” With three different perspectives before her eyes, Crabapple’s experience mimics, in scaled-down form, droning’s “perspectival” vision.
Droning, as a type of surveillance, isn’t the perspective of one individual viewer. It’s not the Renaissance vanishing-point perspective that you learn about in high school art class. That sort of perspective is oriented to the gaze of a single viewer. Droning is perspectival in Nietzsche’s sense of the term: it aggregates multiple single-viewer perspectives in a way that supposedly provides a more circumspect, more accurate account than any single perspective could. He argues,
we can use the difference in perspectives and affective interpretations for knowledge…There is only a perspectival seeing, only a perspectival ‘knowing’:…the more eyes, various eyes we are able to use for the same thing, the more complete will be the ‘concept’ of the thing, our ‘objectivity’ (Genealogy of Morals, Third Essay, section 12).
This goes back to what I said earlier, via Hull, that data has to be big—the most valuable knowledge isn’t “information”–the content of an individual perspective, what a gaze sees–but the “difference in perspectives,” the relationships among “more, various” eyes. For example, when I use Yelp or TripAdvisor or some other internet ratings site, I don’t trust any one reviewer over another–I’m looking for consistent patterns across reviews (e.g., did everyone describe similar problems?). In order for there to be patterns, there has to be a largish pool of data, enough reviews to establish a trend.
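The Yelp example can be sketched the same way (the reviews and complaint keywords here are hypothetical): no single review is trusted as authoritative; a complaint counts only once it recurs across a large enough share of the pool.

```python
from collections import Counter

# Hypothetical reviews -- each one is a single perspective, a single "eye".
reviews = [
    "great food but slow service",
    "loved the decor, service was slow",
    "slow service again, food fine",
    "friendly staff, tasty food",
    "service painfully slow",
]

def recurring_complaints(reviews, keywords=("slow", "dirty", "rude"),
                         min_share=0.5):
    """Flag complaint keywords appearing in at least min_share of all
    reviews -- a pattern across perspectives, not any one reviewer's claim."""
    counts = Counter()
    for text in reviews:
        words = set(text.split())
        for kw in keywords:
            if kw in words:
                counts[kw] += 1
    return [kw for kw in keywords if counts[kw] / len(reviews) >= min_share]

print(recurring_complaints(reviews))  # ['slow']
```

With only one or two reviews, no keyword clears the threshold; the trend is visible only from the aggregate–Nietzsche’s “more eyes, various eyes” in miniature.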
So, droning needs as many “eyes” as possible to generate data. Enter Tuesday’s NYTimes article by Quentin Hardy on the growing ubiquity of webcams, webcams attached, even, to tortoises. Tortoisecam, according to Hardy,
illustrates the increasing surveillance of nearly everything by private citizens…[T]he sheer amount of private material means an enormous amount of meaningful behavior, from the whimsical to the criminal, is being stored as never before, and then edited and broadcast.
Droning outsources regular old panoptic surveillance to private citizens, often to do in their leisure time and/or second-shift labor (e.g. nannycams). As a matter of droning, the issue with this widespread private surveillance isn’t privacy (to extend Hull’s argument), because droning isn’t collecting information–it isn’t interested in the camera’s gaze. Rather, droning is interested in the data created by the camera. It actively encourages the proliferation of private surveillance cams because it needs “more eyes, various eyes,” as many eyes as it can get…even from tortoises.
III. So Why Call It Droning?
At first glance, the kind of surveillance I call “droning” doesn’t seem to be closely related to autonomous aerial vehicles. However, I think “droning” is the right term for a few reasons:
First, it’s omnipresent. It’s a constant background, like a musical drone in, say, Indian classical music.
Second, it’s an autonomous process run by data-generating, data-saving, and above all number-crunching hardware and software. It doesn’t need to be operated by, or to correspond to, the gaze/perspective of an individual human being.
Third, as I discussed above, droning happens at the level of the swarm, the flock, the population. UAVs (unmanned aerial vehicles) are operated by teams, and they rely on both living and autonomous/machinic/digital team members. The vehicle is just the representative of its constituents, its team. Droning is not something one person does to another, or a tyranny of a “majority” over a minority; droning is of, by, and for ‘the people’.
Because droning isn’t in the camera’s gaze, droning works differently from the “male gaze,” at least as it is classically conceived in feminist film theory. Mulvey argues that the camera’s gaze is what sutures the male gaze, what makes it seem and feel authoritative. But droning isn’t in the camera’s gaze; it’s in the camera’s metadata. In my forthcoming book with Zer0 Books, I have a chapter that discusses the ways the male gaze has been reworked by contemporary media and by biopolitics.
Hardy’s Times article also quotes Evan Selinger, an associate professor of the philosophy of technology at the Rochester Institute of Technology: “Should the contractor like being seen all the time? What happens to the family unit? Sometimes the key to overcoming resentment is being able to forget things.” In this last sentence Selinger is referring to the opening sections of the Second Essay of Nietzsche’s Genealogy–the same text I cited above regarding perspectivism. There, Nietzsche argues that “bad conscience” or “ressentiment” is due, in part, to the inability to “forget” or “get over” things. (I know that’s a massive oversimplification of his argument, but you can read it for yourself in the above-linked copy of the text.) Maybe Selinger has been edited/quoted in a way that misrepresents his claim, but I don’t think big/ubiquitous surveillance fails to forget. Droning forgets–in fact, the data/information distinction is a great illustration of precisely the kind of Nietzschean “active forgetting” Selinger alludes to in his comment. There is an explicit choice to let data sit fallow as just data (not information); that choice is coded into the algorithm itself.
Is there any consensus on what a flock of drones is called? I asked about this on Twitter earlier this week, and Sarah Jeong suggested calling it “an unconstitutionality of drones,” which I may like even better than my suggestion of calling it a “murder of drones,” after a murder of crows.
Robin is on Twitter as @doctaj.
More eyes, different eyes: droning & Google... — January 11, 2014