Most of us here at Cyborgology have written at least one post about augmented warfare and revolution. I suggested that the panopticon has moved to the clouds, and PJ warns that we may soon see it descend into a fog. In the wake of the Arab Spring, we have all commented on what it means to have an augmented revolution (also here, here, and here). The Department of Defense is well aware of this global trend, and is dumping lots of money into understanding how to maintain what I will call online superiority. Just as nations fight for ground, air, and sea superiority in a given conflict, they must now maintain a presence in online meeting spaces. Surveillance and intelligence efforts have always been a part of warfare, and monitoring and disrupting information flows has always conferred a tactical advantage. But while previous engagements in informational warfare were about the exchange of information itself, what we see now are efforts to gain online superiority in order to directly disrupt physical, financial, or tactical resources.

A good example is Stuxnet, the malware jointly developed by Israeli and American forces to disrupt the Iranian nuclear program. By sabotaging a relatively mundane piece of hardware, they were able to severely damage Iran’s uranium-enrichment capacity. But the United States is aware that other entities are working on similar weapons, and it recognizes the increasing importance of online meeting spaces as the places where this sort of action originates. Outside threats might include China and Russia hacking into infrastructure control networks. Perceived domestic threats include the various Anonymous hacker collectivities such as LulzSec or AntiSec. And of course, there’s always Wikileaks.

DARPA has released a solicitation to companies that can help the government with the “four basic goals” in the area of Social Media in Strategic Communication (a toy sketch of what one of these might look like in practice follows the list):

  1. Linguistic cues, patterns of information flow, topic trend analysis, narrative structure analysis, sentiment detection and opinion mining;
  2. Meme tracking across communities, graph analytics/probabilistic reasoning, pattern detection, cultural narratives;
  3. Inducing identities, modeling emergent communities, trust analytics, network dynamics modeling;
  4. Automated content generation, bots in social media, crowd sourcing.
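
To make the second of those goals a bit more concrete, here is a toy sketch of what “meme tracking” might look like at its most basic: count short phrases in a stream of timestamped posts and flag the ones that keep reappearing. The posts below are invented and DARPA’s solicitation specifies no implementation; a real system would involve far more sophisticated language models, graph analytics, and scale.

```python
# Toy "meme tracking": count word bigrams in a (hypothetical) stream of
# timestamped posts and report the phrases that persist across hours.
from collections import Counter, defaultdict

# (hour, text) pairs standing in for a public feed -- invented data
posts = [
    (1, "march on the square tonight"),
    (1, "weather is nice today"),
    (2, "everyone march on the square"),
    (2, "march on the square at 9pm"),
    (3, "square march is happening now"),
    (3, "march on the square bring water"),
]

def bigrams(text):
    """Split a post into adjacent word pairs."""
    words = text.lower().split()
    return zip(words, words[1:])

# Tally each phrase separately for each hour
per_hour = defaultdict(Counter)
for hour, text in posts:
    per_hour[hour].update(bigrams(text))

# A phrase that shows up hour after hour is a crude signal of a spreading meme
persistent = set(per_hour[1]) & set(per_hour[2]) & set(per_hour[3])
print(sorted(persistent))  # [('march', 'on'), ('on', 'the'), ('the', 'square')]
```

Even this crude version hints at why the solicitation pairs meme tracking with “graph analytics” and “network dynamics”: knowing that a phrase is spreading is far less useful than knowing who is spreading it to whom.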

Just as automated drones are providing total battlefield awareness, this solicitation betrays a desire for high-level surveillance of large groups of people. The curation of information is, once again, extremely important. As David Streitfeld noted in the New York Times’ Bits Blog, this technology “would be, at its most basic level, an Internet meme tracker.” In the civilian world, we have sites like Klout and Hunch that crawl your various social media profiles and determine your network influence or what kind of wallet you prefer. DARPA is much more interested in what kind of political movement you influence, and what sorts of foreign governments you prefer.
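
The Klout comparison can be caricatured in code as well. Klout’s actual scoring algorithm is proprietary, so the sketch below is only my guess at the general shape of such a metric: given a made-up who-follows-whom graph, an account is scored by how influential its followers are, in the style of PageRank.

```python
# A caricature of a network-influence score over an invented follower graph.
# Nothing here reflects Klout's (proprietary) method or DARPA's plans; it only
# illustrates the idea of ranking accounts by the influence of their followers.
followers = {                     # account -> the accounts that follow it
    "organizer": {"alice", "bob", "carol"},
    "alice": {"bob"},
    "bob": set(),
    "carol": {"alice", "bob"},
}

# Reverse map: who each account follows
follows = {name: set() for name in followers}
for account, fans in followers.items():
    for fan in fans:
        follows[fan].add(account)

# PageRank-style power iteration: each account's score is fed by its followers,
# with each follower splitting its own score across everyone it follows.
scores = {name: 1.0 for name in followers}
for _ in range(20):
    scores = {
        account: 0.15 + 0.85 * sum(
            scores[fan] / max(len(follows[fan]), 1) for fan in followers[account]
        )
        for account in followers
    }

print(sorted(scores.items(), key=lambda item: -item[1]))
```

Replace “follows” with something like “retweets a protest hashtag” and the same arithmetic starts to look much more like what the solicitation describes.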

Online superiority means the panopticon extends into our online lives as well. The Foucauldian panopticon works in three stages: 1) an individual knows they are subject to surveillance, but 2) the act of surveillance is not observable by the surveilled, so 3) the individual internalizes the surveillance, acting as if they are constantly being watched. We are already encouraged to act this way as a reaction to the moral panic over online privacy. We are told to worry about online predators and potential employers, but after 2015, when DARPA hopes to complete this project, we’ll have to worry about our government’s opinions of what constitutes appropriate online behavior.