Thiel - Girard

During the week of July 12, 2004, a group of scholars gathered at Stanford University, as one participant reported, “to discuss current affairs in a leisurely way with [Stanford emeritus professor] René Girard.” The proceedings were later published as the book Politics and Apocalypse. At first glance, the symposium resembled many others held at American universities in the early 2000s: the talks proceeded from the premise that “the events of Sept. 11, 2001 demand a reexamination of the foundations of modern politics.” The speakers enlisted various theoretical perspectives to facilitate that reexamination, with a focus on how the religious concept of apocalypse might illuminate the secular crisis of the post-9/11 world.

As one examines the list of participants, one name stands out: Peter Thiel, not, like the rest, a university professor, but (at the time) the President of Clarium Capital. In 2011, the New Yorker called Thiel “the world’s most successful technology investor”; he has also been described, admiringly, as a “philosopher-CEO.” More recently, Thiel has been at the center of a media firestorm for his role in bankrolling Hulk Hogan’s lawsuit against Gawker, which outed Thiel as gay in 2007 and whose journalists he has described as “terrorists.” He has also garnered some headlines for standing as a delegate for Donald Trump, whose strongman populism seems an odd fit for Thiel’s highbrow libertarianism; he recently reinforced his support for Trump with a speech at the Republican National Convention. Both episodes reflect Thiel’s longstanding conviction that Silicon Valley entrepreneurs should use their wealth to exercise power and reshape society. But to what ends? Thiel’s participation in the 2004 Stanford symposium offers some clues.

One of the most prominent theorists of the late 20th century, Michel Foucault, spent a career asking his history students to let go of the search for the beginning of an idea. “Origins” become hopelessly confused and muddled with time; they gain accretions that ultimately distort any pure search for the past on the terms of the past. His alternative was to focus instead on how these accretions distorted the continuity behind any idea. Nietzsche called this method “genealogy,” and Foucault’s essay expanded on its use. Dawn Shepherd captured the significance of this lesson in a beautiful, single sentence: “Before we had ‘netflix and chill ;)’ we just had ‘netflix and chill.’”

The temptation with something as recent as the web is to emphasize its radical newness. Genealogy asks that we resist this temptation and instead think carefully about the web’s continuity with structures far older than the web itself. Genealogy is not about the origins of “chill”; it emphasizes the continuity of “chill.” It must build from an idea of what “chilling” entailed in order to say something about what “chill” means now.

These continuities animated many of the conversations at Theorizing the Web 2016. Both the keynote panels and the regular sessions asked audiences to imagine the web as part of society, rather than outside of it. In the words of its founders, the original premise of the conference was “to understand the Web as part of this one reality, rather than as a virtual addition to the natural.”


As a rule, parents tend to worry about their children’s wellbeing. With all of the decisions that parents have to make, I imagine it’s near impossible not to worry that you are making the wrong ones, with consequences that may not reveal themselves for years to come. I’m that way with my dogs, and I feel confident the anxiety is more pressing with tiny human people. This is why recommendations from professional organizations are so important. They offer the comfort of a guiding word, based, presumably, in expertise, to help parents make the best decisions possible.

The American Academy of Pediatrics (AAP) is one such organization, and it has some things to say about children and screen time. What it has to say, however, will be revised in the next year, and no doubt many parents will listen intently. NPR interviewed David Hill, chairman of the AAP Council on Communications and Media and a member of the AAP Children, Adolescents and Media Leadership Working Group. Although Hill did not reveal what the new recommendations would advise, the way he talked about the relationship between screens and kids revealed a lot about the logic that drives such recommendations. That logic made clear, once again, the need for theory to inform practice. More specifically, those who make practical recommendations about technology should consult with technology theorists.

Photo Credit: Bill Dickinson

Science, to borrow a phrase from Steven Shapin, is a social process that is “produced by people with bodies, situated in time, space, culture, and society, and struggling for credibility and authority.” This simple fact is difficult to remember in the face of intricate computer-generated images and declarative statements in credible publications. Science may produce some of the most accurate and useful descriptions of the world, but that does not make it an unmediated window onto reality.

Facebook’s latest published study, claiming that personal choice is more to blame for filter bubbles than the company’s own algorithm, is a stark reminder that science is a deeply human enterprise. Not only does the study contain significant methodological problems, its conclusions run counter to its actual findings. Criticisms of the study and of media accounts of the study have already been expertly executed by Zeynep Tufekci, Nathan Jurgenson, and Christian Sandvig, and I won’t repeat them. Instead I’d like to do a quick review of what the social sciences know about the practice of science, how the institutions of science behave, and how they both intersect with social power, class, race, and gender. After reviewing the literature we might also be able to ask how the study of science could have improved Facebook’s research.

Editor’s Note: This post is based on a presentation at the upcoming Theorizing the Web 2015 conference. It will be part of the Protocol Me Maybe panel.


I’ve been researching hacking for a little while, and it occurred to me that I was focusing on an as-yet-unnamed hacking subgenre. I’ve come to call this subgenre “interface hacks.” An interface hack is any use of a web interface that upends user expectations and challenges assumptions about the creative and structural limitations of the Internet. An interface hack must have a technical component; in other words, its creator must either employ at least a minimal amount of code or otherwise demonstrate working knowledge of web technologies. Because they work through the interface, these hacks have aesthetic properties; hacks on web infrastructure do not fall into this category unless they have a component that affects the page design.

One of the most notable interface hacks is the “loading” icon promoted by organizations including Demand Progress and Fight for the Future in September 2014. This work was created to call attention to the cause of net neutrality: it made it appear as though the website on which it was displayed was loading, even when that was obviously not the case. It would seem to visitors that the icon was there in error; this confusion encouraged clicks on the image, which linked to a separate web page featuring content on the importance of net neutrality. To display the icon, website administrators inserted a snippet of JavaScript — provided free online by Fight for the Future — into their site’s source code. A more lighthearted interface hack is the “Adult Cat Finder,” a work that satirizes pornographic advertising in the form of a pop-up window that lets users know they’re “never more than one click away from chatting with a hot, local cat”; the piece includes a looping image of a Persian cat in front of a computer and scrolling chatroom-style text simply reading “meow.” The links to these and other interface hacks are included at the end of this post.


I have a secret to tell all of you: I kind of don’t care about teaching evolution in science classes. Put another way, I’m less than convinced that most people, having learned the story of species differentiation and adaptation, go on to live fuller and more meaningful lives. In fact, the way we teach evolution, with a ferocious attention toward competition and struggle in adverse circumstances, might be detrimental to the encouragement of healthy and happy communities. I also see little reason to trust the medical community writ large, and I cringe when a well-meaning environmentalist describes their reaction to impending climate change by listing all of the light bulbs and battery-powered cars they bought. I suppose, given my cynical outlook, that the cover story of this month’s National Geographic is speaking to me when it asks “Why Do Many Reasonable People Doubt Science?” Good question: what the hell is wrong with me?


There’s a tricky balancing act to perform when thinking about the relative influence of technological artifacts and the humans who create and use them. It’s all too easy to blame technologies or, alternatively, to discount their shaping effects.

Both Marshall McLuhan and Actor-Network Theory (ANT) scholars insist on the efficacy of technological objects. These objects do things, and as researchers, we should take those things seriously. In response to the popular adage that “guns don’t kill people, people kill people,” ANT scholar Bruno Latour famously retorts:

It is neither people nor guns that kill. Responsibility for action must be shared among the various actants.

From this perspective, failing to take seriously the active role of technological artifacts, assuming instead that everything hinges on human practice, is to risk entrapment by those artifacts that push us in ways we cannot understand or recognize. Speaking of media technologies, McLuhan warns:

Subliminal and docile acceptance of media impact has made them prisons without walls for their human users.   

This, they get right. Technology is not merely a tool of human agency, but pushes, guides, and sometimes traps users in significant ways. And yet both McLuhan and ANT have been justly criticized as deterministic. Technologies may shape those who use them, but humans created these artifacts, and humans can, and do, work around them.


Last week I wrote about the curious case of traditional love narratives in the face of online dating. In short, the profiled format, pay structure, and overall bureaucracy of online dating throws into stark relief the constructed belief in a fateful meeting of souls. And yet, the narrative persists. Here’s a brief snippet:

…[T]he landscape has drastically changed but the narrative, not so much. The maintenance of romantic love as a cultural construct, personal striving, and affective embodied response to courtship rituals speaks to the resiliency of normative culture and its instantiation through human action. Even as we transact and negotiate romantic relationships; even as we agree upon terms; even as we screen partners and subject ourselves to screening; we nonetheless speak of butterflies and hope for magic.

In the case of love and online dating, the narrative is both highlighted and strengthened through its empirical contradiction.

This idea sparked an interesting conversation among the Cyborgology team about how this principle—constitution through contradiction—is theoretically useful in understanding the relationship between technologies and culture. Technologies reflect cultural realities, but can also expose the constructed nature of these realities, threatening their taken-for-granted logic and concomitant guidance over behavior and interaction. In the face of such a threat, however, the logics remain, and even strengthen.

thank you Ian Bogost for making this image for me

Sometimes it feels that to be a good surveillance theorist you are also required to be a good storyteller. Understanding surveillance seems to rely uniquely on metaphor and fiction, as if we first need to see another possible world to best grasp how watching is happening here. Perhaps the appeal to metaphor is evidence of how quickly watching and being watched are changing – as a feature of modernity in general and of our current technological moment in particular. The history of surveillance is one of radical change, and, as ever, it is fluctuating and rearranging itself with the new, digital, technologies of information production and consumption. Here, I’d like to offer a brief comment not so much on these new forms of self, interpersonal, cultural, corporate, and governmental surveillance as on the metaphors we use to understand them.



#review features links to, summaries of, and discussions around academic journal articles and books.

Today, guest contributor Rob Horning reviews: Life on automatic: Facebook’s archival subject by Liam Mitchell. First Monday, Volume 19, Number 2 – 3 February 2014 doi:

If, like me, you are skeptical of research on social media and subjectivity that takes the form of polling some users about their feelings, as if self-reporting raised no epistemological issues, this paper, steeped in Baudrillard, Derrida, and Heidegger, will come as a welcome change. It comes far closer to the opposite position: that whatever people say about their feelings should probably be discounted out of hand, since what matters more is the forces that condition the consciousness of such feelings. That approach is sometimes dismissed as failing to take individual agency into account. It is implicitly treated as an affront to human dignity to presume that people’s use of technology might not be governed by full autonomy and voluntarism, and as tinfoil-hat silly to believe that something as consumer-friendly and popular as Facebook could be coercive, that the company could be working behind users’ backs to warp their experience of the world for the sake of its bottom line.

Mitchell is not so overtly conspiratorial in this paper.