This is the first in a series of autobiographical accounts by Cyborgology writers of our early personal interactions with technology. Half autoethnography, half unrepentant nostalgia trip, this series looks at which technologies made an impression on us, which ones were remarkably unremarkable, and what this might say about our present outlook on digitality.

I wish I could say it was love at first sight when my Dad brought home what I just now learned was called an IBM 5150. According to IBM, “it was dramatically clear to most observers that IBM had done something very new and different.” I guess I wasn’t most observers. My parents say I liked it, but my memories of it have little to do with it being a computer per se. It was bound up in major events in the household. It could make grayscale banners and quarter-page invitations, letters to pen pals and family. Nothing about that computer, for me, had to do with programming. In fact, what I remember most about it was how mechanical it was: all the different, almost musical sounds it made when it was reading a floppy or printing something on its included dot-matrix printer. The spring-loaded keys on its impossibly heavy keyboard made the most intriguing sound; when all ten fingers were on that keyboard it sounded like a mechanical horse clacking and clinking. My favorite part of the computer was when you’d turn it off and it would make a beautiful tornado of green phosphor accompanied by a sad whirling sound. It sounded like this almost-living thing was dying a small death every time you were finished with it. I loved killing that computer.

That was the only home computer I had until about the 7th or 8th grade, when something amazing happened and the price of computers plummeted right into my family’s price range. Before that, the only computers in my life were at school or at my grandmother’s house. Growing up in South Florida meant your schools were perpetually overcrowded, underfunded, over-air-conditioned, and understaffed. To have a classroom with a working computer in it, let alone two or three, was nothing short of a miracle. Oftentimes you had a computer that just didn’t work. It just sat in the corner of the room as some sort of altar to a free market god that —if appeased correctly— might bless these children with the capacity to innovate and/or produce well-typed letters for our future bosses.

That god was appeased (I assume) by trips to the computer lab. Those were strange and frustrating times because you were plopped in front of this machine to do one boring task even though you knew it could do seemingly anything. (This included a mythical and mysterious pleasure-giver called “Oregon Trail.” If you were lucky and sneaky enough to get to this program, so we told each other, you could shoot things and name characters after people you did not like, who would then get horrible fatal illnesses.) I remember preferring ClarisWorks over Microsoft Office, and I would never know a love like I had with Hyperstudio until I visited my grandmother and her Compaq Presario, which ran a Crayola-branded drawing application that let me draw countless rocketships. That was a very important computer in my life.

Computers were always things that helped me make stuff. I couldn’t care less about actually programming the things. The hardware was interesting though. I always wanted to build my own computer but the closest I ever got was upgrading the memory and installing a DVD-RW drive. One time I replaced a hard drive. I should say more about that hard drive.

In high school I fell into a bad crowd, which is to say, I started hanging out with computer nerds. Instead of doing reckless things that would get me in trouble but ultimately form some very healthy boundaries about what is achievable in life, I spent my summers inventorying thousands of computers and troubleshooting hundreds in preparation for the upcoming school year. During the year I staffed the school’s “help desk” which empowered me to roam the school at all times immune from hall passes. I was allowed to do almost anything so long as I claimed it was in the name of fixing a Powermac G3.

Every student had to wear their student ID around their neck on a lanyard. It was a quick and easy way to process students in a school of about 2000 that was originally meant for 500. All the IDs were the same except for a handful of clubs and organizations that got special codes which acted as talismans in the hallways. The guards left you alone and the administrators never asked where you were going. I even bought coffee from the vending machine in the teachers’ lounge. I was drunk on power. My diplomatic immunity opened too many doors. One such door was the room that housed all of the security monitors. I was asked to replace a monitor in this secret, high security room. I felt like the world’s most boring secret agent but it was wonderful. I had, as Gramsci might say, made a decision to serve high school hegemony in exchange for being exempted from its worst features.

It was sometime after fixing the security cameras that the school’s resource officer offered to pay me real money to fix his computer. I gladly did it, poorly, and for a Geeksquad amount of money. I was over the moon. I had gotten paid to do something I liked. It was a new and unique sensation that I’ve been chasing ever since. I think that might have also been the first time I thought about the thing that I’d eventually learn to describe as “social capital.”

My first at-home Windows machine was a frankenstein computer made by my mom’s engineer friend. It ran Windows 3.1 and, as far as I was concerned, was largely a Sim City device. I also loved organizing things into folders and changing the color theme. I was a weird kid. This was also the first computer that gave me regular access to the Internet. Or, to be more specific, AOL. That computer was eventually stolen and replaced with another frankenstein. My first corporate computer was a Dell XPS T600R. It could read and write to CD-RWs which, at the time, meant that I absolutely needed to do two things: 1) back up every terrible piece of fiction I had written thus far and 2) burn MP3 CDs of my pirated music to play in my 1993 Mercury Topaz. I played so much Talking Heads in that Topaz.

That was also the computer that introduced me to obsolescence. I was bragging about the computer’s roomy 12gig hard drive to the kid sitting next to me in “web design class” but he just cocked his head, squinted his eyes and said, “why is your computer such a piece of shit?” That was harsh. I should also mention that “web design class” never taught us web design because the room didn’t have a working internet connection. The class consisted of memorizing HTML tags and practicing typing on whited-out keyboards.

The first time I ever looked at porn on the internet was on an iMac; my first chatroom was on that Dell XPS. I made a web site for my fiction writing using Microsoft Frontpage before switching to Macromedia’s Dreamweaver. I wrote shitty fiction and tortured LiveJournal posts on all of my home computers. I had different AOL user names depending on which parent I was staying with at the time. That meant the online identity I cultivated on weekdays, everything from bookmarks to contact lists, was different from the one I used on weekends and holidays. Sometimes I’d make a brand new one and “surf the web” as a stranger.

Of course, as a cis white male, I was always reminded that these devices were made for me. That the love of technology was an easily obtainable social norm for the dumpy, socially awkward teenager. Even if you lost the election to lead the web design club you founded (I lost to the guy that eventually invented Grooveshark so, in retrospect, he was much more qualified) there was no reason for shame. The world was increasingly mediated and controlled by computers, and so masculinity was quickly conforming to that new locus of power. I’m still not sure if I’ve ever felt more personally satisfied than when I convinced my high school’s IT coordinator to give me the network’s master password. I was never more surprised by the success of my work than when a last-minute all-nighter produced an award-winning web site about alternative energy. (I won a Dell Axim PDA for that, which I sold on eBay to buy Star Trek merchandise.)

Sometimes my mom would let me borrow her beeper so I could hang out at the mall with my friend while she was at work. I wore that thing so prominently and checked it so unnecessarily often that I wore out its only button. My first cell phone was one of those indestructible Nokias. The cover was a silvery-blue and I always had plans to get a different case at the mall (maybe something with a ska band on it?) but those lofty goals never materialized. The cell phone was rarely used for friends. It was mainly a logistics tool for my parents. I don’t have any data to back this up, but I have to believe that divorced households greatly contributed to the rapid adoption of cell phones: You could call your kid directly with little-to-no chance of having to talk to your ex. Brilliant.

To date I’ve had seven cell phones and each purchase was made after careful research and an unflattering amount of product review reading. My last three phones have been iPhones but only begrudgingly so (I have small hands and can’t stand most of the Android versions of important apps). Cell phones have always meant a lot to me, even though I hate talking on the phone. I find myself imprinting a small portion of my love for people onto the device that connects me to them. When I switch phones I get a pang of nostalgia. Not for the phone itself, but for the news I got on it. The anxious moments I stared at it waiting for a crush to text me; the bizarre friendship I made with someone who also owned the Motorola PEBL; the phone I used to tell my parents I was engaged. These are intimate moments that are about people, but are mediated through these tiny devices. It’s like missing your first car or your shitty apartment right after college.

I had nothing but Nintendo consoles until I got a PS3 in 2007. My Nintendo was a hand-me-down. The Super Nintendo, Nintendo 64, and GameCube were all holiday presents. I bought a Wii for my fiancé two Christmases ago. Unlike cell phones, I can’t say I have many evocative memories of video games except that I strongly associate the smell of Febreze with Zelda: Ocarina of Time and the Power Glove never fucking worked. Ever.

Video games were like novels, something I consumed alone. Something you might talk about with friends but never experienced communally. Video games never offered me an entrance to a community. That was always something that computers and their attendant culture provided. I never called myself a “hacker” but I certainly loved the idea of controlling something so important and ubiquitous.

When my school was selected for a pilot program where each student got to rent a brand new iBook, I got to work with some Apple sysadmins who told me what a prosumer was (I was a prosumer and I didn’t even know it!) and then showed me how to use Quicksilver. My 16-year-old self looked at that Apple engineer like he was a priest of some religion I never knew I wanted to be a part of. It was exhilarating, and that’s why I have a soft spot for people who say they have found community online. Sure, your community might be predicated on a constant global supply chain of rare earth metals and cheap energy, but that feeling of belonging has the uncanny ability to make you justify nearly anything.

Which is not to say I was part of some marginalized group. Far from it. Instead I was experiencing the boundless joy that comes with that marginal increase in social acceptance. The world was made for me, and yet it took that little extra interest in a Unix-based operating system to make me feel like I really belonged. Then came all the toys marketed and designed just for someone with my life experiences. It was exciting to have this kind of cultural cachet. This distinction and fluency that opened doors both literal and figurative. I could get county-level science fair awards just for building a robotic arm from a kit that I got for Hanukkah. It was socially acceptable to love Lego well into my teens thanks to the first generation Mindstorms kit. I wasn’t playing with toys, I was training for a job at GE. So much of society insulates young white men from seeing just how incompatible the world is for just about everyone else. I try to hold onto these memories of high school belonging as a reminder of just how enticing and comforting white supremacy really can be. Which is not to say I was a white supremacist, but I certainly loved, participated in, and defended a white supremacist status quo. When so much of the world is alienating, you’ll latch onto and defend even the most imperfect system that gives you that sense of belonging. This dynamic is just one of the many ways that imperialist white supremacist capitalist patriarchy divides us against our collective best interests.

I haven’t even gotten to my first social media experiences. I mentioned LiveJournal and AIM, but for some reason I associate the thrill of getting my college email address so I could sign up for Facebook with something totally different than my cell phones or that IBM computer. I was an early adopter of Twitter (2007) and I’ve been on Tumblr long enough that I remember seriously using Posterous instead. None of these experiences seem connected to the rest of this story except that they all happened on the first computer I called my own: a Powerbook G4. Perhaps everything that happened on and through that computer is best left for a totally different kind of essay.

David is on Twitter and Tumblr

#Friendsgiving on Instagram

Airports suck. They suck the worst on holidays like Christmas and Thanksgiving: nearly a sixth of all Americans travel for the holiday, and most of them are taking to the sky to leave their homes and go “back home” to some dining room that’s larger than their own. Every airport is full of government-groped travelers anxious over the possibility of missing their flight to a Thanksgiving table. For the 20-30 year-old set, Thanksgiving out of town usually means a paycheck’s worth of plane ticket plus a couple days of missed work or precious class time needed for a final exam. For many more, the prospect of taking an extended weekend is completely out of the question because most of us work in retail. As my friend Lisa wrote on her Facebook yesterday: “To fellow retail employees this holiday: Godspeed, we can do this.” Thanksgiving isn’t a time to relax, it’s a time to either gear up for a 12-hour work day or spend as little money as possible to make up for the remarkable food bill you just racked up. To leave town on Black Friday’s Eve is near-impossible, and so many millennials plan for a Friendsgiving: the thoroughly post-modern holiday that celebrates a paradoxical mixture of just getting by, the excesses of late capitalism, and the infinitely negotiable non-familial ties that make up young people’s lives.

The make-do nature of Friendsgiving does not mean that it’s any less meaningful or problematic than traditional Thanksgivings of yore. As Aaron Bady writes: “We who aren’t descendants of the original people who died so we could live as Americans, we eat the fruits of that conquest every day and it makes us who we are. It doesn’t matter if we want to, or choose to, or like it, or don’t. We still live in houses built on graves, and we rob them again every day.” Friendsgiving does not put its observers outside of colonialism. For as long as the holiday involves white people eating well in the western hemisphere, the fourth Thursday of November will essentially and irreducibly be a function of colonialism and empire. Eating with your BFFs instead of your Republican uncle does not make the holiday any less connected to the bullshit of colonialism, but we shouldn’t overlook Friendsgiving’s potential for societal change either. Friendsgiving can (although in its present form does not necessarily) represent the precarity and inhumanity of late capitalism’s labor laws and the need for a holiday that celebrates chosen social networks over familial obligations.

Friendsgiving isn’t without its opportunity costs, however. Friend circles don’t always cleave easily along crowded apartment dinner tables. You may have chosen to celebrate with your fellow bandmates, but your other circle of friends is eating across town. We are uniquely qualified for such dilemmas. Friendsgiving is deeply material and thus eminently shareable. Not just the plate of food, but the congregation of close friends, the copious amounts of booze (or not!), and the collectively watched media that accents the evening’s brief respite from 60-hour work weeks. Thanksgiving crowds airports, but Friendsgiving fills newsfeeds. If Thanksgiving is awkward family photos in a Pier One-furnished living room, then Friendsgiving is an instagrammed Goodwill-bought plate of Butterball turkey next to your vegan friend’s amazing seitan loaf.

The holiday is a chance for constructing safe, radically accepting spaces in an otherwise hostile social environment. Friendsgiving means never having to justify your phone use, let alone your college major, your tattoos, or who you choose to have sex with. Friendsgiving is a radical act insomuch as it is a blatant choice of your chosen tribe over the one you were born with. It doesn’t mean you hate your parents, but it does represent a rejection of the antiquated (not to mention patriarchal) notion that it is more meaningful to break bread with one’s racist uncle than with the person who offered a shoulder to cry on after a particularly bad breakup. It is an explicit acknowledgement of the primacy of common experience over common blood. It is the beginning of a new sort of Americana. One that is so ruggedly individual that collectivity is purely voluntary and so demystified that shared ritual is nothing more than an ultimately futile escape from imposed ritual.

Friendsgiving is a conscious choice, but it can also be seen as the structural consequence of late capitalism. Friendsgiving is just as much a product of practical planning and econometrics as it is one’s personal preference. The consciousness-raising potential of Friendsgiving comes from Thanksgiving’s relationship to Black Friday. Many of us can’t get away for the holiday because we’re too busy selling to those that can afford to visit family on Thanksgiving, or are too precariously positioned to opt out of the retail holiday. Thanksgiving, Friendsgiving, and Black Friday are interconnected by the tissue of consumer capital, precarious employment, and the demands of corporations. The relative ease with which transportation and information systems let us pick up and move has, paradoxically, prevented us from doing just that in times of exception. We can (and frequently must) move away from our families to earn a living, but the demands of that job keep us from returning. The technologies that promised us quick and easy access to loved ones have inadvertently (but on the part of some, deliberately) made it impossible to live “traditional” lives. Engineers are the unwitting social radicals (PDF) that afforded this immense social change.

There is scant chance that Thanksgiving will be called “Friendsgiving” in 20 years’ time. We will still call it Thanksgiving because no matter who we spend it with, that’s what we’re doing: giving thanks for what we have and for the opportunity to express that thanks in the way of our choosing. That might be with family, but it might be with friends. The infinite interpretability of Thanksgiving makes it, as Aaron Bady observed, “so established, so settled, so natural, that it doesn’t need to speak itself; you can totally ignore it, and it lives on, undisturbed.” It is a harvest holiday that, 800 years ago, was probably celebrated in the same pragmatic way that millennials celebrate it today: as the last reprieve before a grueling winter full of work and tribulation.

David is on Twitter and Tumblr

File this one under “what is at stake” when we talk about the digital dualist critique. Bitcoin, the Internet’s favorite way to buy pot and donate to Ron Paul, hit an all-time high this week of around $900 to one Bitcoin (BTC). The news coverage of Bitcoin and the burgeoning array of crypto-currencies (according to the Wall Street Journal there’s also litecoin, bbqcoin, peercoin, namecoin, and feathercoin) has largely focused on the unstable valuation of the currencies and all of the terrible things people could do with their untraceable Internet money. What hasn’t been investigated, however, is the idea that crypto-currencies are somehow inherently more “virtual” and thereby less susceptible to centralized control than US dollars, Euros, or Dave & Buster’s Powercards. Both assumptions are wrong and are undergirded by the digital dualist fallacy.

First, a few prerequisites and disclaimers. I am not a financial advisor, nor do I consider myself an economist. Therefore I’m mostly sticking to what I know about the creation of value and the organization of work. I’m not an avid or practiced user of Bitcoins, but I think that outsider perspective is more useful than detrimental. I don’t want to spend too much time going over the mechanics of mining and exchanges, so for the rest of this essay I’m going to assume the reader knows how Bitcoins are “mined” and how they are exchanged for state-backed currencies. I should also disclose that I have made a non-insignificant amount of money by totally forgetting that I bought $10-worth of bitcoins at Mt. Gox in February of 2012. I cashed out all but 0.05 BTC earlier this week, but don’t plan on reinvesting. I also feel like an asshole for making money doing absolutely nothing.

The Bitcoin Logo

As my disclosure above implies, Bitcoins have very IRL implications. They can buy things that are the product of real humans’ labor and be converted (although maybe not directly) into just about every state-backed currency in existence. This is a classic example of what PJ Rey means by “The Myth of Cyberspace.” The Internet —and by extension, Bitcoin— does not reside on some other plane of existence. Just like any other currency, Bitcoins are equal parts material and ephemeral; they can be affected by natural disasters, ideology, and commodities markets. They exist as heavily protected bits that are ultimately meant to be converted into atoms of sustenance and extravagance. To call Bitcoin a “virtual” currency is to fetishize its conceptual origin and distribution system. Granted, there aren’t quite as many physical representations of Bitcoins as there are of other currencies, but that distinction seems to be quickly diminishing. More importantly though, Bitcoins are subject to many of the same social structures and phenomena that keep capital unevenly and unfairly distributed. Calling Bitcoins “virtual currency” is nonsensical because all currencies are virtual in that they are “collective hallucinations” about the measurement of worth, and they are all equally physical because they are held, exchanged, and produced in very tangible ways with equally tangible consequences.

Currency itself does not hold inherent value. Even gold, as Marx observed, does not have some natural or inherent property that makes it equal in exchange value to the products of human labor. Bitcoins are attractive to some people because Bitcoins’ measurement of value cannot be altered by a central bank or government. In other words, Bitcoins aren’t subject to value manipulation (i.e. inflation) by obscure bureaucracies with their own private interests. What I think is overlooked here is that escaping the direct control of one bureaucracy (the Federal Reserve or the Chinese government) does not mean you have separated yourself entirely from that organization or from the influence of similar bureaucracies. The fetishization of the “virtual” aspect of Bitcoins goes a long way in obscuring the possibility that Bitcoins could be manipulated. The promise of Bitcoin’s stability is strengthened by our belief that the network is immune from old social structures; that a decentralized computer network determines an isomorphic social organization.

Bitcoins came into this world always already unevenly distributed. They circulate over and through a landscape of uneven technological fluency and access to computer networks. They must be either bought with existing currencies or mined using hardware made through human labor. Electricity and silicon, Bitcoins’ constituent parts, are still controlled by private bureaucracies with their own private interests. Before those Bitcoins could exist, someone had to spend some other kind of currency to buy the computer and set it up to start mining. There’s also the cost of the Internet connection (to my knowledge, no ISP accepts Bitcoins as payment) and the opportunity costs of learning about and setting up your Bitcoin mining operation. These are not insignificant costs. The Bitcoin mining network is faster and more powerful than the top 500 supercomputers in the world. Combined.
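If you’ve never looked under the hood, here is a minimal sketch of the proof-of-work loop at the heart of all that silicon, in Python, with a toy difficulty setting I chose for illustration (the real network hashes 80-byte block headers against a 256-bit numeric target rather than counting hex zeros, but the economics are the same): every guess costs electricity, and a winning nonce is simply proof that you paid for the guesses.

```python
import hashlib

def mine(block_data: bytes, difficulty_zeros: int = 5) -> int:
    """Toy proof-of-work: brute-force a nonce until the double SHA-256
    hash of the data starts with enough hex zeros."""
    target = "0" * difficulty_zeros
    nonce = 0
    while True:
        payload = block_data + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(payload).digest()).hexdigest()
        if digest.startswith(target):
            # On average this took 16 ** difficulty_zeros guesses,
            # i.e. a measurable amount of electricity.
            return nonce
        nonce += 1

print(mine(b"example block", difficulty_zeros=5))
```

Each additional leading zero multiplies the expected number of guesses by sixteen, which is why the barrier to entry is not cleverness but capital: hardware, cooling, and a cheap power contract.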

A Bitcoin mining operation (Source)

All of this dedicated silicon is subject to labor and energy costs. Bitcoin’s valuation, conceivably, could be manipulated by the holders of the means of production. Just like the flow of dollars from the Fed, labor and energy markets are subject to organized manipulation. It’s not hard to imagine a scenario where a person builds a million-dollar mining operation and selectively hoards Bitcoins to control their value relative to the cost of mining. Manipulating Bitcoins for fun and profit seems just as likely as controlling oil reserves or adjusting interest rates. Currencies are meant to be tools of mass control given that, historically, money was “whatever the king was willing to accept in taxes.” Saying you’re making a democratic currency is a lot like saying you’re building a non-lethal weapon: the damage might not be as severe or predictable, but it’s still meant for controlling and hurting people.

Bitcoins are undeniably good for masking your identity. The promises of delivering a “virtual” currency free of private interest manipulation, however, do not hold up. If we’re concerned about individuals’ decreased purchasing power at the level of currency, then perhaps we should question the usefulness of currency itself. Instead of seeking out new forms of value measurement, we should reconsider the usefulness of measuring value at all. Does everything need to be assigned a specific value, or are there radically different ways of exchanging goods and services? Can our digital connections afford new ways of trucking, bartering, and exchanging such that huge numbers of people are not subject to the whims of powerful entities? I think it’s possible, but not with crypto-currencies.

David is on Twitter and Tumblr

After seeing today’s XKCD (above) I sort of wish I had written all of my digital dualism posts as an easy-to-read table. I generally agree with everything on there (more on that later), but I’m also pretty confused as to how Randall Munroe got to those conclusions given some of his past comics. I can’t square the message of this table with the rest of Munroe’s work that has maligned the social sciences as having no access to The Way Things Are. The table is funny specifically because the social scientists he pokes fun at did a lot of work to make those answers plainly (painfully?) obvious. How does someone with an obvious resentment for the social sciences also make a joke about how we were always already alienated?

I don’t expect an artist to never contradict herself or himself, nor am I expecting some kind of universal and all-encompassing Theory of XKCD that would put each comic in relationship to the others. But I have a hard time believing that someone can hold all of the following to be true (in no particular order):

[Several XKCD comics embedded here.]

It doesn’t seem as though Munroe is a reader of social theory or cultural criticism. If anything, he seems to be an ardent critic of the “soft sciences.” Lots of his comics (especially those last three) imply that while scientists are quirky, sometimes to an obnoxious fault, social scientists and humanities scholars are consistently bullshitting their way through the profession. And yet, Munroe seems to value the observations of, and even engage in, popular sociology of technology. I’ll admit I’m not an avid reader of his blag, but my keyword searches haven’t come up with anything approaching an answer. I’m certainly not looking to dedicate an entire blog to critiquing XKCD, but I have secretly always resented physicists for making a community around, among other things, poking fun at my chosen profession.

Given his seemingly contradictory work, I (and many STS scholars) would be really interested to know Munroe’s thoughts on The Science Wars. Were you unaware of an entire shadow war that took place across multiple disciplines’ journals for several years in the mid 90s? Well, that’s disappointing; it was really exciting. Thankfully, Wikipedia has a really excellent summary of the whole war, but here’s the really short version: As soon as sociologists and philosophers began to suggest that objectivity is a social construction, scientists began to push back, claiming that postmodern theory was useless to society. The war hit its peak in 1996 when Alan D. Sokal and Jean Bricmont wrote some nonsense and got it published in Social Text. On the day of publication they also released a public letter exposing their work as a hoax. Derrida got really angry and said some mean things in his book Paper Machine.

Randall Munroe’s work seems to both represent the most hardline Sokal supporter, while also making the social critiques of a postmodernist. The carefully cultivated “view from nowhere” of the scientist certainly supports the notion that mathematics are “pure” and sociology is superstructure, but where does the critique of alienation come from? Is it through sheer force of intuition and observation? Or has Munroe slipped into the same kind of privileged and myopic vision of society that N+1 editors periodically fall into? A perspective that projects one’s own wit and cutting criticism onto the world writ large; thereby simultaneously demanding that those witticisms go away while also demonstrating that they are more necessary than ever. In other words, Munroe utilizes critical sociology to make the point that critical sociology is useless.

It would be silly to demand that someone who draws comics should start reading social theory to do their job better. But we’re not talking about Garfield here, we’re talking about a comic that sits at the overlapping center of the nerd culture / big science Venn diagram. XKCD is just as much a part of young engineers’ and scientists’ training as Calculus 101. The comic needs to take social science seriously not for the sake of social scientists, but for the mutual advancement of all professional fields.

Also, it would just make for a better comic. Like I already said, I generally agree with the sentiment of today’s comic, technology does not unilaterally “make us” anything, but that’s a rather prosaic observation. It says more about the sorry state of technology writing to think that this table is witty. Sometimes teens won’t have sex with a technology. Sometimes they will. The point of smart critical theory about technology is seeking out causal factors, making observations about the mutually shaping relationship of the social and the technical, and sometimes providing prescriptions for a better sociotechnical world. Munroe is having it both ways: laughing at the people who deconstruct science and technology while doing it himself and claiming it’s nothing more than common sense. The problem being, common sense is constructed, and it’s heavily influenced by the critics of culture, society, and technology. If Munroe wants to make these jokes, he should at least pay a little gratitude to the profession that works to make those jokes possible. Or, at the very least, stop pretending that we’re all just making this stuff up.

David is on Twitter and Tumblr

EDIT: This post originally said Sokal and Bricmont published under a pseudonym. They actually used their real names and their status as prominent scientists to get it published.

Just about every social media network that relies on voting has more men than women in its user base. Graph from pingdom.com

The merits of voting[1] have come under scrutiny as of late, thanks in part to Russell Brand’s comments on the topic in his guest-edited edition of the New Statesman. (Oh, and I think there might have been an interview as well.) I’m highly suspicious of voting as well, which is why my ballots are mostly blank except for the one or two things I think might be strategically useful in later direct action. I voted earlier this week in a local election because my city is still small enough that there are very real and tangible differences to electing one council person over another: One city council person authorizes citizen working groups to organize municipal composting while another led the charge to close an indy media center that hosted an Iraqi artist because… terrorism.[2] A lot has already been said about the efficacy of voting and why it alone cannot possibly bring about the fundamental change that politicians promise. Besides, if you’ve read your Zinn, you know that all the important stuff happens between elections anyway. What I want to touch on today, however, has less to do with government elections and more to do with the abstract concept of voting. Why is it that, when voting is implemented within a system, we automatically assume that it is more democratic? What happens to social networks and web platforms when we install voting as the overriding system of displaying public opinion? Why shouldn’t the critique of voting in general be directly imported as a critique of the social networking sites that use voting as the primary form of interaction?

Strict up or down votes are a relatively recent invention. They were first and most widely used in armies that needed to choose a new leader from amongst themselves.[3] This makes sense if you consider what voting actually means: choosing among several discrete options with the full expectation that some people —sometimes even more than half— will feel as though they did not get their way. Unless they’re the product of consensus and compromise, group decisions must be enforced through sanctions or punishments. Organizations like militaries have the strict chains of command and ready access to deadly force that are necessary for such decision-making. If you ask most radical leftists why they don’t vote, it’s mainly for this reason. Not only are the limited options woefully inadequate, but the very idea that one side loses while the other side wins is undergirded by the promise of violence.

It is disturbing, then, to think that voting —at least in the West— is widely perceived as not just a governmental process, but as a synecdoche for democracy. Sometimes other actions are rhetorically transmuted into “voting” so as to imbue some larger social structures with democratic features. We are said to “vote with our dollars” as consumers under capitalism, and we “vote with our feet” when we abandon a declining neighborhood. Watch a People’s Choice Award acceptance speech and you’ll hear a lot about what “the people really want” in their musicians and entertainment. Given the underlying structural violence that keeps capitalism in place, perhaps phrases like “voting with your wallet” are actually more accurate than most people realize.

Social media sites that heavily employ voting as part of the user experience often go heavy on the democratic rhetoric as well. Reddit’s vote-based system of arranging stories elevates it from “news aggregator” to democracy’s digital white savior. Alexis Ohanian, one of the founders of Reddit, has been described as the “Mayor of the Internet” and goes on speaking tours about how sites like his are evidence of the Internet’s capacity for self-governance. The users themselves are also quick to call Reddit a democracy. If a news story (or, let’s be real, a cat video) ends up on the Front Page, we are supposed to take that as a message of collective, public opinion. This is an enticingly simple model, and some news sites like PolicyMic have taken to implementing similar voting systems to organize comments and stories on their own sites.

Curiously, sites like Tumblr or Pinterest, which are arguably no less “democratic” but far less reliant on numerically ranking content, are not so quickly and readily described as such. These sites are also —and as I will explain in a moment— uncoincidentally used by more marginal populations. While the average Reddit user is a 20-something American male, just about every other social media network has a majority of women, and popular sites like Twitter and Instagram have many people of color from urban counties. It’s difficult to say whether the trend is causative or simply correlative, but the relationship is clear nevertheless: sites that rely heavily on simple voting (Reddit, Stack Overflow, Hacker News) have much higher percentages of male users.
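To make “relying on simple voting” concrete, here is a sketch in Python of the “hot” ranking from Reddit’s open-sourced code (the constants below come from the published version and may have drifted since, so treat this as illustrative rather than definitive):

```python
from datetime import datetime, timedelta
from math import log10

# Epoch constant from Reddit's open-sourced ranking code.
EPOCH = datetime(2005, 12, 8, 7, 46, 43)

def hot(ups: int, downs: int, posted: datetime) -> float:
    """Collapse a submission into one majoritarian number:
    net votes on a log scale, plus a recency bonus."""
    score = ups - downs
    order = log10(max(abs(score), 1))
    sign = 1 if score > 0 else -1 if score < 0 else 0
    seconds = (posted - EPOCH).total_seconds()
    return round(sign * order + seconds / 45000, 7)

now = datetime.utcnow()
fresh = hot(500, 0, now)                       # 500 net votes, just posted
stale = hot(5000, 0, now - timedelta(days=1))  # 10x the votes, a day old
print(fresh > stale)                           # True: the log scale flattens vote counts fast
```

Note what survives the reduction: the majority’s net opinion and the age of the post, nothing else. A submission sitting at a net score of -1 sorts like spam no matter how interesting the disagreement underneath it might be.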

A few months ago I asked whether it was possible to build an anti-racist Reddit. I suggested that,

Perhaps the design solutions necessary to sufficiently discourage racism on Reddit would make it unrecognizable. A web platform that relies so heavily on quantifiable upvotes, comments, and karma might very well encourage undesirable behavior. Things that are shocking or provocative garner a lot of attention, which almost always translates into karmic rewards. It might be worth comparing the quantification-heavy design of Reddit with the virtually number-less Tumblr interface.

The flip side of that observation is that undesirable behavior can be reinforced through majoritarian voting only when the undesirable behavior is held by a majority of people. Then again, it’s hard to tell if young white males are attracted to a site that is set up to reward their view of the world, or if Reddit is populated by young white males because it comports so well with what they see as the optimal method of aggregating public opinion. It’s easy to see voting as inherently democratic when you’ve never encountered a voting system that is set up to disenfranchise you. Perhaps it is time to apply the critique of voting to our social media networks. Voting leads to more homogenized user bases and, rather than encouraging different points of view, demands that one win out above all others. I will treat Reddit like I treat my local ballot: use voting as a tactic to support allies and future efforts at direct action.

David is on Twitter, Tumblr, and yes, even Reddit.


[1] I recognize that there are many different kinds of voting, some of which are far better than others. Preferential, instant-runoff, and ranked voting all have their relative merits but for the sake of this post I’m referring to the kind of voting that dominates in the United States and Great Britain.

[2] For state and national elections, I’m an avowed party-line voter for the Rent is Too Damn High party. The fact that that is a joke should disturb you greatly.

[3] See page 185 of David Graeber’s The Democracy Project (2013)


When I was young, Robert Stack would visit me in my dreams. His monotone voice and sharp eyes would come through my wood-paneled RCA cathode-ray childhood TV and settle in my subconscious until I went to sleep. During the day, Unsolved Mysteries was an opportunity to “solve a mystery” from the comfort of my own home. If I watched the dramatizations closely enough I thought I might recall some repressed memory of an alien abduction, or I might notice a telltale tattoo that marks the new neighbor as a relocated serial killer. Solving these Lifetime-disseminated mysteries was a sacred trust that I did not take lightly. Everything from persistent hauntings to serial killings was on my plate. When you grow up in South Florida, extraterrestrial abduction and friendly serial killers seem so plausible. If we were able to fit over a million people on a sandbar and intoxicate them long enough to stay through hurricane season, anything could happen. But no matter how much I investigated I always seemed to disappoint Robert. My neighbor with eight fingers who loved ham radio was not the man suspected of murdering two teenagers in Ohio; he was just really racist. The lady across the street was not a recurring spectral phenomenon; she was just 90 years old. None of these people were particularly extraordinary —let alone extraterrestrial— but Unsolved Mysteries injected a sense of the enchanted into an otherwise mundane suburban landscape.

Photo c/o onceuponaproduct.blogspot.com

Night, however, was different. In my dreams these aliens, ghosts, and murderers came out to greet me. There was the recurring nightmare of the knife-wielding murderer hiding in the gardenia bushes in front of my house. His body almost completely obscured in darkness except his grisly hand and that impossibly shiny blade. There was the snowy TV flickering in the dark room and the violent shadows that would attack me with rotting body parts covered in acid. In one dream there was a lonely TV on the floor in a smoky, dark room and it would flash static and snow while Robert Stack’s booming voice asked all the leading questions to which I had no answers: “Were these random occurrences? Or was this incontrovertible evidence of paranormal activity?” “Where has Mr. Dufresne gone and what happened to his blue Chevy Tahoe?” I wasn’t haunted by ghosts, I was haunted by the possibility of ghosts. Haunted that someone, somewhere was sending a manila envelope to a PO Box in Burbank, California with proof of alien abductions or poltergeists. It was only a matter of time, it seemed, before the aliens, ghosts, and murderers found me and I would be called upon to testify-via-dramatization in front of an entire nation of daytime TV peers. But that time never came, and for a while I thought that might be just as mysterious. Am I immune to spectral violence? What does it mean to never live a life worthy of reporting to a daytime TV producer?


My Unsolved Mysteries unconscious headcanon was terrifying, at times rising to the level of night terrors, but I wouldn’t trade it for the world. Magic doesn’t need to be beautiful to be personally meaningful. My fiancé and I talked about Robert Stack’s haunting voice on our first date. I imagine we’re not the only ones to bond over that childhood experience. There’s something grotesquely beautiful about those mental roller coaster rides that come to us in the night.

The show made genre-defying leaps of the TV dial fueled by its own widespread appeal: It went from primetime NBC to CBS before leaping to Lifetime and finally passing on to the afterlife of SpikeTV, where it was hosted by someone that wasn’t the reanimated corpse of Robert Stack, so who really cares? And yet since 2010, when the show was mercifully canceled, reruns have been infamously difficult to acquire. Can I Stream It? feigns ignorance and suggests some shitty one-off documentary from 2010. The episodes uploaded to YouTube have all been taken down. All that’s left now are a handful of re-edited themed DVD box sets that turn my daily paranormal briefings into some kind of too-long Discovery Channel documentary. The shows that made the paranormal seem mundane are lost. A few years ago, in a fit of morose nostalgia, I searched Pirate Bay for remnants of Unsolved Mysteries and found very little: anemic seed counts of three or four active users sharing a smattering of low-res DVD rips. It’s like we all gave up on all those mysteries.

The turn of the century also saw the death of The X-Files, another show that sought to reveal hidden mysteries, albeit in a very different genre. Whereas Unsolved Mysteries made dramatic reenactments of unexplainable events, The X-Files offered us a peek behind the “no comment” curtain. It’s almost as though, if you slowed down an Unsolved Mysteries reenactment, you might find Mulder & Scully running in the background. Pirate Bay user Lucy1948 noticed as much and put it in her “read me” notes for “The UFO Files” download:

The UFO Files — though Unsolved Mysteries has endured since 1987, the producers were savvy enough to update the show making sure that UFO segments had an X-Files-esque feel to them. Robert Stack, clad in a trademark trench coat, introduces each segment with semi-serious gravity, as if he were revealing a government secret to Mulder and Sculley. [Sic]

Unsolved Mysteries and The X-Files take place in the same universe, but approach it from radically different genres. They both instill a sense of the mystic in an otherwise thoroughly modern world. It’s worth noting that many X-Files episodes are based on the same source material that made it on to Unsolved Mysteries. The Taos Hum, for example, was featured in both shows, along with references to the same crop circles and alien abduction scenarios. Vince Gilligan had even gone so far as to write an Unsolved Mysteries / The X-Files crossover episode, but it was never produced. They are entertaining because they give us the possibility of an enchanted world but never completely deny us the security blanket of rational science. This tension arises from what German sociologist Max Weber called “disenchantment.” All modern societies, according to Weber, replace mysticism and storytelling with rational argument and empirical observation as the predominant mode of explaining everyday events. Everything is explainable, so long as it is exposed to enough of Scully’s skepticism. It also meant that certain subjects were considered the domain of the natural sciences while others were a matter of society, culture, and politics.

Shapin and Schaffer catalogued this separation in their well-known book Leviathan and the Air-Pump by studying the debates between Robert Boyle and Thomas Hobbes over Boyle’s revolutionary air pump demonstrations. It was the first time that scientific observation staked its claim as a legitimate source of truth outside of political power. Peer review (or, at least its primordial ancestor) of observable experiments, according to Boyle, was a better way of understanding the physical universe than the natural philosophy that could include concepts of morality or ethics alongside generalized observations. Whereas natural philosophy sought to place observable phenomena within larger schemas, experimental science used observable phenomena to build theories.

I don’t think Unsolved Mysteries or The X-Files would make sense to us, let alone be entertaining or haunting, if Weber were correct. Even if the mysteries were meant to be solved and the truth was “out there,” the whole point was to dwell in those precious moments before disenchantment and the inevitable ambiguity of the final act. Mulder The Believer and Scully The Skeptic might only exist in the modern dualism of nature and society, but they exist as foils who ultimately find the truth to be somewhere in-between their polar standpoints. That’s why I agree with Bruno Latour’s (in?)famous titular argument: We Have Never Been Modern. Latour argues that the rhetorical (and only rhetorical) siloing of nature and society has stunted work in both fields and that a more interconnected analysis of human and nonhuman actors is necessary. The implicit upshot of collapsing the nature / society divide is an enchanted world. Not a reenchanted world, as some have argued, but the same world full of specters and extraterrestrials that we always had. The X-Files and Unsolved Mysteries weren’t making these stories from whole cloth. They told modern myths in the way that they need to be told: in the parlance of science and skepticism.

It seems strange, then, that in a world where we have exponentially increased any given individual’s ability to record, document, and share their experiences, we seem to have given up on these mysteries and x-files. The closest we might get is the wildly popular podcast Welcome to Night Vale, which takes as its central premise that conspiracy theories are real and that the banality of the supernatural is humorous and disturbing at the same time. But Welcome to Night Vale, by creating a thoroughly supernatural world, relies on a rigorously disenchanted world as its foil. The supernatural hidden in the everyday just seems to have disappeared.


The easy (and wrong) explanation for the apparent dearth of enchantment is that smartphones make it impossible to live in ambiguity anymore; that there are too many recording eyes walking around to let us think we might have caught something in a shaky home video. Nothing, however, could be further from the truth. Documentation creates the illusion of objectivity, but never provides a window onto the real. Even in a laboratory setting, there is plenty of discursive room for ghosts and mystery. Bruno Latour and Steve Woolgar in Laboratory Life observed that several devices in the lab were dedicated to “transform[ing] pieces of matter into written documents.” These were devices that made the messy and ambiguous world compatible with the rational model of experimental science. Cells and viruses could be transmuted into the kinds of charts, graphs, and tables that make arguments and prove theories. But inscription devices are susceptible to the experimenter’s regress: they’re only as good as the theories that underlie their design. If I build a machine to detect gravity waves, and it doesn’t detect anything, does that prove gravity waves never existed, or did I build a broken detector? Both are possible, but the detector itself cannot resolve this ambiguity. Access to recording and documenting does not automatically eliminate mystery; if anything it deepens and complicates the mystery further. The act of documentation implicates us in our own modern myths.

Maybe the enchanted mundane no longer speaks to us because too many unknowns as of late have become far too well known. Mulder’s fear of the government seems almost naive in the age of Wikileaks, and cases like the Taos Hum are getting serious academic attention. Even the pilot for The X-Files spinoff The Lone Gunmen got too close for comfort when it eerily predicted 9/11 truther claims six months before 9/11 even happened. All of which is to say the enchanted is no longer at a safe distance, and our science security blanket just seems to attract more terrible mysteries.

If there’s any heir apparent to the enchanted mundanity of The X-Files and Unsolved Mysteries, it is probably the found footage horror genre and the ghost hunter television show. Movie franchises like Paranormal Activity and TV shows like The Travel Channel’s Ghost Adventures fold the very act of documentation into the paranormal (the two categories merge completely in the excellent Grave Encounters movies). Ghosts become visible when we put them behind the lens and record them on our “electronic voice phenomena readers” (aka tape recorders with the volume all the way up). Our inscription devices have the power to show us the ghosts that we know always already existed. If you watch a ghost hunting show for any length of time you’ll realize that ghost hunters are more like Scully than Mulder. They talk about definitive proof and scientific investigation. But their inscription devices act more as ghost creators than ghost detectors. That’s not to say the ghosts aren’t real —they’re just as real as your last allergy test— they’re just highly susceptible to the experimenter’s regress.

We don’t have unsolved mysteries and x-files anymore; we have circumstantial evidence and controversial methods. We’re less interested in the blurry Bigfoot photo, and more interested in following the cryptozoologist and learning about how the photo was taken. The mystery isn’t in the subject, but in the process of capturing the subject in a scientific setting. It is the investigation itself that has become the mystery, not the phantoms, aliens, and murderers that surround us. Next time you fantasize about an X-Files reboot, just imagine Mulder and Scully arguing over the validity of a dash cam video: “C’mon Scully, it’s obviously a fake. You can tell by the pixels. Also, Yeti don’t travel that far south.”

David is on Twitter and Tumblr.



Confession: I watched the Apple event yesterday, and I’ve watched at least part of every product announcement for the last several years. Apple announcements are the opposite of a guilty pleasure; they are a burden that I take on with pride. They are insipid and represent everything that is wrong with Silicon Valley, and yet I feel obliged to watch them because they let me stare deeply into this heaving morass of Cronenbergian lust for technology. It always feels like we’re one year away from Phil Schiller offing himself with an iGun after screaming “LONG LIVE THE NEW FLESH!” When I watch Silicon Valley spread out on the Moscone Center stage I feel prideful (to a fault perhaps) that these events just seem so… transparent. They’re so easy to read and so easy to critique that they amount to social science target practice.

First let’s talk about the sex and gender politics going on here. The parade of white dudes showing sun-kissed blonde girls captured behind their beautiful retina displays sets the tone. Not a single woman took the stage at this event, and I can’t remember a time when one ever did. It is no surprise, then, that so much of an Apple event is subject to the male gaze. The unmistakable, slow motion ejaculatory climax that occurs at 1:17 of this Mac Pro assembly video is an excellent example. These machines aren’t just sexy: they are sex. The audience is encouraged to grope and gaze at these devices with the explicit promise that “you will absolutely love this.” The devices are simultaneously masculine and feminine, but their queered identity never transgresses normative sex politics. When we want to treat the iPad as an object of desire or a subservient assistant it can be a woman. When it is a “killer” machine capable of productive work the presentation shifts to talking about the collaboration between the masculine user and the efficient machine. Did we mention stuff goes in and out of the ports faster than ever?

The efficiency and glamour of Apple products is also displayed through equal parts Platonic essence of computing and Puritanical cleanliness. No one’s desktop looks as clean and orderly as the demo machines on stage. Our content is never that polished or interesting. Emails are always invitations to sushi and never your phone bill showing your data overage charges. A demo never contains a bad photo of your cat or a tedious expense report. The ideal Platonic form of computing demoed on stage is sexualized but never intimate. We’re seeing the aspirant Minority Report computer, not the cozy workstation. The iPad is the only divergence from this trend. The “magical and revolutionary device” was first demoed in a lounge chair. The reclining Steve Jobs described it as “more intimate than a laptop and more capable than a smartphone.” The very first thing he does on it is read the newspaper. He suggests that the tablet might be sitting in the kitchen, waiting for you to pick it up and buy over-priced movie tickets from Fandango.

Framing the machine in the world (rather than the machine framing the world as Heidegger might have it) is essential to giving it distinction and enrolling it in the target consumer’s habitus. Here is a list of people and things that were used as subjects for product demos in this last event:

  • Gilt
  • MLB
  • Wives making fun of your clothes
  • American Express
  • Wall Street Journal
  • Mars
  • Walking to San Francisco’s Coit Tower on a sunny day
  • The video editor that did Independence Day
  • Photographer for National Geographic & Sports Illustrated
  • Music producer for Lady Gaga, Madonna, and The Killers
  • Skateboarding in (what looks like) Southern California
  • Using the word “killer” to describe something as “cool”
  • A software-based drummer named “Kyle”
  • Wind farms

Each one of these isn’t singularly the territory of white middle-class America, but taken together they seem to form the unmistakable outline of a Cool Dad from San Diego. Now that Apple events are a “thing,” they no longer have to establish products as part of a particular class distinction. Rather, they refine and reify their standing in “cool.” The demo isn’t so much a demonstration of how a product works as a demonstration that the product on display is part of a life always aspired to but never totally lived. You are not a National Geographic photographer, but the Mac Pro is the kind of device worthy of someone aspiring to that kind of success. The demo is only affective if you relate to, if not simply understand, the process of using your Amex card to buy something on Gilt. This sort of distinction isn’t necessary for enjoying or using Apple products, and I don’t want to imply that iPhones are a “white person thing”; rather, they are part of an institution of whiteness. That is why, as Ayesha A. Siddiqi tweeted, “a white centric society will make anyone an expert on the white experience if you feel implicated in an unjust system it’s bc we live in one.” One has to learn to at least read and make sense of this whiteness in order to assess the quality of the product.

Nowhere does capitalism show its absurd paradoxes more clearly than in Apple’s reality distortion field. It is a place where Bono can own a primary color and use his profits to play white savior in Africa. It’s where Apple, the most cash-rich company in the world, can still paint itself as the underdog by showing unattributed critical quotes about how the iPad is for “tools.” A sense of embattlement is crucial for maintaining loyal followers. Evangelical Christianity and the Republican Party are excellent at it. Denying your own success while simultaneously maintaining that you have the superior idea works equally well for the belief that Jesus Christ was literally the Son of God and for the belief that chamfered edges are really cool looking. This balancing act of economic success and philosophical superiority is essential for staving off the inevitable “so over” phase of any trend. There’s nothing surprising about how Apple events are scripted, and why should there be? It is, after all, the ultimate accessory to capitalism.

David is on Twitter and Tumblr. He also just redid his website, davidabanks.org.

We have a two-month break from self-inflicted government crisis, so let’s use it to take a breather, assess the situation, and cast some shade on rich people. Not because it is cathartic (it is), or because it will prevent the next crisis (it won’t); rather, I think studying the contours of the government-shaped hole of the last three weeks can teach us something about how Silicon Valley views public ownership. This is important because we typically use metaphors[1] like “the commons” or “the public” to describe their products. These words imply a sense of trust, if not mutually assured disruption: sure, a rich guy might own Twitter on paper, but it becomes worthless if everyone stops treating it as a (if not the) center of daily life. What do the people who own these service/spaces think about the de facto collective ownership of their product?

First off, I don’t think anyone is surprised that professional rich person Jason Calacanis said this:

Nor do I think anyone is surprised that people like Valleywag contributor Sam Biddle, who still has a grip on reality, replied:

Which, of course, prompted some backpedaling:

And the exchange concluded with Sam being called an anarchist, a label he firmly denied:

As someone who would, on the record, call himself an anarchist, I’m endlessly fascinated by this conversation because 1) I am legitimately impressed by how fast Jason was able to parlay his TED Talk snarking into geopolitical concern trolling, 2) I think government and governance are being conflated in interesting ways, and 3) this is a picture-perfect view of Silicon Valley’s relationship to government and it is straight-up Reagan-era BS. I’ve said all I feel like saying about 1 and I’ll get back to 2 later, so let’s talk about 3. Silicon Valley thinks government is the problem, and the solution can be found in their app marketplace. This is what Evgeny Morozov called “solutionism” in his latest book To Save Everything, Click Here. The general premise (that not all social or political problems can or should be solved by a technical fix) is spot-on, but I think there is more going on here than superficial fixes to problems best left untouched.

The stuttering and stopping of the American federal government is deliberate and, ultimately, has unpredictable long-term consequences. The identification of government not just as a perpetrator of daily problems but as the problem of American society (while somehow also going unnoticed when it “shuts down”) is neither coincidental with the rise of a radicalized right, nor is it causally detached from the increasing power and wealth of CEOs.

As an anarchist, I find it fascinating to see what is labeled “the government” and what is assumed away as some kind of naturally occurring phenomenon upon which a libertarian utopia can be built. In other words, there is no Silicon Valley solution to building roads, but there are plenty of apps for reserving rides that ride atop those roads. The hyperloop is a neat idea, but it will not replace the roads that make up city blocks.

More insidiously, as Silicon Valley’s penchant for corporate subdivisions and campuses can attest, these companies are part and parcel of the divestment in publicly held goods and services. They can dodge taxes and demand legislative loopholes with the best of the Wall Street financial firms. Moreover, there’s an immense and endlessly captivating amount of self-deception in an industry that claims to be the place where “value is created,” yet contains high-profile companies that have never, ever turned a profit. Ever.

The idea of success and value first, and profitability second, is absolutely genius and builds massive companies that can affect millions of people’s lives without ever coming up with a fully baked profit-making strategy. It’s sort of like what the government used to do. The Post Office, municipal water and electricity, even NASA are examples of government using its economically extraordinary position to establish a service at a loss and eventually find the right mix of economies of scale and end-user products that could turn a hefty profit. The city I live in still makes money off of its municipal water supply; the Post Office would be grossing over a billion dollars in profit if it weren’t for a 2006 congressional mandate meant to hobble the largely union-organized employer; and NASA isn’t much more than a venture capital clearing house for aeronautics.[2]

Silicon Valley venture capital companies have hit upon the sort of economic model that literally builds nations, and they know it. As Sam Biddle aptly put it,

The public is a competitor—so when it freezes up, thank God for that. It’s the same instinct that makes Sarah Lacy squeal blog enthusiasm when transit workers are on strike: the problems of humanity are signs that the Silicon Cult has it right. Thinking of anyone but yourself and your network is so Web 1.0. Their new way of life—to rely on a couple years at Stanford and friends in high places for a quick payoff and a stab at novelty—is just superior. Any communal impasse, any shared suffering, is just a chance to stand out with their money while they still have it.

As I’ve said before, I don’t think open source is the obvious or apparent way out of this mess. Open source alternatives to proprietary social media networks are just that: alternatives. They are not the default, or the obvious best choice in the same way an anarchist bookshop will never supplant the mall. To ask the former to replace the latter is to misunderstand the appeal of either one. We either need a society that no longer desires private and corporate common space, or open source communities need to build something completely new that’s better than the corporate standard. It is only in this theater of proof that these assholes will have to make a run for their made-up money.

David is definitely still on Twitter and Tumblr.

[1] It is crucial that we remember that spatial metaphors are just that: metaphors. More importantly, they are deeply imperfect metaphors that over-determine the technology’s ability to overcome or supersede social phenomena. PJ Rey has the definitive take on the Myth of Cyberspace.

[2] For example, the Space Shuttle program was, from the mid-90s on, administered completely by Lockheed Martin and Boeing, through a privately owned LLC called United Space Alliance.

I have watched my fair share of Upworthy videos. They’re generally fun to hate-watch, and they make for good News Feed fodder. Sharing Upworthy videos with your “Family” or “High School Friends” Facebook list can make you feel like a prime-time MSNBC anchor. Each video is an opportunity to reveal something to your assumedly uninformed, selfish friends. The leading, absolutely begging-to-be-parodied titles range from the confusing (You Should Watch This Strange Man Rub A Stick Of Butter On A Tree. For A Really Good Reason) to the cloyingly heinous (Obama Takes A Second To Talk About Jews In America. It’s MEGA Inspiring). These could be dismissed as kludgy rhetorical tools for Facebook arguments, but there’s something else about these videos that is actively destructive to the American left. Upworthy packages soundbites of elite white paternalism for mass distribution and consumption through social media.

I started off by implicating myself in distributing Upworthy videos because I want to highlight that I am not immune to the allure of “Fox News Anchor Forgets He’s On Fox News, Speaks Like A Real Human Being With Feelings And Stuff.” So many of the videos are fueled by sheer schadenfreude; laughing at Fox News’ anchor androids has become a veritable hobby of mine, and Upworthy certainly makes it incredibly easy to do. Different kinds of videos appeal to different people, for a variety of reasons. I get that. But the recent Malala Yousafzai interview on The Daily Show is just too fine a distillation of what is wrong with Upworthy to ignore.

Malala is a brilliant person, and up until the point Stewart asks to adopt her (not his finest moment), the interview is a powerful firsthand account of what the Taliban have done and continue to do in the Swat Valley. At the same time, however, it is a perfect example of how mainstream progressivism is shot through with an obnoxious and paternalistic brand of white supremacy and cultural ignorance. Zeynep Tufekci said it quite succinctly: “If you think Malala is rare, that is probably because you have not spent much time in such countries. Most Malala’s, however, go nameless, and are not made into Western celebrities.” Assed Baig goes further by describing how this widespread ignorance actually justifies Western aggression:

This is a story of a native girl being saved by the white man. Flown to the UK, the Western world can feel good about itself as they save the native woman from the savage men of her home nation. It is a historic racist narrative that has been institutionalised. Journalists and politicians were falling over themselves to report and comment on the case. The story of an innocent brown child that was shot by savages for demanding an education and along comes the knight in shining armour to save her.

The actions of the West, the bombings, the occupations the wars all seem justified now, “see, we told you, this is why we intervene to save the natives.”

There’s an important connection between these two quotes that I want to explore by way of a seemingly unrelated set of literature. Science studies scholars use the term “organized ignorance” to describe the systematic and institutionalized maintenance, on the part of powerful actors, of gaps or misunderstandings in scientific facts and procedures. Organized ignorance can suppress activism, establish professional boundaries, and steer research trajectories. Climate change is a pretty straightforward example of organized ignorance. A more complicated one is the consequences of widespread PCB-based plasticizers. Both are severely under-studied in relation to their potential harm.

The Daily Show has its own set of problems, as do the other corporate media outlets that provide grist for Upworthy’s soundbite mill, but the fact that Upworthy serves up content specifically for sharing on social media makes organized ignorance more resilient and more widespread. Upworthy enrolls us in the establishment of our own organized ignorance. When we (and again, I am implicating myself in this) share these videos, we are reenacting the hollow and superficial conversations that, in another context, would cause us to roll our eyes and complain about the lack of critical thought in news media. We share them for all sorts of reasons, but the impact is the same: ignorance goes viral.

By organizing ignorance around race relations and international conflict, Upworthy transmutes Tufekci’s observation into the justification for violence described by Baig. Through the simplification of extremely complicated geopolitical conflicts, Upworthy makes every story into Kony 2012. As Sarah Wanenchak wrote last year,

The video presents the ignorance of the developed West as its primary sin and the primary obstacle in the way of Joseph Kony being brought to justice; it therefore implicitly offers the simple fact of “awareness” as a form of blanket solution to this problem, with the supposition that action will necessarily follow. Viewing and sharing the video therefore offer an emotionally powerful but objectively questionable experience: the sense of having taken active part in something both significantly communal and directly world-altering.

The most pernicious aspect of Upworthy is that it makes you feel as though you are changing the world through consciousness raising when what you are actually doing is replicating and enacting an organized ignorance undergirded by the banal evil of white supremacy. Instead of focusing on structural change, these videos zoom in on small heroic moments that are emotionally powerful but do very little to engender a more informed debate about our shared problems or to help us make smarter demands of those in power.

I don’t call myself a liberal, but I can’t say I’m outside of their political and cultural influence either. By refraining from sharing the kinds of videos Upworthy curates and disseminates, I can do my part in containing this kind of thinking. But I don’t want the reaction to this essay to be an organized boycott of the site. That would be treating the symptom when the greater illness is a dearth of useful tools for generative debate and discussion. We need to aspire to a better media landscape and an outspoken leftist movement that makes Upworthy obsolete.

David is on Twitter and Tumblr.

A rendering of Facebook’s new Anton Menlo subdevelopment.

Silicon is a cyborg element. You can find it everywhere, but almost always bonded to something else. Silicon is the second-most abundant element in the Earth’s crust, and yet you have probably never seen it in its pure form. (For the record, it looks kind of like a leftover baked potato wrapped hastily in tin foil.) Entire geographic formations are named after the element, but (and I think this might be a first for naming conventions) those places have largely nothing to do with the extraction or even refinement of that element. Silicon is a prerequisite, a synecdoche for a larger industry that demands we refine and purify this promiscuous metalloid into a predictable and highly controlled component. True to its namesake, Silicon Valley (not to mention Austin’s lesser-known “Silicon Hills”) is an exercise in refinement. Intricate and eclectic streets are tossed aside in favor of gleaming, modern campuses with strict access control. It is a place where functions are separated so that they may reach the sorts of optimal efficiencies that Le Corbusier promised and Robert Moses tried to deliver. But unlike Moses or Le Corbusier, the planners and corporate patrons of Silicon Valley are making places meant to be freely chosen.

Despite the language of “synergy” and “disruption,” most Silicon Valley corporate headquarters are akin to the Jeffersonian university campus: secluded, quiet, and contemplative. A place where the outside world is intentionally buffered away so that those inside the campus can make big gambles in science and politics. In 1820 Jefferson described his soon-to-be university as an institution that “will be based on the illimitable freedom of the human mind. For here we are not afraid to follow truth wherever it may lead, nor to tolerate any error so long as reason is left free to combat it.” In order to make the sorts of breakthroughs that would (for example) “put a dent in the universe,” you needed to be locked away with your fellow dudes studying the liberal arts and sciences.

Whereas the University of Virginia was meant to breed people who would go off to make useful things, the campuses of Silicon Valley are meant to write the patents and design the products themselves. They are not only places of contemplation and examination; they are also centers of (post)industry. Silicon Valley might have started life as a series of campuses, but it may very well become something very different in the next couple of years. Google has been operating a private bus system since 2007, and the Wall Street Journal is now reporting that Facebook plans to build a 397-unit housing complex called Anton Menlo for its workers right next to its Menlo Park campus.

It should be noted that the etymology of campus has more to do with the space between buildings than with the buildings themselves. The Latin root refers to fields, pastures, and common spaces. To build a campus is to be concerned with the ordered and planned interactions between components of a finite set. A campus works (in theory) because every part is custom-fitted to the others. A company town is different. A company town is an exercise in monopoly economics. An offer you cannot refuse. You live there as a stipulation of your employment contract, and your employer uses the contract as an opportunity to shape you into the perfect worker. The 21st-century Anton Menlo, however, is not a 19th-century company town like Pullman, Illinois. It differs in a few very important ways. First, I want to describe a few key components of the industrial company town before discussing what I think is important and new about Anton Menlo.

When new factories opened and word spread of a reliable employment opportunity, workers would set up informal settlements on the outskirts of the city. Much like the favelas of Rio de Janeiro today, these towns were self-organized hotbeds of political populism but lacked basic resources and safety. They caught fire and were constantly plagued with sanitary problems and concomitant diseases. Governments refused to extend services to these settlements, despite the cities’ lack of affordable housing.

Pullman, Illinois

When George Pullman built his new factory on the outskirts of Chicago, he decided to also build a planned town. It had shops, schools, churches, and libraries as well as housing for 6,000 employees and their families. Homes were outfitted with the most modern amenities: steam radiators, modern sewage and water hookups, natural gas fixtures for lighting and cooking, and even a closed-loop waste recycling system that would fertilize crops to be sold back to workers at the company store. Pullman also aspired to make his skilled workers just as modern as the buildings they lived in. Almont Lindsey, in a 1939 issue of The American Historical Review, wrote:

“Desirous of avoiding labor difficulties, [George] Pullman believed that paternalism wisely administered would lull the restless yearnings of the laborer and give to his powerful corporation a stability in labor conditions not hitherto known” (vol. 44, no. 2, p. 273).

Homes were subject to regular inspection, and counselors would instruct wives on proper home maintenance. For the first several years, only Presbyterians were allowed to use the church without paying an exorbitant rental fee. Workers were required to live in the town, and once you were trained as a train car builder you didn’t have many employment options other than Pullman. The company continued to increase living costs and decrease wages until 1894 (about 14 years after the town was built), when a strike and subsequent boycott disrupted nationwide rail traffic. The worker-inflicted property damage and the state’s violent suppression of the strike were enough to tarnish the company’s reputation and cause the Illinois Supreme Court to order the dissolution of Pullman’s total ownership of the town. Company towns haven’t been very popular since.

The company town wasn’t meant to isolate you from society the way the campus did. Instead, it was meant to indoctrinate you into a particular kind of society. The campus is meant to free the individual from norms by providing physical isolation and the necessities of life; the company town imposes norms through the selective withholding and distribution of those same resources. Facebook’s burbclave doesn’t do either of these things. No one is required to live in the subdivision; in fact, it’s only big enough to accommodate ten percent of Facebook’s current workforce, and I suspect there is a good reason for the planned scarcity.

Facebook’s workers are likely to see themselves more as upper-middle-class entrepreneurs than as toiling workers. They definitely have more material comforts than a factory worker, but it isn’t clear whether that translates into economic security. A single twenty-something working at a social media company might take home a low six-figure salary, but there will be student loan bills waiting for them. Even someone with a well-paying job must still think about how to make themselves indispensable. After a while you might find yourself asking: Are you networking or hanging out with your friends? What’s the difference anymore?

The Wall Street Journal wonders if “the downside [of Anton Menlo] could be unspoken expectations that employees always be working.” That just seems quaint. The computer industry pioneered the constantly working employee over 30 years ago. David Byrne’s 1986 movie True Stories states it quite bluntly when the mayor of a small Texas town triumphantly exclaims to his children over dinner, “Linda, Larry, there’s no concept of weekends anymore!” He is thrilled to see that the microprocessor company that just set up shop has employed so many people who cannot tell when they are working or playing. Anton Menlo not only encourages the dissolution of the work/play divide, it capitalizes on it.

Instead of the compulsory residency of a company town, or the material necessity of on-campus dormitories, the subdivision solves problems the corporation created in the first place. By dodging taxes and competing with local businesses through their own cafeterias and in-house services, Silicon Valley companies create unappealing and expensive housing markets without public transit. As Tom Foremski at the Silicon Valley Watcher observes: “Living in the shadow of the Googleplex, or Twitterplex, or Facebook’s giant campus at 1, Hacker Way, is causing job losses and hurting rather than boosting the local economy.”

If the corporate campus was Jeffersonian, and the industrial company town was Fordist, then the Silicon Valley corporate subdivision is Gramscian. You choose to live in a place owned by your employer so as to gain access to the benefits only it can provide. You enter into a contract and accept Facebook’s increased presence in your life in exchange for very useful goods and services. They’ll walk your dog for you and organize three-legged races. You will have an affordable rental within walking distance of your workplace, and all you have to give them is the promise of further investment in the Silicon Valley ideology of cyber-libertarianism. You will knowingly and willingly consent to this ideology not only because you get useful things (that are scarce because the company made them that way) but because it seems as though this ideology is the only thing that works. The company is its own theater of proof [paywalled PDF] that showcases the effectiveness of its own ideology. Your employer wouldn’t be able to afford a world-class sushi chef every Thursday if they weren’t doing something right. This is the ultimate melding of work and play: your play appears to you as the product of (not just the reward for) good work. You don’t just enjoy your job; you enjoy the world your job creates. Your work is like silicon: everywhere and yet never completely recognizable.

David is on Twitter and Tumblr.