At this point everyone is undoubtedly aware of the school shooting at Umpqua Community College in Oregon, though I am certain the method by which we came across the news varied. Some of us were notified of the event by friends with fingers closer to the pulse, and still more of us first encountered it through a variety of content aggregation platforms, such as Reddit.

I would hazard that the majority of people learned about it first and foremost through social media, primarily Facebook or Twitter. I certainly did. A friend first told me about it through Facebook Messenger, and almost immediately after she did, I started to see articles trickling into my newsfeed in the background of my Messenger window. The moment that happened, I shut my Facebook tab down, despite the fact that I compulsively, addictively, have it open almost all the time.

Facebook, taken as a whole, is a fantastic way for people to compare one another’s lives and share pictures of kittens and children, but when it comes to a tragedy, the platform is woefully inadequate at allowing its users to parse and process the gravity of events as they unfold. A tragedy is a thorough shock to a system that thrives on irreverent links and topical memes, and when faced with an item that requires genuine reflection and thought, it is often simpler, indeed even more beneficial, for users to turn their heads until the kittens may resume. more...

Disclaimer: Nothing I say in this post is new or theoretically novel. The story to which I’ll refer already peaked over the weekend, and what I have to say about it (that trolling is sometimes productive) is a point well made by many others (like on this blog last month by Nathan Jurgenson). But seriously, can we all please just take a moment and bask in appreciation of trolling at its best?

For those who missed it, Target recently announced that they would do away with gender designations for kids’ toys and bedding. The retailer’s move toward gender neutrality, unsurprisingly, drew ire from bigoted jerks who apparently fear that mixing dolls with trucks will hasten the unraveling of American society (if David Banks can give himself one more “calls it as I sees it” moment, I can too).

Sensing “comedy gold,” Mike Melgaard went to Target’s Facebook page. He quickly created a fake Facebook account under the name “Ask ForHelp,” with a red bullseye as the profile picture. Using this account to pose as the voice of Target’s customer service, he proceeded to respond with sarcastic mockery to customer complaints. And hit gold, Mike did! For 16 hilarious hours, transphobic commenters provided a rich well of comedic fodder. Ultimately, Facebook stopped the fun by removing Melgaard’s Ask ForHelp account. Although Target never officially endorsed Melgaard, the company made its support clear in a Facebook post on Thursday evening. more...

Would that this were true?

The Facebook newsfeed is the subject of a lot of criticism, and rightly so. Not only does it impose an echo chamber on your digitally-mediated existence, but the company constantly tries to convince users that it is user behavior, not its secret algorithm, that creates our personalized spin zones. But then there are moments when, for one reason or another, someone turns up in your newsfeed saying something super racist or misogynistic, and you have to decide whether or not to respond. If you do, and maybe get into a little back-and-forth, Facebook does a weird thing: that person starts showing up in your newsfeed a lot more.

This happened to me recently, and it has me thinking about the role of the Facebook newsfeed in interpersonal instantiations of systematic oppression. Facebook’s newsfeed, specially formulated to increase engagement by presenting the user with content they have engaged with in the past, can at once encourage white allyship against oppression and inflict a kind of violence on women and people of color. The same algorithmic action can produce both consequences depending on the user. more...
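To make that mechanism concrete, here is a toy sketch of engagement-weighted ranking. It is purely illustrative; Facebook’s actual ranking system is proprietary and vastly more complex, and every name below is hypothetical. The point it demonstrates is simply that one heated back-and-forth can inflate someone’s future visibility:

```python
# Toy sketch of engagement-weighted feed ranking (illustrative only;
# not Facebook's actual algorithm, which is proprietary).
from collections import defaultdict

class ToyFeed:
    def __init__(self):
        # Count of past interactions (likes, comments) per author.
        self.engagement = defaultdict(int)

    def record_interaction(self, author):
        """Each reply or like raises that author's future ranking weight."""
        self.engagement[author] += 1

    def rank(self, posts):
        """Order candidate posts by past engagement with their authors."""
        return sorted(posts,
                      key=lambda p: self.engagement[p["author"]],
                      reverse=True)

feed = ToyFeed()
feed.record_interaction("racist_acquaintance")  # one heated back-and-forth...

posts = [
    {"author": "racist_acquaintance", "text": "..."},
    {"author": "old_friend", "text": "..."},
]
print([p["author"] for p in feed.rank(posts)])
# ['racist_acquaintance', 'old_friend'] -- the argument you engaged with
# now outranks the friend you merely scrolled past.
```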


What counts as a threat via social media, and what are the legal implications? The Supreme Court just made a ruling on this subject, deciding 8-1 that content alone is not sufficient evidence. Most accounts of this ruling frame it as “raising the bar” for a legal conviction of threatening behavior via social media. I argue instead that the bar has not been raised, but made less fixed, and rightly so.

At issue was an earlier conviction and jail sentence for Anthony Elonis, whose Facebook posts depicted harm to his ex-wife, an FBI agent, and school children. more...


The most crucial thing people forget about social media, and about all technologies, is that certain people with certain politics, insecurities, and financial interests structure them. On an abstract level, yeah, we may all know that these sites are shaped, designed, and controlled by specific humans. But so much of the rhetoric around code, “big” data, and data science research continues to promote the fallacy that the way sites operate is almost natural, that they are simply giving users what they want, which downplays those actors’ interests in, role in, and responsibility for structuring what happens. The greatest success of “big” data so far has been allowing those who hold the data to sell their interests as neutral.

Today, Facebook researchers released a report in Science on the flow of ideological news content on their site. “Exposure to ideologically diverse news and opinion on Facebook,” by Eytan Bakshy, Solomon Messing, and Lada Adamic (all Facebook researchers), enters into the debate around whether social media in general, and Facebook in particular, locks users into a so-called “filter bubble”: seeing only what one wants and is predisposed to agree with, and limiting exposure to outside and conflicting voices, information, and opinions. And just as Facebook’s director of news recently ignored the company’s journalistic role in shaping our news ecosystem, Facebook’s researchers use this paper to minimize the company’s role in structuring what a user sees and posts. I’ve only just read the study, but I already have some thoughts about this bigger ideological push, dating back to the journalism event, as it relates to my larger project describing contemporary data science as a sort of neo-positivism. I’d like to put some of my thoughts connecting it all here.



There’s a tricky balancing act to perform when thinking about the relative influence of technological artifacts and the humans who create and use them. It’s all too easy to blame technologies or, alternatively, to discount their shaping effects.

Both Marshall McLuhan and actor-network theory (ANT) insist on the efficacy of technological objects. These objects do things, and as researchers, we should take those things seriously. In response to the popular adage that “guns don’t kill people, people kill people,” ANT scholar Bruno Latour famously retorts:

It is neither people nor guns that kill. Responsibility for action must be shared among the various actants.

From this perspective, failing to take seriously the active role of technological artifacts, assuming instead that everything hinges on human practice, is to risk entrapment by those artifacts that push us in ways we cannot understand or recognize. Speaking of media technologies, McLuhan warns:

Subliminal and docile acceptance of media impact has made them prisons without walls for their human users.   

This, they get right. Technology is not merely a tool of human agency, but pushes, guides, and sometimes traps users in significant ways. And yet both McLuhan and ANT have been justly criticized as deterministic. Technologies may shape those who use them, but humans created these artifacts, and humans can, and do, work around them. more...

A recent New York Times opinion piece by Hannah Seligson declared “the unhappy marriage” to be “Facebook’s last taboo.” As a scholar of Facebook, I found the singling out of marriage rather odd. For years now, critics have decried the general lack of unhappy anything on Facebook, arguing that the level of self-monitoring typical on the site strips it of authenticity and relational value. Yet while it’s true that most people try to limit the amount of negativity they display on Facebook (as in any semi-public social space), and the interface itself privileges good news, Facebook users are nonetheless leveraging the medium specifically for the delivery of “bad” (or uncertain) news.

That Facebook is a semi-public space with most, if not all, social norms for public spaces carrying over from face-to-face interaction is now a commonly accepted definition of the platform.  In fact, Seligson touches on this a number of times, comparing Facebook, for instance, to cocktail parties for which the married couple hosting must put aside private squabbles and present a united front.  That the space is now theoretically visible to hundreds or even thousands of Facebook “friends” certainly reflects a change of scale. more...


In 2014, Ello was in with the new; by 2015, it was out with the old. It’s New Year’s Eve, and I want to look back on a thing that came and went this year, which leaves me feeling bummed. You can only be really disappointed if you start with high hopes, and lots of people, for lots of reasons, wanted Ello to work. It quickly became clear that the site didn’t have a strong vision. Neither its politics nor its understanding of the social life it set out to mediate was inspired or clever enough to be compelling.



The end of a year is an introspective time. We reflect on the past 365 days and lay plans for the year to come. This is a time of remembering, analyzing, hoping, and figuring. Helping us through this introspective process is Facebook’s Year in Review. This app compiles the “highlights” of each user’s year through images, events, and status updates. It then displays this compilation for the user and gives the option to share the review with Friends. The default caption reads: “It’s been a great year! Thanks for being part of it.”

The app quickly garnered negative attention when web designer Eric Meyer blogged about his heart-wrenching experience of facing pictures of his six-year-old daughter, who had passed away not long before. There was no trigger warning. There was no opt-in. There was simply an upbeat video featuring his daughter’s face when he logged into his Facebook account. He aptly attributed this experience to “inadvertent algorithmic cruelty.”

Although the cruelty was indeed inadvertent, it was nonetheless inevitable. It reflects a larger issue with the Facebook platform: its insistent structure of compulsory happiness. This insistence is reflected in a “Like” button without any other one-click emotive options; in goofy emoticons through which sadness and illness are expressed with cartoon-like faces in cheerful colors; in relationship status changes that announce themselves to one’s network. And as users, we largely comply. We share the happy moments, the funny quips, the accomplishments and #humblebrags, while hiding, ignoring, or unFriending those with the audacity to mope, to clog our newsfeeds with negativity. But we do not comply universally, nor condone or censure unanimously. Sometimes we perform sadness, and sometimes we support each other in this. more...

Facebook remembers

Facebook announced this week that it will add a new search feature to the platform. This feature will, for the first time, allow users to type in keywords and bring up specific network content. Previously, keyword searches led to pages and advertisements. Now, they will bring up images and text from users’ News Feeds. Although search results currently include only content shared with users by their Friends, I imagine that including public posts in the results will be a forthcoming next step.

Facebook, as a documentation-heavy platform, has always affected both how we remember, and how we perform. It is the keeper of our photo albums, events attended, locations visited, and connections established, maintained, and broken. It recasts our history into linear stories, solidifying that which we share into the truest version of ourselves. And of course, the new search feature amplifies this, stripping users of the privacy-by-obscurity that tempered (though certainly did not eliminate) the effects of recorded and documented lives.

The search feature also does something interesting and new. It aggregates. For the first time, users can take the temperature of their networks on any number of topics. Music, movies, news events, and recipes can be called up, unburied from the content rubble and grouped in a systematic way.
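As a rough illustration of what “taking the temperature” could look like, here is a minimal sketch of keyword search plus aggregation over a feed. The data model and function names are hypothetical, assumed for the example; this is not Facebook’s actual implementation:

```python
# Minimal sketch: keyword search over a feed, then aggregation by author
# (hypothetical data model; not Facebook's actual implementation).
from collections import Counter

posts = [
    {"author": "alice", "text": "New recipe: lentil soup"},
    {"author": "bob",   "text": "That lentil soup recipe is great"},
    {"author": "carol", "text": "Movie night was fun"},
]

def search(posts, keyword):
    """Return posts whose text contains the keyword (case-insensitive)."""
    return [p for p in posts if keyword.lower() in p["text"].lower()]

def take_temperature(posts, keyword):
    """Group matching posts by author: a crude read of who in your
    network is talking about a topic, and how much."""
    return Counter(p["author"] for p in search(posts, keyword))

print(take_temperature(posts, "recipe"))
# Counter({'alice': 1, 'bob': 1}) -- the topic, unburied and grouped.
```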

Perhaps because I’ve been able to think of little else lately, I immediately considered what this new feature means for how we will remember the events of Ferguson, Staten Island, and the parade of police violence against young men of color. And relatedly, I considered how we will remember ourselves and each other in regard to these events. more...