Facebook

Would if this were true?

The Facebook newsfeed is the subject of a lot of criticism, and rightly so. Not only does it impose an echo chamber on your digitally mediated existence, the company constantly tries to convince users that it is user behavior, not its secret algorithm, that creates our personalized spin zones. But then there are moments when, for one reason or another, something super racist or misogynistic appears in your newsfeed and you have to decide whether to respond. If you do, and maybe get into a little back-and-forth, Facebook does a weird thing: that person starts showing up in your newsfeed a lot more.

This happened to me recently, and it has me thinking about the role of the Facebook newsfeed in interpersonal instantiations of systemic oppression. Facebook's newsfeed, specially formulated to increase engagement by presenting users with content they have engaged with in the past, at once encourages white allyship against oppression and inflicts a kind of violence on women and people of color. The same algorithmic action can produce both consequences, depending on the user. more...
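The dynamic described above, in which a single heated exchange makes a person more prominent in your feed, can be sketched as a toy ranking function. This is an illustrative sketch only: the function name, scoring scheme, and boost weight are all hypothetical, and Facebook's actual ranking algorithm is proprietary and far more complex.

```python
# Toy sketch of engagement-weighted feed ranking (hypothetical, NOT
# Facebook's actual algorithm): posts from authors the user has
# previously engaged with are boosted above baseline.
from collections import Counter

def rank_feed(posts, engagement_history, boost=2.0):
    """Order posts so authors you've interacted with rise to the top.

    posts: list of (author, text) tuples.
    engagement_history: list of author names the user has commented
        on or reacted to in the past (repeats count extra).
    """
    counts = Counter(engagement_history)
    # Score = 1.0 baseline + boost per past interaction with the author.
    scored = [(1.0 + boost * counts[author], author, text)
              for author, text in posts]
    return [(author, text)
            for score, author, text in sorted(scored, reverse=True)]

posts = [("alice", "photo"), ("troll", "rant"), ("bob", "news")]
history = ["troll", "troll", "alice"]  # one heated back-and-forth
print(rank_feed(posts, history))
```

Note the perverse incentive the sketch makes visible: arguing with someone counts as engagement just as much as agreeing with them, so the function cannot distinguish allyship from harassment.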

SCOTUS

What counts as a threat via social media, and what are the legal implications? The Supreme Court just made a ruling on this subject, deciding 8-1 that content alone is not sufficient evidence. Most accounts of this ruling frame it as “raising the bar” for a legal conviction of threatening behavior via social media. I argue instead that the bar has not been raised, but made less fixed, and rightly so.

At issue was an earlier conviction and jail sentence for Anthony Elonis, whose Facebook posts threatened harm against his ex-wife, an FBI agent, and school children. more...


The most crucial thing people forget about social media, and about technologies generally, is that certain people with certain politics, insecurities, and financial interests structure them. On an abstract level, yes, we may all know that these sites are shaped, designed, and controlled by specific humans. But so much of the rhetoric around code, "big" data, and data science research continues to promote the fallacy that the way sites operate is almost natural, that they are simply giving users what they want, which downplays those companies' own interests, role, and responsibility in structuring what happens. The greatest success of "big" data so far has been allowing those who hold that data to sell their interests as neutral.

Today, Facebook researchers released a report in Science on the flow of ideological news content on their site. "Exposure to ideologically diverse news and opinion on Facebook" by Eytan Bakshy, Solomon Messing, and Lada Adamic (all Facebook researchers) enters the debate over whether social media in general, and Facebook in particular, locks users into a so-called "filter bubble": seeing only what one wants and is predisposed to agree with, and limiting exposure to outside and conflicting voices, information, and opinions. And just as Facebook's director of news recently downplayed the company's journalistic role in shaping our news ecosystem, Facebook's researchers use this paper to minimize their role in structuring what a user sees and posts. I've only just read the study, but I already had some thoughts about this bigger ideological push since the journalism event, as it relates to my larger project describing contemporary data science as a sort of neo-positivism. I'd like to connect some of those thoughts here.

more...

Affordances

There's a tricky balancing act to perform when thinking about the relative influence of technological artifacts and the humans who create and use them. It's all too easy to blame technologies or, alternatively, to discount their shaping effects.

Both Marshall McLuhan and Actor-Network Theory (ANT) scholars insist on the efficacy of technological objects. These objects do things, and as researchers we should take those things seriously. In response to the popular adage that "guns don't kill people, people kill people," ANT scholar Bruno Latour famously retorts:

It is neither people nor guns that kill. Responsibility for action must be shared among the various actants.

From this perspective, failing to take seriously the active role of technological artifacts, assuming instead that everything hinges on human practice, is to risk entrapment by those artifacts that push us in ways we cannot understand or recognize. Speaking of media technologies, McLuhan warns:

Subliminal and docile acceptance of media impact has made them prisons without walls for their human users.   

This, they get right. Technology is not merely a tool of human agency; it pushes, guides, and sometimes traps users in significant ways. And yet both McLuhan and ANT have been justly criticized as deterministic. Technologies may shape those who use them, but humans created these artifacts, and humans can, and do, work around them. more...

A recent New York Times opinion piece by Hannah Seligson declared "the unhappy marriage" to be "Facebook's last taboo." As a scholar of Facebook, I found the singling out of marriage rather odd. For years now, critics have been decrying the general lack of unhappy anything on Facebook, arguing that the level of self-monitoring typical on the site strips it of authenticity and relational value. While it's true that most people try to limit the amount of negativity they display on Facebook (as in any semi-public social space), and the interface itself privileges good news, Facebook users nevertheless leverage the medium specifically for the delivery of "bad" (or uncertain) news.

That Facebook is a semi-public space with most, if not all, social norms for public spaces carrying over from face-to-face interaction is now a commonly accepted definition of the platform.  In fact, Seligson touches on this a number of times, comparing Facebook, for instance, to cocktail parties for which the married couple hosting must put aside private squabbles and present a united front.  That the space is now theoretically visible to hundreds or even thousands of Facebook “friends” certainly reflects a change of scale. more...

ello

In 2014, Ello was in with the new; by 2015, it was out with the old. It's New Year's Eve, and I want to look back on a thing that came and went this year, which leaves me feeling bummed. You can only be really disappointed if you start with high hopes, and lots of people, for lots of reasons, wanted Ello to work. It quickly became clear that the site didn't have a strong vision. Neither its politics nor its understanding of the social life it set out to mediate was inspired or clever enough to be compelling.

more...

happy

The end of a year is an introspective time. We reflect on the past 365 days and lay plans for the year to come. This is a time of remembering, analyzing, hoping, and figuring. Helping us through this introspective process is Facebook's Year in Review. This app compiles the "highlights" of each user's year through images, events, and status updates. It then displays this compilation for the user and gives the option to share the review with Friends. The default caption reads: "It's been a great year! Thanks for being part of it."

The app quickly garnered negative attention when web designer Eric Meyer blogged about his heart-wrenching experience of facing pictures of his six-year-old daughter, who had passed away not long before. There was no trigger warning. There was no opt-in. There was simply an upbeat video featuring his daughter's face when he logged into his Facebook account. He aptly attributed this experience to "inadvertent algorithmic cruelty."

Although the cruelty was indeed inadvertent, it was nonetheless inevitable. It reflects a larger issue with the Facebook platform: its insistent structure of compulsory happiness. This insistence is reflected in a "Like" button without any other one-click emotive options; in goofy emoticons through which sadness and illness are expressed with cartoonish faces in cheerful colors; in relationship status changes that announce themselves to one's network. And as users, we largely comply. We share the happy moments, the funny quips, the accomplishments and #humblebrags, while hiding, ignoring, or unFriending those with the audacity to mope, to clog our newsfeeds with negativity. But we do not comply uniformly, nor condone or censure unanimously. Sometimes we perform sadness, and sometimes we support each other in this. more...

Facebook remembers

Facebook announced this week that it will add a new search feature to the platform. This feature will, for the first time, allow users to type in keywords and bring up specific network content. Previously, keyword searches led to pages and advertisements. Now, they will bring up images and text from users' News Feeds. Although search results currently include only content shared with users by their Friends, I imagine including public posts in the results will be a forthcoming next step.

Facebook, as a documentation-heavy platform, has always affected both how we remember, and how we perform. It is the keeper of our photo albums, events attended, locations visited, and connections established, maintained, and broken. It recasts our history into linear stories, solidifying that which we share into the truest version of ourselves. And of course, the new search feature amplifies this, stripping users of the privacy-by-obscurity that tempered (though certainly did not eliminate) the effects of recorded and documented lives.

The search feature also does something interesting and new. It aggregates. For the first time, users can take the temperature of their networks on any variety of topics. Music, movies, news events and recipes can be called up, unburied from the content rubble and grouped in a systematic way.
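The "taking the temperature" idea can be made concrete with a small sketch: count, per friend, how many of their posts mention a keyword. This is purely illustrative; the data model and function are hypothetical, not Facebook's actual search API.

```python
# Illustrative sketch (hypothetical, not Facebook's search API):
# aggregate a friend network's posts by keyword to gauge how much
# of the network is talking about a topic.
from collections import Counter

def network_temperature(posts, keyword):
    """Count, per friend, the posts that mention the keyword.

    posts: list of (friend, text) tuples.
    Returns a Counter mapping friend name -> matching post count.
    """
    kw = keyword.lower()
    hits = Counter()
    for friend, text in posts:
        if kw in text.lower():  # simple case-insensitive substring match
            hits[friend] += 1
    return hits

posts = [("ann", "Ferguson protest tonight"),
         ("ben", "new recipe!"),
         ("ann", "thoughts on the Ferguson coverage"),
         ("cam", "Ferguson is on my mind")]
print(network_temperature(posts, "ferguson"))
```

Even this toy version shows what aggregation changes: a topic scattered across individual posts becomes a queryable, rankable record of who in your network spoke up, and who stayed silent.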

Perhaps because I’ve been able to think of little else lately, I immediately considered what this new feature means for how we will remember the events of Ferguson, Staten Island, and the parade of police violence against young men of color. And relatedly, I considered how we will remember ourselves and each other in regard to these events. more...

Prosumption is something of a buzzword here at Cyborgology. It refers to the blurring of production and consumption, such that consumers are entwined in the production process. Identity prosumption is a spin-off of this concept and refers to the ways prosumptive activities act back upon the prosuming self. Identity prosumption is a neat and simple analytic tool, particularly useful in explaining the relationship between social media users and the content they create and share.

If you’ll stick with me through some geekery, I would like to think through some of the nuances of this humble bit of theory.   more...

A Budnitz Bike in its natural habitat. Source.

Paul Budnitz describes himself as a "serial entrepreneur," having created companies that make artisanal toys and luxury bicycles. He's also the creator/founder/president/charismatic leader of Ello. And when a social network launches with a manifesto proudly proclaiming "You are not a product," there's more on the line than embedded video support. Despite the radical overtures of the initial launch, we shouldn't expect any more from Ello than we would from a luxury bicycle. more...