
The most crucial thing people forget about social media, and about all technologies, is that they are structured by particular people with particular politics, insecurities, and financial interests. On an abstract level, yes, we may all know that these sites are shaped, designed, and controlled by specific humans. But so much of the rhetoric around code, “big” data, and data science research continues to promote the fallacy that the way sites operate is almost natural, that they are simply giving users what they want, which downplays the sites’ own interests, role, and responsibility in structuring what happens. The greatest success of “big” data so far has been for those who hold that data to sell their interests as neutral.

Today, Facebook researchers released a report in Science on the flow of ideological news content on their site. “Exposure to ideologically diverse news and opinion on Facebook,” by Eytan Bakshy, Solomon Messing, and Lada Adamic (all Facebook researchers), enters the debate over whether social media in general, and Facebook in particular, locks users into a so-called “filter bubble”: seeing only what one wants and is predisposed to agree with, while limiting exposure to outside and conflicting voices, information, and opinions. And just as Facebook’s director of news recently ignored the company’s journalistic role in shaping our news ecosystem, Facebook’s researchers use this paper to minimize their role in structuring what a user sees and posts. I’ve only just read the study, but I already had some thoughts about this larger ideological push since the journalism event, as it relates to my bigger project describing contemporary data science as a sort of neo-positivism. I’d like to put some of my thoughts connecting it all here.


The Study Itself (method wonkery, skip to the next section if that sounds boring)

Much of the paper is written as if it is about adult U.S. Facebook users in general, but that is not the case. Those included in the study are only those who self-identify their politics on the site. This is a rare behavior, something only 9% of users do. That 9% figure is not in the report itself but in a separate supporting-materials appendix, yet it is crucial for interpreting the results. The population number given in the report is 10.1 million people, which, yea, omg, is a very big number, but don’t fall for the Big-N trick: we don’t know how this 9% differs from Facebook users in general. We cannot treat this as a sample of “Facebook users” or even “Facebook liberals and conservatives”, as the authors do in various parts of the report, but instead as a study of the rare people who explicitly state their political orientation on their Facebook profile.* Descriptive statistics comparing the few who explicitly self-identify, and therefore enter the study, versus those who do not are not provided. Who are they, how are they different from the rest of us, why are they important to study: all obvious questions the report doesn’t address. We might infer that people who self-identify are more politically engaged, but anecdotally, nearly all of my super politically engaged Facebook friends don’t explicitly list their political orientation on the site. Facebook’s report talks about “Facebook users,” which isn’t accurate. All the findings should be understood to be about Facebook users who also put their political orientation on their profiles, who may or may not be like the rest of Facebook users in lots of interesting and research-confounding ways. The researchers had an obligation to make this limitation much clearer, even if it tempered their grand conclusions.

So, AMONG THOSE RARE USERS WHO EXPLICITLY SELF-IDENTIFY THEIR POLITICAL ORIENTATION ON THEIR FACEBOOK PROFILES, the study looks at the flow of news stories that are more liberal versus more conservative as they are shared on Facebook: how those stories are seen and clicked on as they are shared by liberals to other liberals, by conservatives to other conservatives, and, most importantly for this study, the information that is politically cross cutting, that is, shared by someone on the right and then seen by someone on the left and vice versa. The measure of conservative or liberal news stories is a simple and, in my opinion, effective one: the degree to which a web domain is shared by people on the right is the degree to which content on that domain is treated as conservative (and the same goes for politically liberal content). The researchers also differentiated between soft (entertainment) and hard (news) content, including only the latter in this study. The important work is seeing whether Facebook, as a platform, is creating a filter bubble where people only see what they’d already agree with, as opposed to more diverse and challenging “cross cutting” information.
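To make the domain-level measure a little more concrete, here is a minimal sketch of the kind of alignment score the paper describes, written in Python. The domain names and share counts are made up for illustration; this is my own rough reconstruction, not the researchers’ data or code.

```python
from collections import Counter

# Toy share counts: how often each domain's hard-news links were shared by
# self-identified conservatives vs. liberals. (Illustrative numbers only,
# not data from the study.)
shares_by_conservatives = Counter({"domain-a.com": 900, "domain-b.com": 150, "domain-c.com": 50})
shares_by_liberals = Counter({"domain-a.com": 100, "domain-b.com": 850, "domain-c.com": 950})

def alignment_score(domain):
    """Score in [-1, +1]: +1 if shared only by conservatives,
    -1 if shared only by liberals, 0 if shared equally."""
    c = shares_by_conservatives[domain]
    l = shares_by_liberals[domain]
    if c + l == 0:
        return 0.0
    return (c - l) / (c + l)

for domain in sorted(set(shares_by_conservatives) | set(shares_by_liberals)):
    print(f"{domain}: {alignment_score(domain):+.2f}")
```

The point is simply that, under this kind of measure, “conservative content” means “content from domains that conservatives disproportionately share”: a behavioral definition rather than an editorial judgment about the content itself.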

The Facebook researchers looked at how much, specifically, the newsfeed algorithm promotes the filter bubble, that is, showing users what they will already agree with over and above a non-algorithmically-sorted newsfeed. The newsfeed algorithm provided 8% less conservative content for liberals versus a non-algorithmically sorted feed, and 5% less liberal content for conservatives. This is an outcome directly attributable to the structure of Facebook itself.

Facebook published this finding, that the newsfeed algorithm encourages users to see what they would already agree with more than if the algorithm weren’t there, ultimately because Facebook wants to make the case that its algorithm isn’t as big a factor in this political confirmation bias as people’s individual choices, stating that “individual choice has a larger role in limiting exposure to ideologically cross cutting content.” The researchers estimate that conservatives click on 17% fewer ideologically opposed news stories, and liberals 6% fewer, than would be expected if users clicked on random links in their feeds.

The report concludes that “we conclusively establish that on average in the context of Facebook, individual choices [matter] more than algorithms.” Nooo, this just simply isn’t the case.

First, and most obviously (please tell me if I am missing something here, because it seems so obvious), this statement only holds true for the conservatives (17% less by choice, 5% by algorithm). For liberals, the reduction in ideologically cross cutting content from the algorithm is greater than that from individual choice (6% by choice, 8% by algorithm). Second, to pick up on my annoyance above, note how they didn’t say this was true for the rare people who explicitly self-identify on their profiles, but for the whole “context of Facebook.” That’s misleading.

But even bracketing both of those issues, the next problem is that individual users choosing news they agree with and Facebook’s algorithm serving up what those individuals already agree with are not either-or but additive. That people seek out that which they agree with is a pretty well-established social-psychological finding; we didn’t need this report to confirm it. As if anyone critiquing how Facebook structures our information flows ever strawpersoned themselves into saying individual choice wasn’t important too. What’s important is the finding that, in addition to confirmation-biasy individuals, the Facebook newsfeed algorithm exacerbates and furthers this filter-bubble bias over and above the baseline.
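To see in numbers why “additive, not either-or” matters, here is a back-of-the-envelope calculation of my own, using the reduction percentages reported above. Stacking the two reductions multiplicatively is my illustration of how the filters compound, not the study’s own model, and the two percentages come from somewhat different baselines in the paper.

```python
# Reported reductions in exposure to ideologically cross-cutting content
# (percentages from the report; the "combined" figure is my own rough
# stacking of the two, not a number the study reports).
reductions = {
    "liberals":      {"algorithm": 0.08, "individual choice": 0.06},
    "conservatives": {"algorithm": 0.05, "individual choice": 0.17},
}

for group, r in reductions.items():
    remaining = (1 - r["algorithm"]) * (1 - r["individual choice"])
    combined = 1 - remaining
    larger = max(r, key=r.get)
    print(f"{group}: algorithm -{r['algorithm']:.0%}, "
          f"choice -{r['individual choice']:.0%}, "
          f"combined -{combined:.0%} (larger single factor: {larger})")
```

Two things fall out: for liberals the algorithmic reduction is the larger of the two factors, and for both groups the combined reduction is bigger than either factor alone, which is the whole point about the two filters adding up rather than competing as explanations.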


Fair And Balanced

“Fair and Balanced” is, of course, the famous Fox News slogan, which has come to stand for the fallacy of being politically disinterested, for how those structuring what we see have an interest in pretending they are neutral and fair, objective and balanced. The joke is that the myth of being politically disinterested is itself a very familiar and powerful interest.

These lines from the report keep nagging me. First: “we conclusively establish that on average in the context of Facebook, individual choices [matter] more than algorithms,” which is more than just incorrect and more than just journalist bait; it is indicative of something larger. Second: “the power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals.” This is the same politics and logic as “guns don’t kill people, people kill people”, the same fallacy that technologies are neutral, that they are “just tools”, completely neglecting, for example, how different people appear killable with a gun in hand. This fallacy of neutrality is an ideological stance: even against their own findings, Facebook wants to downplay the role of its sorting algorithms. They want to sell their algorithmic structure as impartial, as if, by simply giving people what they want, the algorithmically sorted information were the work of users and not the site itself. The Facebook researchers describe their algorithm as follows:

The order in which users see stories in the News Feed depends on many factors, including how often the viewer visits Facebook, how much they interact with certain friends, and how often users have clicked on links to certain websites in News Feed in the past

What isn’t mentioned is that a post that gets more “likes” is more likely to perform well in the algorithm. Also left out of this description is that the newsfeed is sorted, in part, based on what people are willing to pay. The order of Facebook’s newsfeed is partly for sale. It’s almost their entire business model, and a model that relates directly to the variable they are describing. In the supporting-materials appendix, the Facebook researchers state:

Some positions—particularly the second position of the News Feed—are often allocated to sponsored content, which may include links to articles shared by friends which are associated with websites associated with a particular advertiser. Since we aim to characterize interactions with all hard content shared by friends, such links are included in our analyses. These links appear to be more ideologically consistent with the viewers; however further investigation is beyond the scope of this work.

It seems that sponsored content might very well be furthering the so-called filter bubble. The researchers put it outside the scope of the study, fine, but that sponsored content was not even mentioned in the report’s own description of how the algorithm works is suspect.**

Further, this whole business of conceptually separating the influence of the algorithm from the influence of individual choices willfully misunderstands what algorithms are and what they do. Algorithms are made to capture, analyze, and re-adjust individual behavior in ways that serve particular ends. Individual choice is partly a result of what the algorithm teaches us, and the algorithm itself is dynamic code that reacts to and changes with individual choice. Neither the algorithm nor individual choice can be understood without the other.

For example, that the newsfeed algorithm suppresses ideologically cross cutting news to a non-trivial degree teaches individuals not to share as much cross cutting news. By making the newsfeed an algorithm, Facebook enters users into a competition to be seen. If you don’t get “likes” and attention with what you share, your content will subsequently be seen even less, and thus you and your voice and presence are lessened. To post without likes means few are seeing your post, so there is little point in posting. We want likes because we want to be seen. We see what gets likes and adjust accordingly. Each like we give, receive, or even see very subtly and incrementally acts as a sort of social training, each a tiny cut that carves deeply in aggregate. This is just one way the Facebook algorithm influences the individual choices we make: to post, to click, to click with the intention of reposting, and so on. And it is no coincidence that when Facebook described their algorithm in this report, they left out the biggest ways Facebook itself makes decisions that shape what we see: that position in the newsfeed can be bought, and that content is ranked on a “like” button and nothing else (no dislike, no important, etc., all of which would change our individual choices).
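As a rough illustration of that feedback loop, here is a toy simulation, entirely my own construction and not Facebook’s actual ranking logic, in which a post’s future exposure is proportional to the likes it has already accumulated, so small early differences in attention compound.

```python
import random

random.seed(0)

# Toy rich-get-richer model: every post starts equal, but exposure in each
# round is proportional to the likes accumulated so far. (My own toy model,
# not Facebook's ranking algorithm.)
likes = {f"post_{i}": 1 for i in range(5)}
AUDIENCE = 1000           # people scrolling per round
LIKE_PROBABILITY = 0.05   # chance a viewer likes what they see

for _ in range(50):
    total = sum(likes.values())
    for post, current in list(likes.items()):
        viewers = int((current / total) * AUDIENCE)  # exposure share this round
        likes[post] += sum(random.random() < LIKE_PROBABILITY for _ in range(viewers))

print(likes)  # posts that got lucky early keep a durably larger share of attention
```

The mechanism, not the specific numbers, is the point: once visibility is allocated by past engagement, the “individual choices” users make are already being made inside a structure the site designed.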

To ignore these ways the site is structured and to instead be seen as a neutral platform means not having responsibility, offloading the blame for what users see or don’t see onto the users themselves. The politics and motives that go into structuring the site, and therefore its users, don’t have to be questioned if they are never acknowledged. This ideological push by Facebook to downplay its own role in shaping its own site was also on display last month at the International Journalism Festival in Italy, featuring Facebook’s head of news. You can watch him evade any role in structuring the news ecosystem. NYU journalism professor Jay Rosen summarizes Facebook’s message:

1. “It’s not that we control NewsFeed, you control NewsFeed by what you tell us that you’re interested in.”

2. Facebook should not be anyone’s primary news source or experience. It should be a supplement to seeking out news yourself with direct suppliers. “Complementary” was the word he used several times.

3. Facebook is accountable to its users for creating a great experience. That describes the kind of accountability it has. End of story.

Rosen correctly states, “It simply isn’t true that an algorithmic filter can be designed to remove the designers from the equation.”

Facebook orders and ranks news information, which is doing the work of journalism, but they refuse to acknowledge they are doing the work of journalism. Facebook cannot take its own role in news seriously, and they cannot take journalism itself seriously, if they are unwilling to admit the degree to which they shape how news appears on the site. The most dangerous journalism is journalism that doesn’t see itself as such.***

Facebook’s line, “it’s not that we control NewsFeed, you control NewsFeed,” exactly parallels the ideological stance that informs the Facebook researchers’ attempt to downplay Facebook’s own role in sorting what people see. Coincidence? This erasing of their role in the structuring of personal, social, and civic life is being repeated in full force across the company, seemingly like when politicians are given a party line to repeat on Sunday news shows.****

Power and control are most efficiently maintained when they are made invisible. Facebook’s ideological push to dismiss the very real ways it structures what users see and do is the company attempting to simultaneously embrace control and evade responsibility. Their news team doesn’t need to be competent in journalism because they don’t see themselves as doing journalism. But Facebook is doing journalism, and the way they code their algorithms and the rest of the site is structuring and shaping personal, social, and civic life. Like it or not, we’ve collectively handed very important civic roles to social media companies, the one I work for included, and a real danger is that we can’t hope for them to be competent at these jobs when they won’t even admit to doing them.

Nathan is on Twitter and Tumblr


*Note here that the Facebook researchers could have easily avoided this problem by inferring political orientation from what any user posts instead of only looking at those who state it explicitly. This would have resulted in a stronger research design, but also very bad press, probably something like “Facebook Is Trying to Swing the Election!”, which also may not be wrong. Facebook went for the less invasive measure here, but, and this is my guess and nothing more, they likely ran, and haven’t published, a version of this study with nearly all users by inferring political orientation, which is not difficult to do.

**I also work for a social media company (Snapchat), so I understand the conflicts involved here, but there’s no reason that such an important variable in how the newsfeed is sorted should be left out of a report about the consequences of that sorting.

***Some may not agree that what the newsfeed algorithm does is “journalism”. I think calling it journalism is provocative and ultimately correct. First, I don’t mean they are doing on-the-ground reporting; that is only one part of the work of journalism, and Facebook’s role is another part. If you agree with me that any algorithm, including this one, is never neutral, disinterested, and objective, but is instead built by humans with politics, interests, and insecurities, then this type of sorting of news information is certainly the work of journalism. Editing, curating, sorting, and ranking news information are all part of journalism, and Facebook is making decisions that influence how news is produced, displayed, and, most importantly, ranked, that is, what is seen and what isn’t. Sorting a NEWSfeed is doing the work of journalism.

****Of course, I don’t know if this is stated explicitly for employees to repeat, or if it is just that deep in the company’s ideology, and I don’t know which would be more troubling.

Further reading: Some reactions from other researchers. Good to know I wasn’t alone in thinking this was very poor work.

Zeynep Tufekci, How Facebook’s Algorithm Suppresses Content Diversity (Modestly) and How the Newsfeed Rules Your Clicks

Eszter Hargittai, Why doesn’t Science publish important methods info prominently?

Christian Sandvig, The Facebook “It’s Not Our Fault” Study