The most crucial thing people forget about social media, and about all technologies, is that they are structured by particular people with particular politics, insecurities, and financial interests. On an abstract level, yeah, we may all know that these sites are shaped, designed, and controlled by specific humans. But so much of the rhetoric around code, “big” data, and data science research continues to promote the fallacy that the way sites operate is almost natural, that they are simply giving users what they want, which downplays the sites’ own interests, role, and responsibility in structuring what happens. The greatest success of “big” data so far has been letting those who hold the data sell their interests as neutral.
Today, Facebook researchers released a report in Science on the flow of ideological news content on their site. “Exposure to ideologically diverse news and opinion on Facebook,” by Eytan Bakshy, Solomon Messing, and Lada Adamic (all Facebook researchers), enters the debate over whether social media in general, and Facebook in particular, locks users into a so-called “filter bubble”: seeing only what one wants and is predisposed to agree with, and limiting exposure to outside and conflicting voices, information, and opinions. And just as Facebook’s director of news recently ignored the company’s journalistic role in shaping our news ecosystem, Facebook’s researchers use this paper to minimize their role in structuring what a user sees and posts. I’ve only just read the study, but I’ve had thoughts about this bigger ideological push since the journalism event, as it relates to my larger project describing contemporary data science as a sort of neo-positivism. I’d like to put some of my thoughts connecting it all here.
The Study Itself (method wonkery, skip to the next section if that sounds boring)
Much of the paper is written as if it is about adult U.S. Facebook users in general, but that is not the case. Those included in the study are only those who self-identify their politics on the site. This is a rare behavior, something only 9% of users do. That 9% figure is not in the report itself but in a separate supporting-materials appendix, yet it is crucial for interpreting the results. The population number given in the report is 10.1 million people, which, yea, omg, is a very big number, but don’t fall for the Big-N trick: we don’t know how this 9% differs from Facebook users in general. We cannot treat this as a sample of “Facebook users” or even “Facebook liberals and conservatives,” as the authors do in various parts of the report, but instead as a study of the rare people who explicitly state their political orientation on their Facebook profile.* Descriptive statistics comparing the few who explicitly self-identify, and therefore enter the study, with those who do not are not provided. Who are they, how are they different from the rest of us, why are they important to study: these are all obvious questions the report doesn’t discuss. We might infer that people who self-identify are more politically engaged, but anecdotally, nearly all of my super politically engaged Facebook friends don’t explicitly list their political orientation on the site. Facebook’s report talks about Facebook users, which isn’t accurate. All the findings should be understood as being about Facebook users who also put their political orientation on their profiles, who may or may not be like the rest of Facebook’s users in lots of interesting and research-confounding ways. The researchers had an obligation to make this limitation much clearer, even if it tempered their grand conclusions.
So, AMONG THOSE RARE USERS WHO EXPLICITLY SELF-IDENTIFY THEIR POLITICAL ORIENTATION ON THEIR FACEBOOK PROFILES, the study looks at the flow of news stories that are more liberal versus more conservative as they are shared on Facebook: how those stories are seen and clicked on as they are shared by liberals to other liberals, by conservatives to other conservatives, and, most important for this study, the information that is politically cross cutting, that is, shared by someone on the right and then seen by someone on the left and vice versa. The measure of conservative or liberal news stories is a simple and, in my opinion, effective one: the degree to which a web domain is shared by people on the right is the degree to which content on that domain is treated as conservative (and the same goes for politically liberal content). The researchers also differentiated between soft (entertainment) and hard (news) content, including only the latter in this study. The important work is seeing whether Facebook, as a platform, is creating a filter bubble where people only see what they’d already agree with, as opposed to more diverse and challenging “cross cutting” information.
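To make that measure concrete, here is a minimal sketch of how such a domain-level alignment score could be computed. This is my reading of the approach, not the researchers’ code, and the sharing data and domain names below are invented for illustration.

```python
from collections import defaultdict

# Toy data: (self-reported ideology of the sharer, domain of the shared hard-news link).
# Ideology is coded on a -1 (liberal) to +1 (conservative) scale; all values are invented.
shares = [
    (-1.0, "examplenews-left.com"),
    (-0.5, "examplenews-left.com"),
    (+1.0, "examplenews-right.com"),
    (+0.5, "examplenews-right.com"),
    (-1.0, "examplenews-center.com"),
    (+1.0, "examplenews-center.com"),
]

totals = defaultdict(float)
counts = defaultdict(int)
for ideology, domain in shares:
    totals[domain] += ideology
    counts[domain] += 1

# A domain's alignment is the average ideology of the self-identified users who share it.
alignment = {domain: totals[domain] / counts[domain] for domain in totals}
for domain, score in sorted(alignment.items(), key=lambda item: item[1]):
    print(f"{domain:25s} alignment = {score:+.2f}")
```

The appeal of this design is that it needs no human coding of articles: the sharing behavior of self-identified users does the labeling.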
The Facebook researchers looked at how much, specifically, the newsfeed algorithm promotes the filter bubble, that is, how much it shows users what they will already agree with over and above a non-algorithmically sorted newsfeed. The newsfeed algorithm provided liberals with 8% less conservative content than a non-algorithmically sorted feed would have, and provided conservatives with 5% less liberal content. This is an outcome directly attributable to the structure of Facebook itself.
Facebook published this finding, that the newsfeed algorithm encourages users to see what they already agree with more than they would if the algorithm weren’t there, ultimately because Facebook wants to make the case that its algorithm isn’t as big a factor in this political confirmation bias as people’s individual choices, stating that “individual choice has a larger role in limiting exposure to ideologically cross cutting content.” The researchers estimate that conservatives click on 17% fewer ideologically opposed news stories, and liberals 6% fewer, than would be expected if users clicked on random links in their feed.
The report concludes that “we conclusively establish that on average in the context of Facebook, individual choices [matter] more than algorithms.” Nooo, this just simply isn’t the case.
First, and most obvious, and please tell me if I am missing something here because it seems so obvious: this statement only holds true for the conservatives (17% less by choice, 5% by algorithm). For liberals, the reduction in ideologically cross cutting content from the algorithm is greater than that from individual choice (6% by choice, 8% by algorithm). Second, to pick up on my annoyance above, note how they didn’t say this was true for the rare people who explicitly self-identify on their profiles, but for the whole context of Facebook. That’s misleading.
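Restating the published figures as data makes the asymmetry plain. This is just my arithmetic on the numbers reported above, nothing more:

```python
# Reported reductions in exposure to cross-cutting content, relative to baseline.
reductions = {
    "liberals":      {"algorithm": 0.08, "individual choice": 0.06},
    "conservatives": {"algorithm": 0.05, "individual choice": 0.17},
}

for group, r in reductions.items():
    larger = max(r, key=r.get)  # which filter removes more cross-cutting content
    print(f"{group}: algorithm -{r['algorithm']:.0%}, choice -{r['individual choice']:.0%}"
          f" -> larger filter: {larger}")
```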
But even bracketing both of those issues, the next problem is that individual users choosing news they agree with and Facebook’s algorithm providing what those individuals already agree with is not an either-or but additive. That people seek out that which they agree with is a pretty well-established social-psychological tendency. We didn’t need this report to confirm that. As if anyone critiquing how Facebook structures our information flows ever strawpersoned themselves into saying individual choice wasn’t important too. What’s important is the finding that, in addition to confirmation-biased individuals, the Facebook newsfeed algorithm exacerbates and furthers this filter-bubble bias over and above that baseline.
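A toy illustration of the additive point (my construction, not the paper’s model): the algorithmic filter and individual choice operate in sequence, so their reductions compound rather than substitute for one another.

```python
def remaining_cross_cutting(algo_reduction, choice_reduction):
    """Fraction of baseline cross-cutting exposure left after both filters apply in sequence."""
    return (1 - algo_reduction) * (1 - choice_reduction)

# Reusing the reported figures purely for illustration.
print(f"liberals:      {remaining_cross_cutting(0.08, 0.06):.1%} of baseline remains")
print(f"conservatives: {remaining_cross_cutting(0.05, 0.17):.1%} of baseline remains")
```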
Fair And Balanced
“Fair and Balanced” is of course the famous Fox News slogan, which also stands for the fallacy of being politically disinterested and for how those structuring what we see have an interest in pretending they are neutral and fair, objective and balanced. The joke is that the myth of being politically disinterested is itself a very familiar and powerful interest.
This line from the report keeps nagging me: “we conclusively establish that on average in the context of Facebook, individual choices [matter] more than algorithms.” It is more than just incorrect, and more than just journalist bait; it is indicative of something larger. So is this one: “the power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals.” This is of the same politics and logic as “guns don’t kill people, people kill people,” the same fallacy that technologies are neutral, that they are “just tools,” completely neglecting, for example, how different people appear killable with a gun in hand. This fallacy of neutrality is an ideological stance: even against their own findings, Facebook wants to downplay the role of its sorting algorithms. They want to sell their algorithmic structure as impartial, as if, by simply giving people what they want, the algorithmically sorted information is the work of users and not the site itself. The Facebook researchers describe their algorithm as follows:
The order in which users see stories in the News Feed depends on many factors, including how often the viewer visits Facebook, how much they interact with certain friends, and how often users have clicked on links to certain websites in News Feed in the past
What isn’t mentioned is that a post that is “liked” more is more likely to perform well in the algorithm. Also left out of this description is that the newsfeed is sorted, in part, based on what people are willing to pay: the order of Facebook’s newsfeed is partly for sale. That is almost their entire business model, and it relates directly to the variable they are describing. In the appendix notes, the Facebook researchers state that,
Some positions—particularly the second position of the News Feed—are often allocated to sponsored content, which may include links to articles shared by friends which are associated with websites associated with a particular advertiser. Since we aim to characterize interactions with all hard content shared by friends, such links are included in our analyses. These links appear to be more ideologically consistent with the viewers; however further investigation is beyond the scope of this work.
It seems that sponsored content might very well be furthering the so-called filter bubble. The researchers took it out of the scope of the study, fine, but that sponsored content was not even included in the report’s own description of how the algorithm works is suspect.**
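To see why this omission matters, consider a purely hypothetical ranking sketch. None of these terms or weights are Facebook’s; the actual algorithm is proprietary. The point is only that once a paid term enters the score, ordering is no longer a simple readout of what users have told the site they are interested in.

```python
# Hypothetical feed score: the factors the report names, plus a paid boost it omits.
# All weights and fields are invented for illustration.
def toy_feed_score(friend_affinity, past_clicks_on_domain, recency, sponsored_bid=0.0):
    organic = 0.5 * friend_affinity + 0.3 * past_clicks_on_domain + 0.2 * recency
    return organic + sponsored_bid  # a nonzero bid can outrank stronger organic signals

stories = [
    ("friend's hard-news share", toy_feed_score(0.9, 0.7, 0.8)),
    ("sponsored hard-news link", toy_feed_score(0.4, 0.2, 0.9, sponsored_bid=0.6)),
]
for name, score in sorted(stories, key=lambda s: s[1], reverse=True):
    print(f"{score:.2f}  {name}")
```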
Further, this whole business of conceptually separating the influence of the algorithm from individual choices willfully misunderstands what algorithms are and what they do. Algorithms are made to capture, analyze, and re-adjust individual behavior in ways that serve particular ends. Individual choice is partly a result of how the algorithm teaches us, and the algorithm itself is dynamic code that reacts to and changes with individual choice. Neither the algorithm nor individual choice can be understood without the other.
For example, when the newsfeed algorithm suppresses ideologically cross cutting news to a non-trivial degree, it teaches individuals not to share as much cross cutting news. By making the newsfeed an algorithm, Facebook enters users into a competition to be seen. If you don’t get “likes” and attention with what you share, your content will subsequently be seen even less, and thus you and your voice and presence are lessened. To post without likes means few are seeing your post, so there is little point in posting. We want likes because we want to be seen. We see what gets likes and adjust accordingly. Each like we give, receive, or even see very subtly and incrementally acts as a sort of social training, each a tiny cut that carves deeply in aggregate. This is just one way the Facebook algorithm influences the individual choices we make: to post, to click, to click with the intention of reposting, and so on. And it is no coincidence that when Facebook described their algorithm in this report, they left out the biggest ways Facebook itself makes decisions that shape what we see: that position in the newsfeed is something that can be bought, and that there is a “like” button and content is ranked on it (no dislike, no important, etc., any of which would change our individual choices).
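A minimal toy simulation of that feedback loop (my construction, with invented engagement rates; not Facebook’s system): content that earns fewer likes gets less reach, which earns it still fewer likes.

```python
# Two kinds of posts with assumed like rates; next week's reach scales with this week's likes.
visibility = {"agreeable post": 1.0, "cross-cutting post": 1.0}
like_rate  = {"agreeable post": 0.10, "cross-cutting post": 0.05}
baseline_likes = 1000 * 0.10  # likes an "average" post would need to hold its reach

for week in range(1, 6):
    for kind in visibility:
        impressions = 1000 * visibility[kind]
        likes = impressions * like_rate[kind]
        visibility[kind] = likes / baseline_likes  # assumed update rule
    print(f"week {week}: " + ", ".join(f"{k} reach x{v:.2f}" for k, v in visibility.items()))
```

Under these assumed numbers the cross-cutting post’s reach roughly halves each week while the agreeable post holds steady, which is the “tiny cuts that carve deeply in aggregate” dynamic in miniature.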
To ignore these ways the site is structured, and to instead be seen as a neutral platform, is to not have responsibility, to offload the blame for what users see or don’t see onto the users. The politics and motives that go into structuring the site, and therefore its users, don’t have to be questioned if they are not acknowledged. This ideological push by Facebook to downplay its own role in shaping its own site was also on display last month at the International Journalism Festival in Italy, featuring Facebook’s head of news. You can watch him evade any role in structuring the news ecosystem. NYU journalism professor Jay Rosen summarizes Facebook’s message:
1. “It’s not that we control NewsFeed, you control NewsFeed by what you tell us that you’re interested in.”
2. Facebook should not be anyone’s primary news source or experience. It should be a supplement to seeking out news yourself with direct suppliers. “Complementary” was the word he used several times.
3. Facebook is accountable to its users for creating a great experience. That describes the kind of accountability it has. End of story.
Rosen correctly states, “It simply isn’t true that an algorithmic filter can be designed to remove the designers from the equation.”
Facebook orders and ranks news information, which is doing the work of journalism, but they refuse to acknowledge that they are doing the work of journalism. Facebook cannot take its own role in news seriously, and it cannot take journalism itself seriously, if it is unwilling to admit the degree to which it shapes how news appears on the site. The most dangerous journalism is journalism that doesn’t see itself as such.***
Facebook’s line, “it’s not that we control NewsFeed, you control NewsFeed,” exactly parallels the ideological stance that informs the Facebook researchers’ attempt to downplay Facebook’s own role in sorting what people see. Coincidence? This erasing of their role in the structuring of personal, social, and civic life is being repeated in full force across the company, much like when politicians are given a party line to repeat on Sunday news shows.****
Power and control are most efficiently maintained when they are made invisible. Facebook’s ideological push to dismiss the very real ways it structures what users see and do is the company attempting to simultaneously embrace control and evade responsibility. Their news team doesn’t need to be competent in journalism because they don’t see themselves as doing journalism. But Facebook is doing journalism, and the way they code their algorithms and the rest of the site is structuring and shaping personal, social, and civic life. Like it or not, we’ve collectively handed very important civic roles to social media companies, the one I work for included, and a real danger is that we can’t hope for them to be competent at these jobs when they won’t even admit to doing them.
Nathan is on Twitter and Tumblr
*Note here that the Facebook researchers could have easily avoided this problem by inferring political orientation from what users post instead of only looking at those who state it explicitly (a rough sketch of what such an inference might look like follows these notes). This would have resulted in a stronger research design, but also in very bad press, probably something like “Facebook Is Trying to Swing the Election!”, which also may not be wrong. Facebook went for the less invasive measure here, but, and this is my guess and nothing more, they likely ran, and haven’t published, a version of this study covering nearly all users by inferring political orientation, which is not difficult to do.
**I also work for a social media company (Snapchat), so I understand the conflicts involved here, but there’s no reason that such an important variable in how the newsfeed is sorted should be left out of a report about the consequences of that sorting.
***Some may not agree that what the newsfeed algorithm does is “journalism.” I think calling it journalism is provocative and ultimately correct. First, I don’t mean they are doing on-the-ground reporting; that is only one part of the work of journalism, and Facebook’s role is another part. If you agree with me that any algorithm, including this one, is never neutral, disinterested, and objective, but is instead built by humans with politics, interests, and insecurities, then this type of sorting of news information is certainly the work of journalism. Editing, curating, sorting, and ranking of news information are all part of journalism and Facebook is making these decisions that influence how news is being produced, displayed, and most importantly ranked, that is, what is seen and what isn’t. Sorting a NEWSfeed is doing the work of journalism.
****Of course, I don’t know whether this is stated explicitly for employees to repeat or whether it is just this deep in the company’s ideology, and I don’t know which would be more troubling.
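As mentioned in the first note above, here is a rough sketch of the kind of inference that footnote describes (again my construction, not the paper’s method): score a user by the average alignment of the hard-news domains they share, using domain scores like those in the earlier sketch.

```python
# Invented domain alignment scores (-1 liberal to +1 conservative), for illustration only.
domain_alignment = {
    "examplenews-left.com":   -0.8,
    "examplenews-right.com":  +0.7,
    "examplenews-center.com":  0.0,
}

def inferred_ideology(shared_domains):
    """Average alignment of the scorable domains a user shared; None if none are scorable."""
    scores = [domain_alignment[d] for d in shared_domains if d in domain_alignment]
    return sum(scores) / len(scores) if scores else None

print(inferred_ideology(["examplenews-left.com", "examplenews-center.com"]))  # -0.4
print(inferred_ideology(["examplenews-right.com"]))                           # 0.7
```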
Further reading: Some reactions from other researchers. Good to know I wasn’t alone in thinking this was very poor work.
Eszter Hargittai, Why doesn’t Science publish important methods info prominently?
Christian Sandvig, The Facebook “It’s Not Our Fault” Study
Comments
SAA — May 7, 2015
Great piece.
You wrote at the very very end,
"Note here that the Facebook researchers could have easily avoided this problem by inferring politically orientation from what any users posts instead of only looking at those who state it explicitly"
This "easily" would involve qualitative research, which is not "easy" for FB's seemingly preferred research method of algorithms to do (qual better done by humans at the moment) and is more expensive because it takes more time. Plus, qualitative research isn't computer generated which is sort of a problem for many in the valley these days, because the value is on quant and not qual and has been for decades as far as I can tell.
Looking for keywords is cheap and quick and much much easier to code. This I think contributes to a broader problem with Big Data and algorithmic based research in general.
Aaron Farber — May 7, 2015
I agree with the poster above. Just like in school, the footnotes are the things most worth remembering.
Comradde PhysioProffe — May 7, 2015
As a scientist, I have a big problem with a legitimate peer-reviewed scientific journal publishing a paper authored by employees of a corporation that describes their analysis of the outcomes of a secret proprietary algorithm controlled by that corporation as if the algorithm is some static, knowable law of nature. When the authors refer to "algorithms", they are basically lying. Because the day this paper was published, Facebook could have completely changed all of their news feed algorithms, and rendered this paper completely moot.
As scientists, of course we are comfortable dealing with unknowns and even the unknowable as we pursue understanding of complicated natural entities. But these "algorithms" are completely knowable, indeed are known to these researchers' employer, yet are treated as a black box that can only be studied indirectly by examining its outputs. This is disingenuous in the extreme, and turns the notion of "science" on its head.
I am disgusted at Science for publishing this non-scientific propaganda piece.
Marko Milosavljevic — May 8, 2015
There are many important aspects of this text. For me personally, one of the key statements is almost hidden - the third note at the end, stating that "Editing, curating, sorting, and ranking of news information are all part of journalism and Facebook is making these decisions that influence how news is being produced, displayed, and most importantly ranked, that is, what is seen and what isn’t."
Karol Jakubowicz called these "media-like activities," and I talked about these activities, the lack of transparency around them, and the unregulated power of new gatekeepers such as Facebook at the recent EuroCPR conference in Brussels. Among the people present was also the Policy Director of Facebook for the UK, Middle East and Africa; in his response he presented arguments similar to those of the Facebook researchers in their paper: neutrality, "you" are in control, etc.
However, he mentioned one interesting thing: that Facebook shaped the algorithm in such a way that it shows you (out of every 10 items shown) 8 items from your (real) friends and 2 items from others: media, companies, etc. An interesting intervention / manipulation that is (at least as far as I'm aware) not publicly disclosed.