{"id":19947,"date":"2015-05-07T14:03:01","date_gmt":"2015-05-07T18:03:01","guid":{"rendered":"http:\/\/thesocietypages.org\/cyborgology\/?p=19947"},"modified":"2015-05-07T20:42:16","modified_gmt":"2015-05-08T00:42:16","slug":"facebook-fair-and-balanced","status":"publish","type":"post","link":"https:\/\/thesocietypages.org\/cyborgology\/2015\/05\/07\/facebook-fair-and-balanced\/","title":{"rendered":"Facebook: Fair and Balanced"},"content":{"rendered":"<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-large wp-image-19948\" src=\"https:\/\/thesocietypages.org\/cyborgology\/files\/2015\/05\/a-500x274.jpg\" alt=\"a\" width=\"500\" height=\"274\" srcset=\"https:\/\/thesocietypages.org\/cyborgology\/files\/2015\/05\/a-500x274.jpg 500w, https:\/\/thesocietypages.org\/cyborgology\/files\/2015\/05\/a-250x137.jpg 250w, https:\/\/thesocietypages.org\/cyborgology\/files\/2015\/05\/a-400x220.jpg 400w, https:\/\/thesocietypages.org\/cyborgology\/files\/2015\/05\/a.jpg 800w\" sizes=\"auto, (max-width: 500px) 100vw, 500px\" \/><\/p>\n<p>The most crucial thing people forget about social media, all technologies, is that certain people with certain politics, insecurities, and financial interests <em>structure<\/em> them. On an abstract level, yeah, we may all know that these sites are shaped, designed, and controlled by specific\u00a0humans. But so much of the rhetoric around\u00a0code, \u201cbig\u201d data, and data science research continues to promote a fallacy that the way sites operate is almost natural, that they are simply giving users what they want, which then downplays their own interests and role and responsibility in <em>structuring<\/em> what happens. The greatest success of \u201cbig\u201d data so far\u00a0has been for those with that data\u00a0to sell their interests as neutral.<\/p>\n<p>Today, Facebook researchers released a report in <em>Science<\/em>\u00a0on the flow of ideological news content on their site. 
&#8220;<a href=\"http:\/\/www.sciencemag.org\/content\/early\/2015\/05\/06\/science.aaa1160.full\" target=\"_blank\">Exposure to ideologically diverse news and opinion on Facebook<\/a>&#8221; by Eytan Bakshy, Solomon Messing, and Lada Adamic (all Facebook researchers) enters into the debate around whether social media in general, and Facebook in particular, locks users into a so-called \u201cfilter bubble\u201d, seeing only what one\u00a0wants and is predisposed to agree with and limiting exposure to outside and conflicting voices, information, and opinions. And just like <a href=\"http:\/\/pressthink.org\/2015\/04\/its-not-that-we-control-newsfeed-you-control-newsfeed-facebook-please-stop-with-this\/\" target=\"_blank\">Facebook&#8217;s director of news recently ignored\u00a0the company&#8217;s journalistic role shaping\u00a0our news ecosystem<\/a>, Facebook&#8217;s researchers make this paper about minimizing their\u00a0role in structuring what a user sees and posts. I&#8217;ve just read the study, but I already had some thoughts about this bigger ideological push since the journalism event as it relates to my bigger project <a href=\"http:\/\/thenewinquiry.com\/essays\/view-from-nowhere\/\" target=\"_blank\">describing\u00a0contemporary data science as a sort of neo-positivism<\/a>. I&#8217;d like to put some of my thoughts connecting it all here.<\/p>\n<p><!--more--><\/p>\n<p>&nbsp;<\/p>\n<p><strong>The Study Itself <\/strong><em>(method\u00a0wonkery, skip to the next section if that sounds boring)<\/em><\/p>\n<p>Much of the paper is written as if it is about adult U.S. Facebook users in general, but that is not the case. Those included in the study are just those who self-identify their politics on the site. This is a rare behavior, something only 9% of users do. 
This 9% number is not in the report itself but in <a href=\"http:\/\/www.sciencemag.org\/content\/early\/2015\/05\/06\/science.aaa1160\/suppl\/DC1\" target=\"_blank\">a separate supporting materials appendix<\/a>, yet it is crucial for interpreting the results. The population number given in the report is 10.1 million people, which yea omg is a very big number but don\u2019t fall for the Big-N trick, we don\u2019t know how this 9% is different from Facebook in general. We cannot treat this as a sample of \u201cFacebook users\u201d or even \u201cFacebook liberals and conservatives\u201d, as the authors do in various parts of the report, but instead as about the rare people who explicitly state their political orientation on their Facebook profile.* Descriptive statistics comparing the few who explicitly self-identify and therefore enter into the study versus those who do not are not provided. Who are they, how are they different from the rest of us, why are they important to study: these are all obvious things to discuss that the report doesn\u2019t address. We might infer that people who self-identify are more politically engaged, but anecdotally, nearly all my super politically engaged Facebook friends don\u2019t explicitly list their political orientation on the site.\u00a0Facebook\u2019s report talks about <em>Facebook users<\/em>, which isn\u2019t accurate. 
All the findings should be understood to be about <em>Facebook users who also put their political orientation on their profiles<\/em>, who may or may not be like the rest of Facebook users in lots of interesting and research-confounding ways.\u00a0The researchers had an obligation to make this limitation much more clear, even if it tempered their grand conclusions.<\/p>\n<p>So, AMONG THOSE RARE USERS WHO EXPLICITLY SELF-IDENTIFY THEIR POLITICAL ORIENTATION\u00a0ON THEIR FACEBOOK PROFILES, the study looks at the flow of news stories that are more liberal versus conservative as they are shared on Facebook, how those stories are seen and clicked on\u00a0as they are shared by liberals to other liberals, conservatives to other conservatives, and most important for this study, the information that is <em>politically cross cutting<\/em>, that is, shared by someone on the right and then seen by someone on the left and <em>vice versa<\/em>. The measure of conservative or liberal news stories is a simple and in my opinion an effective one: the degree that a web domain is shared by people on the right is the degree to which content on that domain is treated as conservative (and same goes for politically liberal content). And they differentiated between soft (entertainment) versus hard (news) content, only including the latter in this study. The important work is seeing if Facebook, as a platform, is creating a filter bubble where people only see what they\u2019d already agree with as opposed to more diverse and challenging \u201ccross cutting\u201d information.<\/p>\n<p>The Facebook researchers looked at how much, specifically, <em>the newsfeed algorithm<\/em> promotes the filter bubble, that is, showing users what they will already agree with over and above a non-algorithmically-sorted newsfeed. The newsfeed algorithm provided 8% less conservative content for liberals versus a non-algorithmically sorted feed, and 5% less liberal content for conservatives. 
This is an outcome directly attributable to the structure of Facebook itself.<\/p>\n<p>Facebook published this finding, that <em>the newsfeed algorithm encourages users to see what they already would agree with more than if the algorithm wasn\u2019t there<\/em>, ultimately because Facebook wants to make the case that their algorithm isn\u2019t as big a factor in this political confirmation bias as people\u2019s individual choices, stating, \u201cindividual choice has a larger role in limiting exposure to ideologically cross cutting content.\u201d The researchers estimate that conservatives click on 17% fewer ideologically opposed news stories, and liberals 6% fewer, than would be expected if users clicked on random links in their feed.<\/p>\n<p>The report concludes that, \u201cwe conclusively establish that on average in the context of Facebook, individual choices [matter] more than algorithms\u201d.\u00a0Nooo\u00a0this just simply isn\u2019t the case.<\/p>\n<p>First, and most obvious, and please tell me if I am missing something here because it seems so obvious, this statement only holds true for the conservatives (17% less by choice, 5% by algorithm). The reduction in ideologically cross cutting content from the algorithm <em>is greater than<\/em> individual choice for liberals (6% by choice, 8% by algorithm). Second, to pick up on my annoyance above, note how they didn&#8217;t say this was true for the rare people who explicitly profile-self-identify, but for the whole context of Facebook. That&#8217;s misleading.<\/p>\n<p>But even bracketing both of those issues, the next problem is that individual users choosing news they agree with and Facebook\u2019s algorithm providing what those individuals already agree with is not either-or but <em>additive. <\/em>That people seek that which they agree with is a pretty well-established social-psychological trend. We didn\u2019t need this report to confirm that. 
As if anyone critiquing how Facebook structures our information flows ever strawpersoned themselves into saying individual choice wasn\u2019t important too. What&#8217;s important is the\u00a0finding that, in addition to confirmation biasy individuals, the Facebook newsfeed algorithm exacerbates and furthers this filter-bubble bias over and above the baseline.<\/p>\n<p>&nbsp;<\/p>\n<p><strong>Fair And Balanced<\/strong><\/p>\n<p>\u201cFair and Balanced\u201d is of course the famous Fox News punch line, which also stands for the fallacy of being politically disinterested and how those structuring what we see have an interest in pretending they are <a href=\"https:\/\/thesocietypages.org\/cyborgology\/2014\/06\/09\/short-comment-on-facebook-as-methodologically-more-natural\/\" target=\"_blank\">neutral and fair, objective and balanced<\/a>. The joke is that the myth of being politically disinterested is a very familiar and powerful interest.<\/p>\n<p>These lines from the report are nagging me, \u201cwe conclusively establish that on average in the context of Facebook, individual choices [matter] more than algorithms\u201d, which is more than just incorrect, and more than just journalist bait; it is indicative\u00a0of something larger. Also, &#8220;the power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals.\u201d This is of the same politics and logic as \u201cguns don\u2019t kill people, people kill people\u201d, <a href=\"https:\/\/thesocietypages.org\/cyborgology\/2012\/12\/17\/tech-isnt-neutral-guns-cause-tragedies\/\" target=\"_blank\">the same fallacy that technologies are neutral<\/a>, that they are \u201cjust tools\u201d, completely neglecting, for example, how different people appear killable with a gun in hand. 
This fallacy of neutrality is an ideological stance: even against their findings, Facebook wants to downplay the role of their own sorting algorithms. They want to sell their algorithmic structure as impartial: by simply giving people what they want, the algorithmically sorted information becomes the work of users and not the site itself. The Facebook researchers describe their algorithm as such,<\/p>\n<blockquote><p>The order in which users see stories in the News Feed depends on many factors, including how often the viewer visits Facebook, how much they interact with certain friends, and how often users have clicked on links to certain websites in News Feed in the past<\/p><\/blockquote>\n<p>What isn\u2019t mentioned is how a post that is \u201cliked\u201d more is more likely to perform well in the algorithm. Also left out of this description is that the newsfeed is also sorted based on what people are willing to pay. The order of Facebook\u2019s newsfeed is partly for sale. It\u2019s almost their entire business model, and a model that relates directly to the variable they are describing. In the appendix notes, the Facebook researchers state that,<\/p>\n<blockquote><p>Some positions\u2014particularly the second position of the News Feed\u2014are often allocated to sponsored content, which may include links to articles shared by friends which are associated with websites associated with a particular advertiser. Since we aim to characterize interactions with all hard content shared by friends, such links are included in our analyses. 
These links appear to be more ideologically consistent with the viewers; however further investigation is beyond the scope of this work.<\/p><\/blockquote>\n<p>It seems like sponsored content might very well be furthering the so-called filter bubble; the researchers took this out of the scope of the study, okay, but that sponsored content was not even mentioned in the report\u2019s own description of how the algorithm works is suspect.**<\/p>\n<p>Further, this whole business of conceptually separating the influence of the algorithm versus individual choices willfully\u00a0misunderstands what algorithms are and what they do. Algorithms are made to capture, analyze, and re-adjust individual behavior in ways that serve particular ends. Individual choice is partly a result of how the algorithm teaches us, and the algorithm itself is dynamic code that reacts to and changes with individual choice.\u00a0Neither the algorithm nor individual choice can be understood without the other.<\/p>\n<p>For example, that the newsfeed algorithm suppresses ideologically cross cutting news to a non-trivial degree teaches individuals to not share as much cross cutting news. By making the newsfeed an algorithm, Facebook enters users into a competition to be seen. If you don\u2019t get &#8220;likes&#8221; and attention with what you share, your content will subsequently be seen even less, and thus you and your voice and presence are lessened. To post without likes means few are seeing your post, so there is little point in posting. We want likes because we want to be seen. We see what gets likes and adjust accordingly. Each like we give, receive, or even see very subtly and incrementally acts as a sort of social training, each a tiny cut that carves deeply in aggregate. This is just one way the Facebook algorithm influences\u00a0the individual choices we make, to post, click, click with the intention of reposting, and so on. 
And it is no coincidence that when Facebook described their algorithm in this report, they left out the biggest ways Facebook itself makes decisions that shape what we see: that position in the newsfeed can be bought, and that content is ranked by a &#8220;like&#8221; button and only a &#8220;like&#8221; button (no dislike, no important, etc., which would all change our individual choices).<\/p>\n<p>To ignore these ways the site is structured and to instead be seen as a neutral platform means to not have responsibility, to offload the blame for what users see or don\u2019t see onto the users. The politics and motives that go into structuring the site and therefore its users don\u2019t have to be questioned if they are not acknowledged. This ideological push by Facebook to downplay their own role in shaping their own site was also on display last month at the International Journalism Festival in Italy, featuring Facebook\u2019s head of news.\u00a0<a href=\"https:\/\/www.youtube.com\/watch?v=NwLmqhNv7WE#t=45m50s\">You can watch him evade any role in structuring the news ecosystem<\/a>. <a href=\"http:\/\/pressthink.org\/2015\/04\/its-not-that-we-control-newsfeed-you-control-newsfeed-facebook-please-stop-with-this\/\">NYU journalism professor Jay Rosen summarizes Facebook\u2019s message<\/a>,<\/p>\n<blockquote><p>1. \u201cIt\u2019s not that we control NewsFeed, you control NewsFeed by what you tell us that you\u2019re interested in.\u201d<\/p>\n<p>2. Facebook should not be anyone\u2019s primary news source or experience. It should be a supplement to seeking out news yourself with direct suppliers. \u201cComplementary\u201d was the word he used several times.<\/p>\n<p>3. Facebook is accountable to its users for creating a great experience. That describes the kind of accountability it has. 
End of story.<\/p><\/blockquote>\n<p>Rosen correctly states, \u201cIt simply\u00a0isn\u2019t true\u00a0that an algorithmic filter can be designed to remove the designers from the equation.&#8221;<\/p>\n<p>Facebook orders and ranks news information, which is doing the work of journalism, but they refuse to acknowledge they are doing the work of journalism. Facebook cannot take its own role in news seriously, and they cannot take journalism itself seriously, if they are unwilling to admit the degree to which they shape how news\u00a0appears\u00a0on the site. The most dangerous journalism is journalism that doesn\u2019t see itself as such.***<\/p>\n<p>Facebook&#8217;s line, \u201cit\u2019s not that we control NewsFeed, you control NewsFeed,\u201d exactly parallels the\u00a0ideological stance that informs the Facebook researchers&#8217; attempt to downplay Facebook\u2019s own role in sorting what people see. Coincidence? This erasing of their role in the structuring of personal, social, and civic life is being repeated in full force by the company, seemingly\u00a0like when politicians are given a party line to repeat on Sunday news shows.****<\/p>\n<p>Power and control are most efficiently maintained when they are made invisible. Facebook&#8217;s\u00a0ideological push to dismiss the very real ways they structure what users see and do is the company\u00a0attempting to simultaneously embrace control and evade responsibility. Their news team doesn\u2019t need to be competent in journalism because they don\u2019t see themselves as doing journalism. But Facebook <em>is <\/em>doing journalism, and the way they code their algorithms and the rest of the site <em>is<\/em>\u00a0structuring and shaping personal, social, and civic life. 
Like it or not, we&#8217;ve collectively handed very important civic roles\u00a0to social media companies, the one I work for included, and a real danger is that we can&#8217;t hope for them to be competent at these jobs when they won&#8217;t even admit to doing them.<\/p>\n<p><em>Nathan is on <a href=\"https:\/\/twitter.com\/nathanjurgenson\" target=\"_blank\">Twitter<\/a> and <a href=\"http:\/\/nathanjurgenson.com\/\" target=\"_blank\">Tumblr<\/a><\/em><\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p><em>*Note here that the Facebook researchers could have easily avoided this problem by inferring political orientation from what users post instead of only looking at those who state it explicitly. This would have resulted in a stronger research design, but also very bad press, probably something like &#8220;Facebook Is Trying to Swing the Election!&#8221;, which also may not be wrong. Facebook went for the less invasive measure here, but, and this is my guess, nothing more, they likely ran and haven&#8217;t published a version of this study with nearly all users by inferring political orientation, which is not difficult to do. \u00a0<\/em><\/p>\n<p><em>**I also work for a social media company (Snapchat), so I understand the conflicts involved\u00a0here, but there&#8217;s no reason that such an important variable in how the newsfeed is sorted should be left out of the report about the consequences of such sorting.<\/em><\/p>\n<p><em>***Some may not agree that what the newsfeed algorithm does is &#8220;journalism&#8221;. I think calling it journalism is provocative and ultimately correct. First, I don&#8217;t mean they are on the ground reporting, that is only one part of the work of\u00a0journalism, and Facebook&#8217;s role is another part. 
If you agree with me that any algorithm, including this one, is never neutral, disinterested, and objective\u00a0but instead built by humans with politics, interests, and insecurities, then this type of sorting of news information is certainly the work of journalism. Editing, curating, sorting, and ranking of news information are all part of journalism, and Facebook is making these decisions that influence how news is being produced, displayed, and most importantly ranked, that is, what is seen and what isn&#8217;t. Sorting a NEWSfeed is doing the work of journalism.<\/em><\/p>\n<p><em>****Of course, I don\u2019t know if this is stated explicitly for employees to repeat, or if it is just this deep in the company\u2019s ideology, and I don&#8217;t know which would be more troubling.<\/em><\/p>\n<p><em>Further reading:\u00a0Some\u00a0reactions from other researchers. Good to know I wasn&#8217;t alone in thinking this was very poor work.<\/em><\/p>\n<p><em><a href=\"https:\/\/medium.com\/message\/how-facebook-s-algorithm-suppresses-content-diversity-modestly-how-the-newsfeed-rules-the-clicks-b5f8a4bb7bab\" target=\"_blank\">Zeynep Tufekci,\u00a0How Facebook\u2019s Algorithm Suppresses Content Diversity (Modestly) and How the Newsfeed Rules Your Clicks<\/a><\/em><\/p>\n<p><em><a href=\"http:\/\/crookedtimber.org\/2015\/05\/07\/why-doesnt-science-publish-important-methods-info-prominently\/\" target=\"_blank\">Eszter Hargittai, Why doesn\u2019t Science publish important methods info prominently?<\/a><\/em><\/p>\n<p><em><a href=\"http:\/\/socialmediacollective.org\/2015\/05\/07\/the-facebook-its-not-our-fault-study\/\" target=\"_blank\">Christian Sandvig, The Facebook \u201cIt\u2019s Not Our Fault\u201d Study<\/a><\/em><\/p>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>With new research, Facebook evades responsibility by insisting that they are hosts to discussions, not the editors of 
information<\/p>\n","protected":false},"author":559,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[9967],"tags":[18606,13319,36361,942,38,36362,431],"class_list":["post-19947","post","type-post","status-publish","format-standard","hentry","category-commentary","tag-algorithms","tag-big-data","tag-data-science","tag-facebook","tag-methods","tag-positivism","tag-research"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/posts\/19947","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/users\/559"}],"replies":[{"embeddable":true,"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/comments?post=19947"}],"version-history":[{"count":19,"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/posts\/19947\/revisions"}],"predecessor-version":[{"id":19973,"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/posts\/19947\/revisions\/19973"}],"wp:attachment":[{"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/media?parent=19947"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/categories?post=19947"},{"taxonomy":"post_tag","embeddable":true,"href
":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/tags?post=19947"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}