Emotional Contagion is the idea that emotions spread throughout networks. If you are around happy people, you are more likely to be happy. If you are around gloomy people, you are likely to be glum.
The data scientists at Facebook set out to learn whether text-based, non-face-to-face interactions have similar effects. They asked: Do emotions remain contagious within digitally mediated settings? They worked to answer this question experimentally by manipulating the emotional tenor of users’ News Feeds and recording the results.
Public reaction was strongly negative: many expressed dismay that Facebook would 1) collect their data without asking and 2) manipulate their emotions.
I’m going to leave aside the ethics of Facebook’s data collection. It hits on an important but blurry issue of informed consent in light of Terms of Use agreements, and deserves a post all its own. Instead, I focus on the emotional manipulation, arguing that Facebook was already manipulating your emotions, and likely in ways far more effectual than algorithmically altering the emotional tenor of your News Feed.
Here is their full report. And here is the abstract:
Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. Emotional contagion is well established in laboratory experiments, with people transferring positive and negative emotions to others. Data from a large real-world social network, collected over a 20-y period suggests that longer-lasting moods (e.g., depression, happiness) can be transferred through networks [Fowler JH, Christakis NA (2008) BMJ 337:a2338], although the results are controversial. In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and that the observation of others’ positive experiences constitutes a positive experience for people.
In brief, Facebook made either negative or positive emotions more prevalent in users’ News Feeds, and measured how this affected users’ emotionally expressive behaviors, as indicated by users’ own posts. In line with Emotional Contagion Theory, and in contrast to “technology disconnects us and makes us sad through comparison” hypotheses, they found that indeed, those exposed to happier content expressed higher rates of positive emotion, while those exposed to sadder content expressed higher rates of negative emotion.
Looking at the data, there are three points of particular interest:
- When positive posts were reduced in the News Feed, people used .01% fewer positive words in their own posts, while increasing the number of negative words they used by .04%.
- When negative posts were reduced in the News Feed, people used .07% fewer negative words in their own posts, while increasing the number of positive words they used by .06%.
- Prior to manipulation, 22.4% of posts contained negative words, as compared to 46.8% which contained positive words.
Let’s first look at points 1 and 2—the effects of positive and negative content in users’ News Feeds. These effects, though significant and in the predicted direction, are really really tiny[i]. None of the effects even approach 1%. In fact, the effects are all below .1%. That’s so little!! The authors acknowledge the small effects, but defend them by translating these effects into raw numbers, reflecting “hundreds of thousands” of emotion-laden status updates per day. They don’t, however, acknowledge how their (and I quote) “massive” sample size of 689,003 increases the likelihood of finding significant results.
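To see why a “massive” sample matters, here is a minimal sketch. It is not the paper’s actual analysis; the sample sizes are hypothetical, and only the 22.4% baseline and the .04% shift come from the figures above. It runs a simple two-proportion z-test on the same tiny effect at increasingly large samples:

```python
# Minimal sketch: the same tiny effect (a 0.04 percentage-point rise
# in negative-word use over a 22.4% baseline) tested with a
# two-proportion z-test at increasingly "massive" sample sizes.
# The sample sizes are hypothetical, chosen for illustration only.
from math import sqrt

p_control = 0.224               # 22.4% baseline, from the post
p_treated = p_control + 0.0004  # +0.04 percentage points

for n in (10_000, 1_000_000, 100_000_000):  # observations per arm
    p_pool = (p_control + p_treated) / 2
    se = sqrt(p_pool * (1 - p_pool) * (2 / n))  # pooled standard error
    z = (p_treated - p_control) / se
    print(f"n = {n:>11,}  z = {z:.2f}  significant at .05: {abs(z) > 1.96}")
```

The identical effect is statistically invisible at ten thousand observations and wildly significant at a hundred million. At samples the size Facebook works with, nearly any real nonzero difference clears the .05 bar, which is why effect size, not statistical significance, is the number to watch.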
So what’s up with the tiny effects?
The answer, I argue, is that the structural affordances of Facebook are such that users are far more likely to post positive content anyway. For instance, there is no dislike button, and emoticons are the primary means of visually expressing emotion. Concretely, when someone posts something sad, there is no canned way to respond, nor an adequate visual representation. Nobody wants to “Like” the death of someone’s grandmother, and a frowny-face emoticon seems decidedly out of place.
The emotional tenor of your News Feed is small potatoes compared to the effects of structural affordances. The affordances of Facebook buffer against variations in content. This is clear in point 3 above, in which positive posts far outnumbered negative posts prior to any manipulation. The very small effects of the experimental manipulations indicate that the overall emotional makeup of posts changed little, even when positive content was artificially decreased.
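Some back-of-the-envelope arithmetic makes the point, using only the percentages quoted above. (The units differ slightly: share of posts at baseline versus share of words for the effects, so treat this as an order-of-magnitude contrast, not a precise ratio.)

```python
# Back-of-the-envelope comparison using only the percentages quoted
# in this post. Units differ slightly (percent of posts at baseline
# vs. percent of words for the effects), so this is an
# order-of-magnitude contrast, not a precise ratio.
baseline_positive = 46.8  # % of posts containing positive words
baseline_negative = 22.4  # % of posts containing negative words
largest_effect = 0.07     # largest manipulation effect, in percentage points

structural_gap = baseline_positive - baseline_negative
print(f"structural positivity gap: {structural_gap:.1f} points")
print(f"gap / largest manipulation effect: {structural_gap / largest_effect:.0f}x")
# -> a 24.4-point gap, roughly 349x the largest experimentally induced shift
```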
So Facebook was already manipulating your emotions—our emotions—and our logical lines of action. We come to know ourselves by seeing what we do, and the selves we perform through social media become important mirrors from which we glean personal reflections. The affordances of Facebook therefore not only shape emotive expressions, but also reflect back to users that they are the kind of people who express positive emotions.
Positive psychologists would say this is good; it’s a way in which Facebook helps its users achieve personal happiness. Critical theorists would disagree, arguing that Facebook’s emotional guidance is a capitalist tool which stifles rightful anger, indignation, and mobilization towards social justice. In any case, Facebook is not, nor ever was, emotionally neutral.
Jenny Davis is a weekly contributor for Cyborgology and an Assistant Professor of Sociology at James Madison University. Follow Jenny on Twitter: @Jenny_L_Davis
[i] Nathan Jurgenson pointed out just how tiny the effects were in an email thread. I then fumbled through an explanation that manifested in this post.
Headline pic via: http://upload.wikimedia.org/wikipedia/commons/1/15/Lab_coats.jpg
Comments
Jan — June 30, 2014
Great read!
I should preface this by saying I'm just a dude on the internet with minimal formal background in psychology and sociology ... disregard everything I say.
Ok, now, I think there might be an even more significant factor skewing Facebook post content towards positive emotions, one neither controlled by nor unique to Facebook: the nature of social media as a public and curated display of ourselves. The degree to which users are conscious of this varies, but we judge our peers and are judged by them based on the content of our profiles, and because of that we self-censor posts which would portray us negatively - we hide or ignore the bad and brag about the good.
I view the structure of Facebook to be more of a reflection of this nature than its cause, although it is somewhat of a positive feedback loop (pun intended) where the structure reinforces the nature and vice versa.
syed ali — July 2, 2014
this is the best thing i've read on the facebook manipulation. good stuff!