Emotional Contagion is the idea that emotions spread throughout networks. If you are around happy people, you are more likely to be happy. If you are around gloomy people, you are likely to be glum.
The data scientists at Facebook set out to learn if text-based, nonverbal/non-face-to-face interactions had similar effects. They asked: Do emotions remain contagious within digitally mediated settings? They worked to answer this question experimentally by manipulating the emotional tenor of users’ News Feeds, and recording the results.
Public reaction was strongly negative: many expressed dismay that Facebook would 1) collect their data without asking and 2) manipulate their emotions.
I’m going to leave aside the ethics of Facebook’s data collection. It hits on an important but blurry issue of informed consent in light of Terms of Use agreements, and deserves a post all its own. Instead, I focus on the emotional manipulation, arguing that Facebook was already manipulating your emotions, and likely in ways far more effectual than algorithmically altering the emotional tenor of your News Feed.
First, here is an excerpt from their findings:
In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred.
In brief, Facebook made either negative or positive emotions more prevalent in users’ News Feeds, and measured how this affected users’ emotionally expressive behaviors, as indicated by users’ own posts. In line with Emotional Contagion Theory, and in contrast to “technology disconnects us and makes us sad through comparison” hypotheses, they found that indeed, those exposed to happier content expressed higher rates of positive emotion, while those exposed to sadder content expressed higher rates of negative emotion.
Looking at the data, there are three points of particular interest:
- When positive posts were reduced in the News Feed, people used .01% fewer positive words in their own posts, while increasing the number of negative words they used by .04%.
- When negative posts were reduced in the News Feed, people used .07% fewer negative words in their own posts, while increasing the number of positive words by .06%.
- Prior to manipulation, 22.4% of posts contained negative words, as compared to 46.8% which contained positive words.
Let’s first look at points 1 and 2 — the effects of positive and negative content in users’ News Feeds. These effects, though significant and in the predicted direction, are really really tiny. None of the effects even approach 1%. In fact, the effects are all below .1%. That’s so little! The authors acknowledge the small effects, but defend them by translating these effects into raw numbers, reflecting “hundreds of thousands” of emotion-laden status updates per day. They don’t, however, acknowledge how their (and I quote) “massive” sample size of 689,003 increases the likelihood of finding significant results.
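To see why effects this tiny can still come out "significant," here is a minimal sketch — my own illustration, not the authors' actual analysis, and the word counts are assumed orders of magnitude only. With tens of millions of words per condition, a difference of .07 percentage points easily clears a significance test that the same difference in a modest sample would not come close to passing.

```python
# Hypothetical illustration of large-sample significance (not Facebook's
# actual statistical analysis): a two-proportion z-test on an assumed
# baseline of ~22.4% negative words, shifted by 0.07 percentage points.
from math import sqrt, erfc

def two_proportion_z(p1, p2, n1, n2):
    """Two-sided z-test for a difference in proportions (normal approximation)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided tail probability
    return z, p_value

# Assumed scale: 50 million words per condition (the study analyzed
# word counts across hundreds of thousands of users' posts).
n = 50_000_000
z, p = two_proportion_z(0.2247, 0.2240, n, n)
print(f"0.07-point effect at n = 50M words: z = {z:.1f}, p = {p:.1e}")

# The identical effect in a modest sample is nowhere near significant.
z_small, p_small = two_proportion_z(0.2247, 0.2240, 1_000, 1_000)
print(f"same effect at n = 1,000: z = {z_small:.2f}, p = {p_small:.2f}")
```

The point of the sketch: the p-value shrinks as the sample grows even when the effect itself stays fixed and trivially small — which is why "significant" and "meaningful" come apart at a sample size of 689,003.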
So what’s up with the tiny effects?
The answer, I argue, is that the structural affordances of Facebook are such that users are far more likely to post positive content anyway. For instance, there is no dislike button, and emoticons are the primary means of visually expressing emotion. Concretely, when someone posts something sad, there is no canned way to respond, nor an adequate visual representation. Nobody wants to “Like” the death of someone’s grandmother, and a frownie-face emoticon seems decidedly out of place.
The emotional tenor of your News Feed is small potatoes compared to the effects of structural affordances. The affordances of Facebook buffer against variations in content. This is clear in point 3 above, in which positive posts far outnumbered negative posts prior to any manipulation. The very small effects of the experimental manipulations indicate that the overall emotional makeup of posts changed little during the study, even when positive content was artificially decreased.
So Facebook was already manipulating your emotions — our emotions — and our logical lines of action. We come to know ourselves by seeing what we do, and the selves we perform through social media become important mirrors from which we glean personal reflections. The affordances of Facebook therefore not only shape emotive expressions, but also reflect back to users that they are the kind of people who express positive emotions.
Positive psychologists would say this is good; it’s a way in which Facebook helps its users achieve personal happiness. Critical theorists would disagree, arguing that Facebook’s emotional guidance is a capitalist tool which stifles rightful anger, indignation, and mobilization towards social justice. In any case, Facebook is not, nor ever was, emotionally neutral.
Jenny Davis is an Assistant Professor of Sociology at James Madison University and a weekly contributor to Cyborgology, where this post originally appeared. You can follow her on Twitter.
Comments (13)
Bill R — July 2, 2014
The results appear to be significant but meaningless, more interesting for use in explaining the artifacts of large sample sizes and "significance" definitions than in theorizing about the content of the study.
Amber Largo — July 2, 2014
The question also arises as to whether people exposed to more positive posts are *feeling* more positive, or just feeling the pressure to make their lives seem positive too. Negative posters may feel safer expressing their negative emotions when they see others doing it too.
Heather — July 2, 2014
this is interesting and all, but I'm stuck on the part where Facebook is curating my newsfeed for me. Where is the "just show me everything the people I connect with on facebook post" setting?
guest — July 2, 2014
I think I'm the only one feeling this way - I'm less irritated by the data-mining, and far more by the fact that I use Facebook to do one job - tell me what my friends post. I purposely keep my friends list small so that I can read everything that everyone writes and keep up. It makes me MUCH angrier to know that Facebook just decides not to show me stuff (not even just for the experiment, apparently it does this all the time) that could have been important. How detailed was this program? If I was in the 'happy' group, could FB have refused to show me a post from a friend in trouble who could have needed me?
LLA — July 3, 2014
Ok, but what about jealousy and "FOMO" (the dreaded social-media-caused Fear Of Missing Out), where all your friends seem to be having the most wonderful, perfect, blessed lives full of vacations and weddings and coffee instagrams while your life just sucks by comparison?
Lots of people feel more depressed, jealous, or less confident when they see their friends *seemingly* doing well with new jobs and new cars, etc. It often seems like a social bragging competition, rather than positive posts curating a positive emotional environment.
And indeed, social media bragging, or life-washing to make your life seem better, seems to be rampant on various feeds. It's almost as if some people are less concerned with living happy, healthy lives than having a social media image where they appear to do so.
Facebook | Piškotarna — July 3, 2014
[…] News: Facebook has been manipulating your emotions for a long time; […]
Is Facebook Experimenting On You? | SociologyInFocus — July 7, 2014
[…] Facebook is manipulating your emotions. That was the gist of the news stories that broke this week after Facebook published a study on emotional contagion. As Dr. Jenny Davis said in her excellent summary of the study, […]