I also did it because I was curious to see what it would toss up. What does Facebook consider “my year”? What does it think is noteworthy? I didn’t know, at the time, how the specific algorithms involved actually operated. So I just hit yeah sure and waited to see what it spit back at me.
Makeup selfies, mostly. So… that was 2014 for me. Okay.
(It sort of was)
But the other thing I noticed – which is old news by now – was that a huge amount of what I did and said on Facebook in 2014 – much of which I consider pretty important – just wasn’t there at all. What I was looking at was not 2014 for me (apart from the makeup selfies). Algorithms took my own vaguely defined narrative regarding what I understood as my year and imposed their own meaning on it before presenting it to me as if I should be pleased.
Some of my friends seemed to be. My reaction was mostly “…huh.”
Which is a far cry from Eric Meyer, for whom Facebook selected a photo of his dead daughter as the cover for his year in review, which – for obvious reasons – was enormously painful to him. The term he used in reference to this is “algorithmic cruelty”, which has since picked up a lot of usage because it works really well as a descriptor. But I’m specifically interested in why this happened at all, not necessarily in terms of these particular algorithms or even algorithms in general but more in terms of why we do these things. Why we feel the need to have them. Facebook probably wouldn’t have done this at all if they didn’t perceive that it was something people might be inclined to do anyway.
I think a lot of us do some kind of wrap-up or review at the end of every year. We want to do some kind of stock-taking. I think there are a number of emotional and psychological reasons for this, and I don’t think they’re just the individual self engaging in monologue, but I think primary among these reasons is the desire to solidify a self-narrative, to understand who we are and what this year has made us, and where we might go next in the new one. Which is obviously extremely arbitrary storytelling – we’re never as coherent and self-consistent as we like to pretend we are. Self-narratives may not be false, but they’re always biased, and they’re always massive oversimplifications.
Our stories don’t always make a whole lot of sense. We don’t like to admit that to anyone, least of all ourselves.
The end of a year marks a point of wholeness, a roundness, at which this kind of storytelling seems appropriate.
The thing is, when we do these kinds of self-motivated years-in-review, we’re the ones setting the terms, and we’re the ones who want to understand ourselves. We get to decide what’s important and what’s worthy of being included. We have the power to curate our own histories.
That power is not equally distributed. The question of who has the right and the ability to tell their own stories is profoundly shaped by social power and inequality, and this is true on both micro and macro levels. The personal histories of individuals are stolen and erased by the same processes that do the same to entire cultures. The question is always who gets to tell their own stories and why, but it’s also who doesn’t, and who’s preventing them from doing so.
This isn’t just about algorithms being thoughtlessly cruel, exhibiting the problems that algorithms so often have. This is about something powerful – Facebook – claiming that power in order to tell us a story about ourselves, the terms of which we don’t really control. I’m not saying this is actively oppressive in the same way as what I described above. I am saying it’s problematic, and it’s worth paying attention to in this sense.
Powerful institutions deciding the terms under which our lives are arranged and understood isn’t new. What sets this apart, in my opinion, is that it’s a story Facebook is telling to you, and to your friends. Here, this is what you look like. This is who you are.
I haven’t begun to tease out all the implications of this, but I think there are a lot of them and I think they’re troubling. Storytelling of this kind – this is who I am, this is what I’m about, this is where I’ve been and where I’m going – is historically profoundly communal. Generally, even constrained by the storytelling medium in which we work, we have a fair amount of control over what that story looks like. In some cases we have more than we did before. Not in this case.
So. Yeah.
Man, my makeup was awesome this year.
Sarah is self-absorbed on Twitter – @dynamicsymmetry
9 Comments
“Here’s 2014!” “…Really?” – Facebook’s algorithmic storytelling - Treat Them Better — December 29, 2014
[…] “Here’s 2014!” “…Really?” – Facebook’s algorithmic storytelling […]
Comradde PhysioProffe — December 29, 2014
I'm an old fucker, so I exchange physical holiday cards by US mail every year with many dozens of people. Only a very few of them include "year-in-review" letters (I don't), and I hate those fucken things and never read them. The idea that Facebook is forcing every Facebook user to publish such a letter without even controlling its content (if I'm understanding what they're doing correctly) is deeply offensive to me.
drcab1e — December 30, 2014
Great piece, and I'd normally be saying so profusely in your mentions, but Twitter holiday.
I love the idea of the distress/discomfort coming from having another entity's narrative forced on you, rather than from anything inherently distressing or uncomfortable in the content of this narrative (though obviously this plays a part).
Any idea where I should head for more reading/work on this? Academic/non-academic/art/tumblr rants/etc?
(also your selfie and makeup games never let us down)
“Here’s 2014!” “…Really?” – Facebook’s algorithmic storytelling | Things that I liked enough to 'save for later' that maybe you'll like, too. — December 31, 2014
[…] http://thesocietypages.org/cyborgology/2014/12/29/heres-2014-really-facebooks-algorithmic-storytelli… […]
Jill Walker Rettberg — December 31, 2014
Algorithms telling our stories FOR us is a major point in my book, Seeing Ourselves Through Technology: How We Use Blogs, Selfies, and Wearable Devices to See and Shape Ourselves (Palgrave 2014). You can buy it in print at your favorite online bookstore or download for free as it's published on Palgrave, open access: http://jilltxt.net/books or http://www.palgraveconnect.com/pc/doifinder/10.1057/9781137476661
Or for a free kindle version http://www.amazon.com/Seeing-Ourselves-Through-Technology-Wearable-ebook/dp/B00O4CHBKM
I wrote about last year's Facebook year in review – remember the videos they made? See for instance p 10, 24, 31, 46, although I discuss the issue of what I call "Automated Diaries" in more depth in the chapter of that name, where I talk about lifelogging apps and cameras like the Narrative Clip that take a photo every 30 seconds and promise to algorithmically determine the photos that best tell the story of your day. The idea that algorithms can tell the truth about us better than we can ourselves is becoming very pervasive, especially in the quantified self movement, but also elsewhere.
Reading your piece, though, Sarah, I'm thinking maybe it's not just algorithms like Facebook's Year in Review but just as much the assumption that we would even WANT such an end of year story. I'm guessing that might be quite culturally specific? My family never did this – is the year in review holiday letter an American thing? A west coast US thing? An Anglo-American thing? A Western thing? Obviously very far from ALL Americans do, and I know that some Norwegians do it. I see craft blogs and some other blogs do end of year wrap ups of the best posts. Is this a genre inherited from personal end of year letters, or from magazines and TV shows that do end of year summaries of the best photos etc? Maybe someone did a history of end of year summaries, or of holiday cards (why DO we send holiday cards, anyway? Why not some other time of year?). Clearly there's a sense of ending something provisionally – like a chapter, not a life – and beginning something new – but is it connected to an era of serial storytelling? Print, and then electronic, and then digital? Did people make New Year's resolutions in the Middle Ages?
Jill Walker Rettberg — December 31, 2014
Ugh, I must not have closed a link in that comment, Sarah; if you're able to fix it, it's much appreciated.
From the world pool: December 31, 2014 | — December 31, 2014
[…] “Here’s 2014!” “…Really?” – Facebook’s algorithmic storytelling. “This isn’t just about algorithms being thoughtlessly cruel, having the problems that algorithms have an enormous amount of the time. This is about something powerful – Facebook – claiming that power in order to tell us a story about ourselves, the terms of which we don’t really control.” […]
Philip Gedarovich — March 26, 2015
Hi Sarah,
I'm an interaction design studies student who has recently come across a number of your articles and I must say, first of all, they are really on point and engaging, and completely align with the media theory research I am doing for my Masters. Awesome stuff, please keep writing!
I too am deeply concerned about Facebook's decision to publish these "years in review" in which you have no say in the content generated. I've been speaking with my colleagues quite a bit lately about the impact and repercussions of algorithms, and this is a perfect example of the intent of an algorithm producing an unexpected, negative response. While slightly off topic, I often wonder who is to blame in these situations: the corporation behind the algorithm, the programmer building it, or the algorithm itself (in this case I think we can definitely blame Facebook). It may be odd to think of a non-sentient, non-subjective, "emotionless" algorithm as being the thing to blame (I mean, it's like blaming your vacuum robot for not cleaning up all the crumbs from under your refrigerator). But what happens when the algorithms get so complex that the original programmers, designers, and corporate content creators have no say in their functioning anymore? Theoretically, algorithms do "evolve" in the sense that they "learn" and adjust their programming dynamically, in some cases on their own. They are built to do so. If that is the case, what does it mean to hold the algorithm accountable? I wonder if in the future Zuckerberg or whoever will try to use this as a defense against a lawsuit in court. "The algorithm grew beyond our control. It's not our fault."
Food for thought :)