TW: discussion of gun violence. I do not provide any detailed descriptions of violent acts, nor do I use any slurs. Some of the links provided below do reference slurs, misogyny, racism, and homophobia.
The mass shooting that took place at Oregon’s Umpqua Community College on October 1st was simultaneously horrifying and unsurprising. As mass shootings, particularly at schools, become more and more the norm, the desperate search for answers continues. Gun control, mental health, school security, and more recently “toxic masculinity” are often cited as the underlying factors at work in these acts. Despite pleas from criminologists, psychologists, and even some media outlets to stop publicizing the identity of mass shooters—thought to be a significant motivation for these acts—each new shooting comes to dominate news media coverage for days, if not weeks, after the incident. more...
I am an invisible man. No, I am not a spook like those who haunted Edgar Allan Poe; nor am I one of your Hollywood-movie ectoplasms. I am a man of substance, of flesh and bone, fiber and liquids — and I might even be said to possess a mind. I am invisible, understand, simply because people refuse to see me. Like the bodiless heads you see sometimes in circus sideshows, it is as though I have been surrounded by mirrors of hard, distorting glass. When they approach me they see only my surroundings, themselves, or figments of their imagination — indeed, everything and anything except me…It is sometimes advantageous to be unseen, although it is most often rather wearing on the nerves. Then too, you’re constantly being bumped against by those of poor vision…It’s when you feel like this that, out of resentment, you begin to bump people back. And, let me confess, you feel that way most of the time. You ache with the need to convince yourself that you do exist in the real world, that you’re a part of all the sound and anguish, and you strike out with your fists, you curse and you swear to make them recognize you. And, alas, it’s seldom successful… ~Ralph Ellison (1952), Invisible Man
In what follows, I argue that the Black Lives Matter movement is a hacker group, glitching the social program in ways that disrupt white supremacy with glimpses of race consciousness. It is a group that combats black Americans’ invisibility; that “bumps back” until finally, they are recognized. As Ellison continues: more...
Authenticity is a tricky animal, and social media complicate the matter. Authenticity is that which seems natural, uncalculated, indifferent to external forces or popular opinion. This sits in tension with the performativity of everyday life, in which people follow social scripts and social decorum, strive to be likeable—or at least interesting—and constantly negotiate the expectations of ever expanding networks. The problem of performance is therefore to pull it off as though unperformed. The nature of social media, with its built-in pauses and editing tools, throws the semblance of authenticity into a harsh light. Hence, the widespread social panics about a society whose inhabitants are disconnected from each other and disconnected from their “true selves.”
For political campaigns, the problem of authenticity is especially sharp. Politicians are brands, but brands that have to make themselves relatable on a very human level. This involves intense engagement with all forms of available media, from phone calls, to newspaper ads and editorials, to talk show appearances, television interviews and now, a social media presence. The addition of social media, along with the larger culture of personalization it has helped usher in, means that political performances must include regular backstage access. Media consumers expect politicians to be celebrities, expect celebrities to be reality stars, and expect reality stars to approximate ordinary people, but with an extra dab of panache. The authentic politician, then, must be untouchable and accessible, exquisite and mundane, polished yet unrehearsed. Over the last couple of elections, social media has been the primary platform for political authenticity. Candidates give voters access to themselves as humans—not just candidates—but work to do so in a way that makes them optimally electable. It’s a lot of work to be so meticulously authentic. more...
In 1953, Hugh Hefner invited men between the ages of “18 and 80” to enjoy their journalism with a side of sex. It was Playboy’s inaugural issue, featuring Marilyn Monroe as the centerfold, and it launched an institution that reached behind drugstore counters, onto reality TV, and under dad’s mattresses. It was racy and cutting edge and ultimately, iconic. Posing for Playboy was a daring declaration of success among American actresses and the cause of suspension for a Baylor University student [i]. But edges move, and today, Playboy vestiges can be found on the Food Network.
In August, Playboy stopped showing nude images on their website. The New York Times reports that viewership subsequently increased from 4 million to 16 million. That’s fourfold growth! In what can only be described as good business sense, the company announced that in March, they will stop including nude women in their magazine as well. Putting clothes on appears surprisingly profitable. more...
Why don’t we ever talk about taking over social media companies? We will boycott them, demand transparency measures, and even build entire alternative networks based on volunteer labor, but no one ever seems to consider taking all the servers and data sets away from the Mark Zuckerbergs of the world and putting it all in the hands of the users. Even if a company were doing a bang-up job making their products easier to use, freer from harassment, and more productive in creating a better society, there’s still something fundamentally creepy about users having no democratic control over such an important aspect of their lives. Why is there no insistence that such important technologies have democratic accountability? Why are we so reluctant to demand direct control over the digital aspects of our lives? more...
Before leaving his post as Australia’s Education Minister, Christopher Pyne approved a major restructuring of the public school curriculum. The new plan makes code and programming skills central. In a statement released by Australia’s Department of Education and Training at the end of September, Pyne laid out plans to disburse $12 million for:
The development of innovative mathematics curriculum resources.
Supporting the introduction of computer coding across different year levels.
Establishing a P-TECH-style school pilot site.
Funding summer schools for STEM students from underrepresented groups.
From grade 5, students will learn how to code. From grade 7, they will learn programming. What they will no longer be required to learn, however, is history and geography. The new plan replaces these heretofore core subjects with the technical skills of digital innovation. more...
At this point everyone is undoubtedly aware of the school shooting at Umpqua Community College in Oregon, though I am certain the method by which we came across the news varied. Some of us were notified of the event by friends with fingers closer to the pulse, and still more of us learned about it through a variety of content aggregation platforms, such as Reddit.
I would hazard that the majority of people learned about it first and foremost through social media; primarily Facebook or Twitter. I certainly did. A friend first told me about it through Facebook Messenger, and almost immediately after she did, I started to see articles trickling into my newsfeed in the background of my Messenger window. And the moment that happened I shut my Facebook tab down, despite the fact that I compulsively, addictively, have it open almost all the time.
Facebook, taken as a whole, is a fantastic way for people to compare each other’s lives and share pictures of kittens and children, but when it comes to a tragedy, the platform is woefully inadequate at allowing its users to parse and process the gravity of events as they unfold. It is a thorough shock to a system that thrives on irreverent links and topical memes, and when faced with an item that requires genuine reflection and thought, it is often simpler – indeed, even more beneficial – for users to turn their heads until the kittens may resume. more...
The New York Times editors, as Claude Fischer wrote yesterday, “have their meme and they will ride it hard.” That meme is Sherry Turkle, the MIT psychologist who has built a cottage industry (a faraway, disconnected cottage on the shores of Cape Cod, no doubt) around pathologizing the bad feelings people get when everyone around them is on their phones. Fischer does a really superb job of laying out what is wrong with this latest round of Turkle fanfare, so you should go read his piece on his blog, but I want to draw out and add to one point that he makes about the “death of conversation” being an evergreen topic for decades.
I have an article coming out in First Monday in about a month but there is a section that I want to quote from just because I think it is especially relevant to this issue of conversation, attention, and their vulnerability to new technologies. The article argues that online/offline states should be seen as social relationships among groups and not the binary states of an individual. To that point I show how cultural, political, and economic reactions to railroad lines mirror the experiences we have with the Internet today. What follows is a small section about what sorts of social and cultural effects were attributed to railroads: more...
I’d like to offer a friendly rebuttal to Jenny Davis’ recent essay in which she argues against the use of trigger warnings in favor of other signifiers of content that may cause people to relive trauma in unproductive ways. Davis proposes “an orientation towards audience intentionality among content producers, content distributors, and platform designers” as a potential path out of the mire of trigger warning debates, such as ending the use of clever titles that mask potentially disturbing content or doing away with autoplay. I think we can all agree that autoplay needs to go. Seriously, please stop forever.
I agree with Davis that trigger warnings don’t do enough; they often fail to take into account the wide variety of trauma triggers and leave content producers and distributors in the position of mind reading to predict what content will be triggering for whom. But the same argument can be made for nearly every social convention designed to mitigate harm. Laws against hate crimes will be unevenly enforced, warnings on labels will leave out chemicals not yet known to be harmful, and seatbelts will not prevent all automobile accident deaths. Yet the fact that someone will consume more than three alcoholic drinks per day while taking ibuprofen does not spur vigorous debate around the utility of the warning.
We put up with half measures all the time, in nearly every facet of social life. So why are trigger warnings so divisive, as Davis rightly points out? She compares the use of trigger warnings to picking a fight, but who fired the first shot? If using a trigger warning on content is a political decision, and I agree that it is, what are its politics? Davis argues that trigger warnings tell “a contingent of consumers to go screw.” But why are trigger warnings read that way by so many people? To my mind, there is nothing obviously offensive about writing “TW: anti-black violence” at the beginning of an essay or before an item on a syllabus. There is no clear reason why someone might read that phrase and be so turned off to the content itself that they refrain from reading it—unless, of course, they have experienced some trauma that gives them cause to.
Trigger warnings often spur disagreement that descends into a discussion of the warning itself, rather than the content. But what causes this divisiveness in the first place? I believe that trigger warnings are divisive because they suggest that we are responsible to each other to foster an environment of mutual respect, because they demand that we empathize with individuals in ways that, as Davis points out, we cannot predict or imagine. This responsibility and act of empathy is absolutely counter to the dominant neoliberal paradigm of individual responsibility and unmitigated competition that informs so much of our social imaginary. The idea that your pain might be, at least in part, my fault because I failed to do something as basic as type a few extra words in a syllabus is anathema to a culture that expects us all to train ourselves to be savvy consumers with bootstraps made for pulling.
Davis argues that trigger warnings often turn into “a self-congratulatory monologue” and, as with so much of progressive politics, she is absolutely right. She also argues that there is a slippery slope lurking on the horizon in which all content will be tagged with so many trigger warnings that they simply become noise; again, I agree. These are problems to contend with, but they are not so insurmountable that we have to dispense entirely with the trigger warning as a tool that is limited but still useful. Self-congratulatory monologues are unavoidable, but not a problem inherent to trigger warnings. Useful content often proliferates to the point of being mere noise but, frankly, I don’t see that happening any time soon with regard to trigger warnings. They are still very rarely used in mainstream news outlets, classroom syllabi, or literary works.
A frequent argument against the use of trigger warnings is that they are patronizing or, as Davis says, “paternalistic.” Now, patronizing generally means condescending, imposed from above by someone in authority. But it is those who have experienced trauma that demand trigger warnings in the first place. So who is patronizing whom? I argue that it is, in fact, patronizing to assert that individuals who have experienced trauma should prioritize the annoyance, or even hostility, of those who dislike trigger warnings above their own needs regarding mental wellness.
Early in her essay Davis asserts that “we all agree that people who have experienced trauma should not have to endure further emotional hardship in the midst of a class session nor while scrolling through their friends’ status updates.” However, many of the essays arguing against trigger warnings argue exactly the opposite. For example, this much-circulated essay in The Atlantic discusses exposure therapy and argues that students with PTSD should be exposed to their triggers in a classroom setting since “the world beyond college will be far less willing to accommodate requests for trigger warnings and opt-outs.” The author fails to point out, however, that exposure therapy usually takes place under relatively controlled conditions and in increments; in other words, definitely not a classroom environment.
Thus far I’ve tried to speak from my perspective as a scholar, a teacher, and as someone who is careful about how they share content on social media. I’d like to break from that perspective briefly and, if you’ll indulge me, speak as someone who has experienced trauma. I have had entire days robbed from me in the course of reading online because an essay or video failed to prepare me for triggering content. It is not merely a question of being unsettled, sad, or disturbed. For many, whether they are diagnosed with PTSD or not, it is a question of losing bodily autonomy, of descending into a panic attack and hyperventilating until you cannot use your hands or feet. It can result in more than just lost sleep or skipped meals, but self-harming and even suicidal behavior. When I see a trigger warning, I do not feel patronized—I feel respected. I am being given the informed choice to consume or not consume content that may rob me of my peace of mind. Davis ends her essay by saying “perhaps the best way we can care for one another is by helping and trusting each person to care for hirself.” This, I argue, is exactly what trigger warnings do by design. They don’t censor content (it’s still there!); they label it in a way that those most affected by it have found, and continue to argue, is liberatory.
Britney Summit-Gil is a graduate student in Communication and Media at Rensselaer Polytechnic Institute. She tweets occasionally at @beersandbooks.
I am sick of talking about trigger warnings. I think a lot of people are. The last few months have seen heated debates and seemingly obligatory position statements streaming through RSS and social media feeds. I even found a piece that I had completely forgotten I wrote until I tried to save this document under the same name (“trigger warning”). Some of these pieces are deeply compelling and the debates have been invaluable in bringing psychological vulnerability into public discourse and positioning mental health as a collective responsibility. But at this point, we’ve reached a critical mass. Nobody is going to change anyone else’s mind. Trigger warning has reached buzzword status. So let’s stop talking about trigger warnings. Seriously. However, let’s absolutely not stop talking about what trigger warnings are meant to address: the way that content can set off intense emotional and psychological responses and the importance of managing this in a context of constant data streams.
I’m going to creep out on a limb and assume we all agree that people who have experienced trauma should not have to endure further emotional hardship in the midst of a class session nor while scrolling through their friends’ status updates. Avoiding such situations is an important task, one that trigger warnings take as their goal. Trigger warnings, however, are ill equipped for the job. more...
We live in a cyborg society. Technology has infiltrated the most fundamental aspects of our lives: social organization, the body, even our self-concepts. This blog chronicles our new, augmented reality.