The editors of Jezebel did a really brave thing yesterday and called out their parent company, Gawker Media, for not dealing with a very serious and persistent abuse and harassment problem. For months now, waves of violent pornography gifs have been posted to Jezebel stories using anonymous accounts untied to IP addresses or any other identifiable information. That means it’s effectively impossible to stop abusive people from posting to the site. Instead, Jezebel writers and editors have to delete the posts themselves, hopefully before too many of their readers see them. People higher up on the Gawker masthead have known about this issue and have, through inaction, forced their co-workers to look at this horrific and potentially triggering content instead of dealing with the problem. This is precisely how spaces and tools meant for everyone turn into alienating environments that foster homogenous audiences and viewpoints. Gawker needs to help its editors defend against harassment, and fast, but it should also be thinking more comprehensively about the culture of comments.
Obviously, first and foremost, this kind of harassment needs to be recognized as the persistent problem that it truly is, not just at Gawker but in digital publishing writ large. Lindy West, a Jezebel staff writer, described this all-too-common situation in the comments of the original post:
At pretty much every blogging job I’ve ever had, I’ve been told (by male managers) that it’d be a death sentence to moderate comments and block IP addresses, because it “shuts down discourse” and guts traffic. But no one’s ever shown me any actual numbers that support that claim. Does anyone have any? Not that I think traffic should trump employee safety anyway, but I’d love for someone to prove to me that it’s more than just a cop-out.
The false dichotomy West is describing also obscures the fact that harassment hurts readership and most certainly “shuts down discourse.” The only difference is who gets shut down.
When it comes to figuring out how to resolve these issues, I’m a big fan of following the lead of the people experiencing and speaking out about their oppression. They are generally the best positioned to fully understand the complexities of the problem at hand, and we should at the very least begin by doing what Jezebel’s staff suggests: adding IP-banning functionality to all comments on the Kinja publishing platform that Gawker utilizes. Gawker’s editorial director Joel Johnson tweeted today that they had suspended image posting altogether until they figure out a more permanent solution.
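For readers unfamiliar with what the staff is asking for, the mechanism is simple. Here is a minimal sketch of an IP-ban check in Python; the names (`BANNED_IPS`, `ban_ip`, `allow_comment`) are illustrative, not Kinja’s actual API, and the catch the Jezebel post itself identifies is that Kinja’s burner accounts aren’t tied to IP addresses in the first place, so the platform would have to start recording them before a check like this could work:

```python
# Hypothetical sketch of moderator-driven IP banning.
# BANNED_IPS, ban_ip, and allow_comment are made-up names for illustration.

BANNED_IPS: set[str] = set()

def ban_ip(ip_address: str) -> None:
    """Add an address to the ban list, e.g. after a moderator flags abuse."""
    BANNED_IPS.add(ip_address)

def allow_comment(ip_address: str) -> bool:
    """Reject posts from banned addresses before they ever reach the page."""
    return ip_address not in BANNED_IPS

ban_ip("203.0.113.7")          # moderator bans an abusive address
print(allow_comment("203.0.113.7"))   # banned address is blocked
print(allow_comment("198.51.100.2"))  # everyone else is unaffected
```

It is not a complete solution (banned users can switch addresses), but it raises the cost of abuse and, crucially, moves the burden off the writers who currently delete each post by hand.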
In the world of Science and Technology Studies, we would call IP banning and suspending images a “tech fix.” A tech fix doesn’t get at the root of the problem (e.g. misogyny, rape culture), but it generally mitigates a severe symptom while also bringing into stark relief the kinds of problems our society is equipped to deal with. The availability of a tech fix, and the ease with which it can be implemented, reveal a lot about what we collectively value and ignore. So while suspending image posting is a good temporary fix and the IP ban is certainly a longer-term option, we should also give due consideration to why there aren’t lots of readily available tech fixes to this problem: fixes that might, at least in this case, let Jezebel continue to benefit from the anonymous commenters who are crucial for whistle-blowing and story development.
Kinja’s inability to help Jezebel staff deal with harassment is even more unforgivable when compared to all the amazing technologies we’ve developed to solve very similar (but less gendered) problems. Take, for example, spam. Junk messages don’t only show up in your firstname.lastname@example.org account; they are constantly barraging anything with a “post” or “send” button. Administrators of wikis and other interactive publishing platforms have to fight spam all the time. To an admin, spam behaves more like digital weather than an annoying business model. Offers for Oakley sunglasses and Louis Vuitton handbags are always raining down on your site, so most admins install Akismet or a captcha to discriminate between humans and bots. There are at least a handful of really effective spam blockers for just about every platform you use on a daily basis. Why is spam so easy to block while harassment goes virtually unchecked?
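Part of the answer is that commercial spam is mechanically predictable. A toy pattern-based check in the spirit of tools like Akismet might look like the sketch below; the patterns are made up for illustration, and real filters combine many more signals (submission rate, sender reputation, learned models) than simple keyword matching:

```python
import re

# Illustrative only: spam follows repetitive commercial patterns that
# machines can match. These patterns are invented for the example.
SPAM_PATTERNS = [
    re.compile(r"oakley\s+sunglasses", re.IGNORECASE),
    re.compile(r"louis\s+vuitton", re.IGNORECASE),
]

def looks_like_spam(comment: str) -> bool:
    """Flag a comment if it matches any known commercial spam pattern."""
    return any(pattern.search(comment) for pattern in SPAM_PATTERNS)

print(looks_like_spam("Cheap Oakley Sunglasses, buy now!"))       # flagged
print(looks_like_spam("Great reporting, thank you for this."))    # passes
```

Targeted harassment from humans actively evading detection is a far harder classification problem than catching handbag ads, which is part of why off-the-shelf fixes are scarce; but the gap also reflects which problems platforms have chosen to invest in solving.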
Lindy West’s comment gives us one side of the troll coin: people in management positions just don’t prioritize the problem, or see it as enough of a problem to devote serious thought to dealing with. The other side of the coin, also expertly described by West, is that trolls aren’t like the weather, “internet trolling is not random—it is a sentient, directed, strong-armed goon of the status quo. And the more we can hammer that truth through the public consciousness, the sooner we can affect the widespread cultural change we need to begin tamping down online hate speech.”
West’s suggested cultural change (or Cultural Fix [PDF]) is to engage trolls with the intention of humanizing everyone in the conversation. Talking about people as if they’re monsters, and then assuming those monsters will show up in comments with the inevitability of swallows returning to Capistrano, reinforces a sexism-without-sexists worldview. Linda Layne (in the PDF linked above) notes that in the face of seemingly inevitable problems, what is needed are rituals and social mores that acknowledge the problem but help everyone recover. Gawker has shown an interest in radically rethinking how commenting technology works, but has done comparatively little to rethink the culture surrounding comments. Is such a campaign possible?
Gawker has a lot of money. They can experiment with a lot of options and build a campaign of campaigns. Hire someone to do nothing but filter out harassment and make judgments about threats to authors. Pay researchers to figure out how and why this kind of harassment happens in the first place and build a public media campaign to stop it. Just don’t stop at the tech fix.
When Adrian Chen tweeted a link to the Jezebel article adding “I see Gawker Media’s ‘become more like Reddit’ strategy is coming along nicely,” he was being much more than glib. As I have written before, the technological affordances of a site and the culture of its user base are mutually shaping systems. A site that affords anonymity in the service of attention will always maintain hegemonic discourse. This is why, while Kinja definitely needs a better set of moderation tools, we also need to pay attention to the kind of culture engendered by the rest of the site. Who feels more at home in a competition for attention? Who feels more comfortable opening up within the safe bounds of digital anonymity? We’ll have better conversations if we think about and act on these kinds of questions.