One of the most prominent theorists of the late 20th century, Michel Foucault, spent a career asking his history students to let go of the search for the beginning of an idea. “Origins” become hopelessly confused and muddled with time; they gain accretions that ultimately distort any pure search for the past on the terms of the past. Instead, he focused on how these accretions distorted the continuity behind any idea. Nietzsche called this method “genealogy,” and Foucault’s essay expanded on its use. Dawn Shepherd captured the significance of this lesson in a beautiful, single sentence: “Before we had ‘netflix and chill ;)’ we just had ‘netflix and chill.’”

The temptation with something as recent as the web is to emphasize the web’s radical newness. Genealogy asks that we resist this temptation and instead carefully think about the web’s continuity with structures far older than the web itself. While genealogy is not about the origins of “chill,” it emphasizes the continuity of “chill.” Genealogy must build from an idea of what “chilling” entailed to say something about what “chill” means now.

These continuities animated many of the conversations at Theorizing the Web 2016. Both the keynote panels and regular sessions asked audiences to imagine the web as part of society, rather than outside of it. In the words of its founders, the original premise of the conference was “to understand the Web as part of this one reality, rather than as a virtual addition to the natural.”


Outrage over the Bob Marley Snapchat filter was swift following its brief appearance on the mobile application on April 20 (the “420” pot-smoking holiday). The idea of mimicking Bob Marley in appreciation of a day dedicated to smoking marijuana enabled users to don the hat, dreads, and…blackface!? News outlets covered the issue quickly: CNNMoney and The Verge noted the negative reactions voiced on social media, and Wired published a brief article condemning the filter as racially tone-deaf.

The racial implications of the Bob Marley filter are multifaceted, yet I would like to focus on the larger cultural logic at work both above and behind the scenes at an organization like Snapchat. The creation of a filter that tapped into blackface iconography demonstrates the complexity of our relationship to various forms of technology – as well as how we choose to represent ourselves through those technologies. French sociologist Jacques Ellul wrote in The Technological Society of ‘technique’ as an encompassing train of thought or practice, based on rationality, that achieves its desired end. Ellul spoke of technique in relation to advances in technology and human affairs in the aftermath of World War II, yet his emphasis was not on the technology itself but rather on the social processes that informed it. In relation to a mobile application like Snapchat, this means that we bring our social baggage with us when we use it, and so do developers when they design a new filter. Jessie Daniels addresses racial technique in her current projects on colorblind racism and the internet, in which the default for tech insiders is a desire not to see race. This theoretically rich work pulls us out of the notion that technology is neutral within a society that has embedded racial meanings flowing through various actors and institutions, a society where those who develop the technology we use on a daily basis are unprepared to acknowledge the racial disparities that persist, or the racial prejudice that can—and does—permeate their designs.


I only heard the term “blockchain technology” for the first time this past autumn, but in the last few months, I’ve become pretty absorbed in the blockchain world. Initially I was intimidated by its descriptions, which struck me as needlessly abstruse — though, in a perfect chicken-and-egg scenario, I couldn’t be sure, since the descriptions didn’t offer an easy understanding of how it worked. What compelled me to press on in my blockchain research was the terminology surrounding it. I’m a long-standing advocate for open source, and blockchain’s default descriptors are “distributed” (as in “distributed ledger”), “decentralized” (as in “decentralized platform,” a tagline for at least one major blockchain development platform: https://www.ethereum.org/), and “peer-to-peer” (the crux of all things Bitcoin and blockchain). These words all spoke to my f/oss-loving heart, leading me to click on article after jargon-heavy article in an effort to wrap my head around the ‘chain. As I learned more about it, I came to understand why it’s begun to garner huge amounts of attention. I don’t like to get too starry-eyed about a new technology, but I too became a blockchain believer.
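For readers who, like me, had to claw through the jargon: the core data structure is less abstruse than the descriptions suggest. Below is a minimal sketch in Python of a hash-linked ledger, the “chain” in blockchain. It is entirely my own toy, with invented names; it is not code from Bitcoin, Ethereum, or any real implementation.

    import hashlib
    import json
    import time

    def hash_block(block):
        """Hash a block's contents deterministically."""
        serialized = json.dumps(block, sort_keys=True).encode()
        return hashlib.sha256(serialized).hexdigest()

    def make_block(data, prev_hash):
        """Create a block that commits to the previous block's hash."""
        return {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}

    # A toy ledger: each block points to the hash of the one before it,
    # so tampering with any earlier block invalidates every hash after it.
    chain = [make_block("genesis", "0" * 64)]
    for entry in ["alice pays bob 5", "bob pays carol 2"]:
        chain.append(make_block(entry, hash_block(chain[-1])))

What the sketch leaves out is precisely the “distributed” and “decentralized” part: many peers each hold a copy of the chain and follow a consensus protocol to agree on which version is valid.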



Almost two years ago, Facebook waved the rainbow flag and metaphorically opened its doors to all of the folks who identify outside of the gender binary. Before Facebook announced this change in February of 2014, users were only able to select ‘male’ or ‘female.’ Suddenly, with this software modification, users could choose a ‘custom’ gender that offered 56 new options (including agender, gender non-conforming, genderqueer, non-binary, and transgender). Leaving aside the troubling, but predictable, transphobic reactions, many were quick to praise the company. These reactions could be summarized as: ‘Wow, Facebook, you are really in tune with the LGBTQ community and on the cutting edge of the trans rights movement. Bravo!’ Indeed, it is easy to acknowledge the progressive trajectory that this shift signifies, but we must also look beyond the optics to assess the specific programming decisions that led to this moment.

To be fair, many were also quick to point to the limitations of the custom gender solution. For example, why wasn’t a freeform text field used? Google+ also shifted to a custom solution 10 months after Facebook, but it made use of a freeform text field, allowing users to enter any label they prefer. By February of 2015, Facebook followed suit (at least for users who select US English).

There was also another set of responses with further critiques: more granular options for gender identification could entail increased vulnerability for groups who are already marginalized. Perfecting a company’s capacity to turn gender into data also perfects its capacity to document and surveil its users, yet the beneficiaries of those data are not always visible. This is concerning, particularly when we recall that marginalization is closely associated with discriminatory treatment. Transgender women suffer disproportionate levels of hate violence from police, service providers, and members of the public, and murder is increasingly the fate of people who are both trans and women of color.

Alongside these horrific realities, there is more to the story – hidden in a deeper layer of Facebook’s software. When Facebook’s software was programmed to accept 56 gender identities beyond the binary, it was also programmed to misgender users when it translated those identities into data to be stored in the database. In my recent article in New Media & Society, ‘The gender binary will not be deprogrammed: Ten years of coding gender on Facebook,’ I expose this finding in the midst of a much broader examination of a decade’s worth of programming decisions geared towards creating a binary set of users.
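To make the finding concrete, here is a sketch of the shape such a translation could take. This is only my illustrative guess, written in Python with invented field names; it is not Facebook’s actual schema or code.

    # Hypothetical illustration: a non-binary gender at the display layer
    # can still collapse into binary data at the storage layer.
    # All names and logic here are invented; this is not Facebook's code.

    PRONOUN_TO_STORED_GENDER = {"she": "female", "he": "male"}

    def store_user(custom_gender, pronoun):
        """Build the record that actually lands in the database."""
        return {
            "display_gender": custom_gender,  # what the profile shows
            "stored_gender": PRONOUN_TO_STORED_GENDER.get(pronoun, "undefined"),
        }

    user = store_user("genderqueer", "she")
    # user["display_gender"] is "genderqueer",
    # but user["stored_gender"] is "female": the database misgenders.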


In December, the Center for Data Innovation sent out an email titled “New in Data: Big Data’s Positive Impact on Underserved Communities is Finally Getting the Attention it Deserves,” containing an article by Joshua New. New recounts the remarks of Federal Trade Commissioner Terrell McSweeny at a Google Policy Forum in Washington, D.C. on data for social empowerment, remarks that proudly list examples of data-doing-good. As I read through the examples of big data “serving the underserved,” I was first confused and then frustrated, though, to be fair, I went into it a little annoyed by the title itself.

The idea of big data for social good is not “new in data”: big data have been in the news and a major concern for research communities around the world since 2008. One of the primary justifications, whether spoken or implicit, is that data will solve all of humanity’s biggest crises. Big data do not “deserve” attention: they are composed of inanimate objects without needs or emotions. And while big data are having an “impact” on underserved communities, it is certainly not the unshakably positive, utopian impact that the title promises.

Henri Maillardet automaton, London, England, c. 1810 (Franklin Institute)

In the past few years, a subgenre of curiously self-referential feature stories and opinion pieces has begun to appear in many prominent magazines and newspapers. The articles belonging to this subgenre all respond to the same phenomenon – the emergence of natural language generation (NLG) software that has been composing news articles, quarterly earnings reports, and sports play-by-plays – but what they really have in common is a question on the part of the writer: “Am I doing anything that an algorithm couldn’t be doing just as well?” In some instances, titles like “If an Algorithm Wrote This, How Would You Even Know?” and “Can an Algorithm Write a Better News Story Than a Human Reporter?” place the author’s uniqueness in question from the outset; sometimes the authors of these pieces also force their readers to wonder whether the text they are reading was written by human or machine. In a New York Times Sunday Review essay from last year, for instance, Shelley Podolny subjects her readers to a mini-Turing Test, presenting two descriptions of a sporting event, one written by a software program and one written by a human, and asking us to guess which is which. (On the Times website, readers can complete an interactive quiz in which they deduce whether a series of passages were composed by automated or human authors.)

The two major companies involved in the development of algorithmic writing, Automated Insights and Narrative Science, have been around since 2007 and 2010, respectively. Narrative Science’s flagship software product is called Quill, while Automated Insights’ is called Wordsmith: quaint, artisanal names for software that seems to complete the long process that has severed the act of writing from the human hand over the past century and a half. The two companies initially developed their programs to convert sports statistics into narrative texts, but quickly began offering similar services to businesses and later expanded into data-driven areas of journalism. Such data-based reporting is what NLG software does well: it translates numerical representations of information into language-based representations of the same information. And while NLG programs have existed for several decades, they were mostly limited to producing terse reports on a limited range of subjects, like weather events and seismic activity. According to Larry Birnbaum, one of Quill’s architects, “Computers have known how to write in English for years. The reason they haven’t done so in the past is they had nothing to say, lacking access to a sufficient volume of information.”

As Birnbaum explains it, the new natural language generation software has been made possible – or rather, necessary – by the advent of Big Data. The prior limitations on the topics software programs could write about are disappearing as all realms of human activity become subject to data processing. Joe Fassler notes in The Atlantic that “the underlying logic that drives [algorithmic writing] – scan a data set, detect significance, and tell a story based on facts – is powerful and vastly applicable. Wherever there is data . . . software can generate a prose analysis that’s robust, reliable, and readable.” Hence, automated journalism will continue to expand into less obviously data-driven realms of reporting as new sources of data become available for processing. Meanwhile, the Associated Press and Forbes, to name just two, are already publishing thousands of software-written articles.
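Fassler’s formula (scan a data set, detect significance, and tell a story) maps readily onto code. Here is a deliberately crude sketch of that pipeline in Python, with thresholds and templates of my own invention; Quill’s and Wordsmith’s actual internals are, of course, far more sophisticated.

    # Toy natural language generation in the scan/detect/tell pattern.

    game = {"home": "Sharks", "away": "Owls", "home_pts": 98, "away_pts": 71}

    def detect_significance(g):
        """Scan the data and pick the most newsworthy fact."""
        margin = g["home_pts"] - g["away_pts"]
        if abs(margin) >= 20:
            return "blowout", margin
        if abs(margin) <= 3:
            return "thriller", margin
        return "ordinary", margin

    def tell_story(g):
        """Render the detected fact through a canned template."""
        kind, margin = detect_significance(g)
        winner = g["home"] if margin > 0 else g["away"]
        loser = g["away"] if margin > 0 else g["home"]
        templates = {
            "blowout": "{w} crushed {l}, {hp}-{ap}, in a lopsided affair.",
            "thriller": "{w} edged {l}, {hp}-{ap}, in a nail-biter.",
            "ordinary": "{w} beat {l}, {hp}-{ap}.",
        }
        return templates[kind].format(
            w=winner, l=loser,
            hp=max(g["home_pts"], g["away_pts"]),
            ap=min(g["home_pts"], g["away_pts"]))

    print(tell_story(game))  # Sharks crushed Owls, 98-71, in a lopsided affair.

Scale the data source up from one toy box score to every game, earnings report, or earthquake feed in existence, and the business model comes into focus.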



Fallout 4 tells me that I am special.

At the start of the game, I am prompted to assign point values to Strength, Perception, Endurance, Charisma, Intelligence, Agility, and Luck (yes, that spells SPECIAL) as an initial step towards the crafting of my customized protagonist. These statistics form the foundation of my character’s abilities, skills, and know-how. I will build on them and further specify them in the course of my play.
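The mechanic underneath that opening screen is a simple constrained allocation: seven named statistics drawing on a shared pool of points. A toy version in Python (the point budget and validation are my own placeholders, not the game’s actual numbers):

    # Toy version of SPECIAL point allocation. The budget below is a
    # placeholder for illustration, not Fallout 4's actual number.

    STATS = ["Strength", "Perception", "Endurance", "Charisma",
             "Intelligence", "Agility", "Luck"]
    BUDGET = 21  # assumed for illustration

    def allocate(spread):
        """Validate a point spread and build the character sheet."""
        if len(spread) != len(STATS):
            raise ValueError("need exactly one value per stat")
        if sum(spread) > BUDGET:
            raise ValueError("spread exceeds the point budget")
        return dict(zip(STATS, spread))

    character = allocate([6, 4, 3, 2, 3, 2, 1])  # a strength-heavy build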

But Fallout 4 tells me that I am special in other ways, namely through the ways that it positions my protagonist within its narrative. My character is the lone survivor of a fallout shelter following a devastating nuclear war. She is cryogenically frozen, but wakes from her sleep long enough to witness her husband murdered and her infant son kidnapped. When she emerges from the vault 200 years after first entering it, she’s on a mission to find her son, despite having no knowledge of when the kidnapping happened.

Somehow, though, the local populace of wasteland Boston quickly determines that she exhibits exceptional leadership and combat skills. So they name her General, task her with restoring a floundering militia group, and put her at the head of rebuilding a new settlement and ultimately uniting the Commonwealth. Thus, immediately after emerging from a 200-year sleep during which the world as she knew it was destroyed, my affluent-professional-suburban-Boston-wife-mother character is able to navigate a hostile irradiated wasteland, find resources on her own, master a particular fighting prowess, and then convince a straggling group of survivors to make her their leader. Soon enough she’s binding other settlements to her cause and gradually seizing power over the Commonwealth.

Front page of one of Columbia’s local papers the day after the resignations

The story emerged for me two Thursdays ago, when a colleague at the University of Missouri, where I work, asked if I wanted to accompany her to find a march in support of Jonathan Butler, a graduate student on hunger strike demanding that system president Tim Wolfe resign over his inaction toward racism on campus. We encountered the protest as it moved out of the bookstore and followed it into the Memorial Union, where many students eat lunch. This was the point at which I joined the march and stuck with it across campus, into Jesse Hall, and finally to Concerned Student 1950’s encampment on the quad, where the march concluded. Since then I’ve been trying to read up on what led to this march, sharing what I find as I go. This task became much easier after Wolfe’s announcement on Monday that he would resign, and the national media frenzy that followed. At first, however, learning about the march I had participated in proved far more difficult than I expected.

The author’s home antenna

I moved to rural Kansas over a year ago. I live beyond Lawrence city limits, on the outskirts of Stull (where local legend places one of the gateways to hell), and a 50-minute drive from the nearest Google Fiber connection. It’s a liminal space in terms of broadband connection – the fastest network in the country is being built in the neighboring metropolitan area, but when I talked to my neighbors about internet service providers in our area, they were confused by my quest for speeds higher than 1 Mbps. As this collection of essays on “small town internet” suggests, there’s an awareness that internet in rural, small town, and “remote” places exists, but we need to understand more about how digital connection is incorporated (or not) into small town and rural life: how it’s used, and what it feels like to use it.

One of my ongoing projects involves researching digital divides and digital inclusion efforts in Kansas City. The arrival of Google Fiber in Kansas City, KS and Kansas City, MO has provided increased momentum and renewed impetus for recognizing digital divides based on cost, access, education and computer literacy, relevance, and mobility, and it has brought more discussion and visibility to organizations and activists hoping to alleviate some of these divides and to frame internet access as a utility. I’ve argued that by reading digital media in relationship to experiences of “place,” we gain a more holistic and nuanced understanding of digital media use and non-use, of processes and decisions around implementation and adoption, and of our relationships to digital artifacts and infrastructures. In other words, one’s location and sense of place become important factors in shaping practices, decisions, and experiences of digital infrastructure and digital media.

The irony is not lost on me that while studying digital divides in a metropolitan area, I had chosen to live in a location with its own unique series of inequities in terms of internet connection. These inequities have nothing to do with socio-economic instability or lack of digital literacy, as I had the funds and willingness to pay a significant amount for internet service (comparable to the prices charged by urban-based, corporate ISPs), and everything to do with the fact that I lived in an area that felt as if it had been forgotten or intentionally bypassed by the internet service providers (ISPs) I had come to know living in other US cities and towns.


Small towns move at the rate of horse and buggy rather than high-speed internet, and therefore tend to reside on the wrong side of the digital divide. However, digital divides are not fixed or homogeneous, and small towns can surprise you. This is made clear by the case of Greenville College.

Out far from the glow of St. Louis is the small rural community of Greenville, Illinois. Greenville is a negligible town of 7,000. Most pass it on the interstate without even noticing–or use it as a place to go to the bathroom on the way from St. Louis toward Indianapolis. Amidst its miniscule population is a small enclave of higher-ed: Greenville College. Greenville College, founded in 1892, is a small Christian liberal arts college. Greenville College was once on the unfortunate side of the digital divide–until, out of necessity, it surpassed its urban counterparts. more...