guest author

Thiel - Girard

During the week of July 12, 2004, a group of scholars gathered at Stanford University, as one participant reported, “to discuss current affairs in a leisurely way with [Stanford emeritus professor] René Girard.” The proceedings were later published as the book Politics and Apocalypse. At first glance, the symposium resembled many others held at American universities in the early 2000s: the talks proceeded from the premise that “the events of Sept. 11, 2001 demand a reexamination of the foundations of modern politics.” The speakers enlisted various theoretical perspectives to facilitate that reexamination, with a focus on how the religious concept of apocalypse might illuminate the secular crisis of the post-9/11 world.

As one examines the list of participants, one name stands out: Peter Thiel, not, like the rest, a university professor, but (at the time) the President of Clarium Capital. In 2011, the New Yorker called Thiel “the world’s most successful technology investor”; he has also been described, admiringly, as a “philosopher-CEO.” More recently, Thiel has been at the center of a media firestorm for his role in bankrolling Hulk Hogan’s lawsuit against Gawker, which outed Thiel as gay in 2007 and whose journalists he has described as “terrorists.” He has also garnered some headlines for standing as a delegate for Donald Trump, whose strongman populism seems an odd fit for Thiel’s highbrow libertarianism; he recently reinforced his support for Trump with a speech at the Republican National Convention. Both episodes reflect Thiel’s longstanding conviction that Silicon Valley entrepreneurs should use their wealth to exercise power and reshape society. But to what ends? Thiel’s participation in the 2004 Stanford symposium offers some clues. more...


One of the first news stories about the June 12th Orlando shooting that I read focused on the mother of a young man trapped inside Pulse nightclub, and the text messages that she had exchanged with her son. When I first read the story, the fate of the young man was not yet known, although his text messages had ceased by 3am, and his mother was quoted as having a “bad feeling” about the outcome. That day, as the names of the victims trickled out, I followed the news intently, hoping that somehow this young man’s name would not appear on the list of the deceased. But it did.

Like so many others across the country and the world in the wake of the Orlando massacre, I experienced an intense form of empathy for the victims and their families, made possible in part by increasingly timely and intimate forms of news gathering in the digital age. I read the news from a position of safety and security, but still felt that empty pit in my stomach, still had to stop in my tracks as the young man’s name came across my constantly updating Twitter feed. Millions of others felt something similar. But what becomes of all this empathy? more...


As I walk in, I glance at the sign that says “Local 1964 AFL-CIO” and smile at the greeter who stands by the front door. Walking down the aisle I peruse the household cleaners. All natural, $4; organic, $9; wood-safe, $3. Taken slightly aback by the cost, I pull out my phone and bring up the same products in the Amazon app: $1.50 difference. I pause to think about that $1.50 and spiral into an internal dialogue about capitalism and my place within it. Not only do these chemicals eventually go down my drain back into a water source that I share with my neighbors, but the money I spent represents my values. $1.50 is less than the price of a small ice cream, and also, apparently, the margin necessary to pay the local grocery workers a living wage. I pick up the product off the shelf—the $4 natural multi-surface cleaner—and continue down my list.

The presence and use of Amazon’s apps on my phone are part of a larger socioeconomic process. As gentrification runs rampant in my small city, local use of Amazon and other delivery apps increases, local union grocery stores close one after another, and many people are left unemployed. One of the many insidious aspects of late capitalism is its ability to force a competition between time-saving and wage-saving. The convenience of technology necessitates further trust in and reliance on the rest of society. Or, as PJ Rey puts it: “There is no such thing as a lone cyborg.” What we often ignore, however, is how our choice of convenience simultaneously necessitates that our local community also become more reliant on large infrastructures and less self-sustaining. As Christian Fuchs explains in Labor in Informational Capitalism and on the Internet, an unemployed class, continuously deprived of wage labor and capital, is an inevitable byproduct of technological progress in a capitalist society. more...

One of the most prominent theorists of the late 20th century, Michel Foucault, spent a career asking his history students to let go of the search for the beginning of an idea. “Origins” become hopelessly confused and muddled with time; they gain accretions that ultimately distort any pure search for the past on the terms of the past. His alternative was to focus on how these accretions distorted the continuity behind any idea. Nietzsche called this method “genealogy,” and Foucault’s essay expanded on its use. Dawn Shepherd captured the significance of this lesson in a beautiful, single sentence: “Before we had ‘netflix and chill ;)’ we just had ‘netflix and chill.’”

The temptation with something as recent as the web is to emphasize the web’s radical newness. Genealogy asks that we resist this temptation and instead carefully think about the web’s continuity with structures far older than the web itself. While genealogy is not about the origins of “chill,” it emphasizes the continuity of “chill.” Genealogy must build from an idea of what “chilling” entailed to say something about what “chill” means now.

These continuities animated many of the conversations at Theorizing the Web 2016. Both the keynote panels and regular sessions asked audiences to imagine the web as part of society, rather than outside of it. In the words of its founders, the original premise of the conference was “to understand the Web as part of this one reality, rather than as a virtual addition to the natural.”
more...


Outrage over the Bob Marley Snapchat filter was swift following its brief appearance on the mobile application’s platform on April 20 (the “420” pot-smoking holiday). Released in apparent tribute to a day dedicated to consuming marijuana, whether smoked or eaten in the form of a gummy bear or brownie, the filter enabled users to don Marley’s hat, dreads, and…blackface!? News outlets covered the issue quickly that day. CNNMoney and The Verge noted the negative reactions voiced on social media in regard to the filter. The tech publication Wired released a brief article condemning it as racially tone-deaf.

The racial implications of the Bob Marley filter are multifaceted, yet I would like to focus on the larger cultural logic occurring both above and behind the scenes at an organization like Snapchat. The creation of a filter that tapped into blackface iconography demonstrates the complexity of our relationship to various forms of technology – as well as how we choose to represent ourselves through those technologies. French sociologist Jacques Ellul wrote in The Technological Society of ‘technique’ as an encompassing train of thought or practice based on rationality that achieves its desired end. Ellul spoke of technique in relation to advances in technology and human affairs in the aftermath of World War II, yet his emphasis was not on the technology itself, but rather the social processes that informed the technology. This means that in relation to a mobile application like Snapchat we bring our social baggage with us when we use it, and so do developers when they decide to design a new filter. Jessie Daniels addresses racial technique in her current projects regarding colorblind racism and the internet – in which the default for tech insiders is a desire to not see race. This theoretically rich work pulls us out of the notion that technology is neutral within a society that has embedded racial meanings flowing through various actors and institutions, and where those who develop the technology we use on a daily basis are unprepared to acknowledge the racial disparities which persist, and the racial prejudice that can—and does—permeate their designs. more...


I only heard the term “blockchain technology” for the first time this past autumn, but in the last few months, I’ve become pretty absorbed in the blockchain world. Initially I was intimidated by its descriptions, which struck me as needlessly abstruse — though, in a perfect chicken-and-egg scenario, I couldn’t be sure, since the descriptions didn’t offer an easy understanding of how it worked. What compelled me to press on in my blockchain research was the terminology surrounding it. I’m a long-standing advocate for open source, and blockchain’s default descriptors are “distributed” (as in “distributed ledger”), “decentralized” (as in “decentralized platform,” a tagline for at least one major blockchain development platform: https://www.ethereum.org/), and “peer-to-peer” (the crux of all things Bitcoin and blockchain). These words all spoke to my f/oss-loving heart, leading me to click on article after jargon-heavy article in an effort to wrap my head around the ‘chain. As I learned more about it, I came to understand why it’s begun to garner huge amounts of attention. I don’t like to get too starry-eyed about a new technology, but I too became a blockchain believer.
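A minimal sketch of the hash-chaining that those “distributed ledger” descriptions gesture at, assuming nothing about Bitcoin’s or Ethereum’s actual implementations: each block commits to the hash of the block before it, which is what lets every peer holding a copy of the ledger detect tampering.

```python
import hashlib
import json
import time

def hash_block(block: dict) -> str:
    """Deterministically hash a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(data, prev_hash: str) -> dict:
    """Create a block that records the hash of its predecessor."""
    return {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}

# A tiny chain: each block commits to the one before it.
genesis = make_block("genesis", prev_hash="0" * 64)
block_1 = make_block({"from": "alice", "to": "bob", "amount": 5}, hash_block(genesis))
block_2 = make_block({"from": "bob", "to": "carol", "amount": 2}, hash_block(block_1))

# Because every copy of the ledger can recompute these links, peers in a
# distributed network can notice if anyone quietly rewrites an earlier block.
assert block_2["prev_hash"] == hash_block(block_1)
```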

more...


Almost two years ago, Facebook waved the rainbow flag and metaphorically opened its doors to all of the folks who identify outside of the gender binary. Before Facebook announced this change in February of 2014, users were only able to select ‘male’ or ‘female.’ Suddenly, with this software modification, users could choose a ‘custom’ gender that offered 56 new options (including agender, gender non-conforming, genderqueer, non-binary, and transgender). Leaving aside the troubling, but predictable, transphobic reactions, many were quick to praise the company. These reactions could be summarized as: ‘Wow, Facebook, you are really in tune with the LGBTQ community and on the cutting edge of the trans rights movement. Bravo!’ Indeed, it is easy to acknowledge the progressive trajectory that this shift signifies, but we must also look beyond the optics to assess the specific programming decisions that led to this moment.

To be fair, many were also quick to point to the limitations of the custom gender solution. For example, why wasn’t a freeform text field used? Google+ also shifted to a custom solution 10 months after Facebook, but they did make use of a freeform text field, allowing users to enter any label they prefer. By February of 2015, Facebook followed suit (at least for those who select US-English).

There was also another set of responses with further critiques: more granular options for gender identification could entail increased vulnerability for groups who are already marginalized. Perfecting your company’s capacity to turn gender into data equates to a higher capacity for documentation and surveillance for your users. Yet the beneficiaries of this data are not always visible. This is concerning, particularly when we recall that marginalization is closely associated with discriminatory treatment. Transgender women suffer from disproportionate levels of hate violence from police, service providers, and members of the public, but it is murder that is increasingly the fate of people who happen to be both trans and women of color.

Alongside these horrific realities, there is more to the story – hidden in a deeper layer of Facebook’s software. When Facebook’s software was programmed to accept 56 gender identities beyond the binary, it was also programmed to misgender users when it translated those identities into data to be stored in the database. In my recent article in New Media & Society, ‘The gender binary will not be deprogrammed: Ten years of coding gender on Facebook,’ I expose this finding in the midst of a much broader examination of a decade’s worth of programming decisions that have been geared towards creating a binary set of users. more...
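To make that pattern concrete, here is a purely hypothetical sketch; the field names and mapping are invented for illustration and are not Facebook’s actual code or schema. It only shows how an interface can accept many gender labels while the storage layer collapses the user’s choice into a coarser value.

```python
# Hypothetical illustration only: invented fields, not Facebook's schema.
BINARY_CODES = {"female": 1, "male": 2}

def persist_gender(signup_sex: str, custom_label: str | None) -> dict:
    """Return the record a storage layer might keep for a user profile."""
    return {
        "display_gender": custom_label or signup_sex,  # what the profile shows
        "stored_gender": BINARY_CODES[signup_sex],     # what downstream systems read
    }

# A user who identifies as genderqueer is still resolved to the binary code
# chosen at sign-up once the label is "translated into data."
print(persist_gender("female", "genderqueer"))
# {'display_gender': 'genderqueer', 'stored_gender': 1}
```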


In December, the Center for Data Innovation sent out an email titled “New in Data: Big Data’s Positive Impact on Underserved Communities is Finally Getting the Attention it Deserves,” containing an article by Joshua New. New recounts the remarks of Federal Trade Commissioner Terrell McSweeny at a Google Policy Forum in Washington, D.C. on data for social empowerment, remarks that proudly list examples of data-doing-good. As I read through the examples of big data “serving the underserved,” I was first confused and then frustrated. Though to be fair, I went into it a little annoyed by the title itself.

The idea of big data for social good is not “new in data”: big data have been in the news and a major concern for research communities around the world since 2008. One of the primary justifications, whether explicit or implicit, is that data will solve all of humanity’s biggest crises. Big data do not “deserve” attention: they are composed of inanimate objects without needs or emotions. And, while big data are having an “impact” on underserved communities, it is certainly not the unshakably positive, utopian impact that the title promises. more...

[Image: Henri Maillardet automaton, London, England, c. 1810, Franklin Institute]

In the past few years, a subgenre of curiously self-referential feature stories and opinion pieces has begun to appear in many prominent magazines and newspapers. The articles belonging to this subgenre all respond to the same phenomenon – the emergence of natural language generation (NLG) software that has been composing news articles, quarterly earnings reports, and sports play-by-plays – but what they really have in common is a question on the part of the writer: “Am I doing anything that an algorithm couldn’t be doing just as well?” In some instances, titles like “If an Algorithm Wrote This, How Would You Even Know?” and “Can an Algorithm Write a Better News Story Than a Human Reporter?” place the author’s uniqueness in question from the outset; sometimes the authors of these pieces also force their readers to wonder whether the text they are reading was written by human or machine. In a New York Times Sunday Review essay from last year, for instance, Shelley Podolny subjects her readers to a mini-Turing Test, presenting two descriptions of a sporting event, one written by a software program and one written by a human, and asking us to guess which is which. (On the Times website, readers can complete an interactive quiz in which they deduce whether a series of passages were composed by automated or human authors.)

The two major companies involved in the development of algorithmic writing, Automated Insights and Narrative Science, have been around since 2007 and 2010, respectively. Narrative Science’s flagship software product is called Quill, while Automated Insights’s is called Wordsmith: quaint, artisanal names for software that seems to complete the long process that has severed the act of writing from the human hand over the past century and a half. The two companies initially developed their programs to convert sports statistics into narrative texts, but quickly began offering similar services to companies and later started expanding into data-driven areas of journalism. Such data-based reporting is what NLG software does well: it translates numerical representations of information into language-based representations of the same information. And while NLG programs have existed for several decades, they were mostly limited to producing terse reports on a limited range of subjects, like weather events and seismic activity. According to Larry Birnbaum, one of Quill’s architects, “Computers have known how to write in English for years. The reason they haven’t done so in the past is they had nothing to say, lacking access to a sufficient volume of information.”
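As a rough illustration of that numbers-to-language translation, consider the toy template sketch below; it is an assumption-laden stand-in, not Quill’s or Wordsmith’s actual pipeline, and the team names and thresholds are invented.

```python
def recap(game: dict) -> str:
    """Turn a box score (numbers) into a one-sentence recap (language)."""
    home, away = game["home"], game["away"]
    hs, aws = game["home_score"], game["away_score"]
    winner, loser = (home, away) if hs > aws else (away, home)
    margin = abs(hs - aws)
    # The size of the margin decides the verb, a crude stand-in for
    # "detecting significance" in the data.
    verb = "edged" if margin <= 3 else "beat" if margin <= 10 else "routed"
    return f"{winner} {verb} {loser} {max(hs, aws)}-{min(hs, aws)}."

print(recap({"home": "Stanford", "away": "Cal", "home_score": 38, "away_score": 17}))
# Stanford routed Cal 38-17.
```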

As Birnbaum explains it, the new natural language generation software has been made possible – or rather, necessary – by the advent of Big Data. The prior limitations on the topics software programs could write about are disappearing, as all realms of human activity become subject to data processing. Joe Fassler notes in The Atlantic that “the underlying logic that drives [algorithmic writing] – scan a data set, detect significance, and tell a story based on facts – is powerful and vastly applicable. Wherever there is data . . . software can generate a prose analysis that’s robust, reliable, and readable.” Hence, automated journalism will continue to expand into less obviously data-driven realms of reporting as new sources of data become available for processing. Meanwhile, the Associated Press and Forbes, to name a few, are already publishing thousands of software-written articles. more...

 


Fallout 4 tells me that I am special.

At the start of the game, I am prompted to assign point values to Strength, Perception, Endurance, Charisma, Intelligence, Agility, and Luck (yes, that spells SPECIAL) as an initial step towards the crafting of my customized protagonist. These statistics form the foundation of my character’s abilities, skills, and know-how. I will build on them and further specify them in the course of my play.

But Fallout 4 tells me that I am special in other ways, namely through the ways that it positions my protagonist within its narrative. My character is the lone survivor of a fallout shelter following a devastating nuclear war. She is cryogenically frozen, but wakes from her sleep long enough to witness her husband murdered and her infant son kidnapped. When she emerges from the vault 200 years after first entering it, she’s on a mission to find her son, despite having no knowledge of when the kidnapping happened.

Somehow, though, the local populace of wasteland Boston quickly determines that she exhibits exceptional leadership and combat skills. So they name her General, task her with the responsibility of restoring a floundering militia group, and put her at the head of rebuilding a new settlement and ultimately uniting the Commonwealth. Thus, immediately after emerging from a 200-year sleep during which time the world as she knew it was destroyed, my affluent-professional-suburban-Boston-wife-mother character is able to navigate a hostile irradiated wasteland, find resources on her own, master a particular fighting prowess, and then convince a straggling group of survivors to make her their leader. Soon enough she’s binding other settlements to her cause and gradually seizing power over the Commonwealth. more...