My mom and I spent part of the summer of 1995 with my aunt at her house, complete with a backyard. I was three and had lived most of my life in a small New York studio apartment, so my mom must’ve thought I would enjoy the few elements of nature often found in quiet Californian suburbs. She was wrong: each time they tried setting me down in the grass, I would crawl desperately back to the beautiful, safe, concrete patio.

This is a childhood story that still speaks to my identity: camping is not my first choice of activities, and the narratives of people who lose themselves in the wilderness are tedious to me. So it was quite a surprise when I willingly accepted the hiking trail Pokémon Go had set for me with the promise of Clefairies: more...

Throughout their history, national conventions for American political parties have become increasingly public events. Closed-off affairs in smoky rooms and convention halls gave way to televised roll calls and speeches. In the Year of Our Big Brother, 1984, C-SPAN aired uninterrupted coverage of the Democratic and Republican conventions. Conventions became more polished and choreographed, with the 1996 DNC as the zenith of this trend. Conventions moved to the internet in the aughts, using a variety of different platforms to distribute streams and commentary.

This election cycle incorporated something new into the dissemination of gavel-to-gavel coverage of the conventions: Twitch.tv. The platform designed for videogame streaming offered full coverage of the conventions. Additionally, it gave Twitch users the ability to host the coverage of both conventions on their own channels. In the case of the DNC, Twitch users were able to add commentary to the stream of the convention on their channel, giving their followers and other users an opportunity to hear their favorite gamers’ takes on the presentation of the Democratic National Committee. more...

As I walk in, I glance at the sign that says “Local 1964 AFL-CIO” and smile at the greeter who stands by the front door. Walking down the aisle, I peruse the household cleaners. All natural, $4; organic, $9; wood-safe, $3. Taken slightly aback by the cost, I pull out my phone and bring up the same products in the Amazon app: $1.50 difference. I pause to think about that $1.50 and spiral into an internal dialogue about capitalism and my place within it. Not only do these chemicals eventually go down my drain back into a water source that I share with my neighbors, but the money I spent represents my values. $1.50 is less than the price of a small ice cream, and also, apparently, the margin necessary to pay the local grocery workers a living wage. I pick the $4 natural multi-surface cleaner up off the shelf and continue down my list.

The presence and use of Amazon’s apps on my phone are part of a larger socioeconomic process. As gentrification runs rampant in my small city, local use of Amazon and other delivery apps has increased, local union grocery stores have closed one after another, and many workers have been left unemployed. One of the many insidious aspects of late capitalism is its ability to force a competition between time-saving and wage-saving. The convenience of technology necessitates further trust in and reliance on the rest of society. Or, as PJ Rey puts it: “There is no such thing as a lone cyborg.” What we often ignore, however, is how our choice of convenience simultaneously necessitates that our local community also become more reliant on large infrastructures and less self-sustaining. As Christian Fuchs explains in Labor in Informational Capitalism and on the Internet, an unemployed class, continuously deprived of wage labor and capital, is an inevitable byproduct of technological progress in a capitalist society. more...

“The founding practice of conspiratorial thinking,” writes Kathleen Stewart, “is the search for the missing plot.” When some piece of information is missing in our lives, whether it is the conversion ratio of cups to ounces or who shot JFK, there’s a good chance we’ll open up a browser window. And while most of us agree that there are eight ounces to every cup, far fewer (like, only 39 percent) think Lee Harvey Oswald acted alone. Many who study the subject point to the mediation of the killing – the Zapruder film, the televised interviews and discussions about the assassination afterward – as one of the key elements of the conspiracy theory’s success. One might conclude that mediation and documentation cannot help but provide a fertile ground for conspiracy theory building.

Stewart goes so far as to say, “The internet was made for conspiracy theory: it is a conspiracy theory: one thing leads to another, always another link leading you deeper into no thing, no place…” Just like a conspiracy theory, you never get to the end of the Internet. Both are constantly unfolding with new information or a new arrangement of old facts. It is no surprise, then, that as our lives become ever more saturated with digital networks, we are also awash in grotesque amalgamations of half-facts about vaccines, terrorist attacks, the birth and death of presidents, and the health of the planet. And, as the recently leaked documents about Facebook’s news operations demonstrate, it takes regular intervention to keep a network focused on professional reporting. Attention and truth-seeking are two very different animals.

The Internet might be a conspiracy theory, but given the kind, size, and diversity of today’s conspiracy theories, it is also worth asking a follow-up question: what is the Internet a conspiracy theory about? Is it a theory about the sinister inclinations of a powerful cabal? Or is it a derogatory tale about a scapegoated minority? Can it be both or neither? Stewart was writing in 1999, before the web got social, so she could not have known how 9/11 conspiracies would flourish on the web, and she may not have suspected that our presidential candidates would make frequent use of conspiratorial content to drum up popular support. Someone else writing in 1999 got it right, though. That someone was Joe Menosky, and he wrote one of the best episodes of Star Trek: Voyager: Season 6, Episode 9, “The Voyager Conspiracy.” more...

Horse-race style political opinion polling is an integral part of Western democratic elections, with a history dating back to the 1800s. Political opinion polling first took hold in the first quarter of the 19th century, when a Pennsylvania straw poll predicted Andrew Jackson’s victory over John Quincy Adams in the bid for President of the United States. The weekly magazine Literary Digest then began conducting national opinion polls in the early 1900s, followed finally by the representative sampling introduced by George Gallup in 1936. Gallup’s polling method remains the foundation of political opinion polls to this day (even though the Gallup poll itself recently retired from presidential election predictions).

While polling has been around a long time, new technological developments let pollsters gather data more frequently, analyze and broadcast it more quickly, and project the data to wider audiences. Through these developments, polling data have moved to the center of election coverage. Major news outlets report on the polls as a compulsory part of political segments, candidates cite poll numbers in their speeches and interviews, and tickers scroll poll numbers across both social media feeds and the bottom of television screens. So central has polling become that in-the-moment polling data are superimposed over candidates as they participate in televised debates, creating media events in which performance and analysis converge in real time. So integral has polling become to the election process that it may be difficult to imagine what coverage would look like in the absence of these widely projected metrics. more...

“How tall is Jeb Bush?” This was the question on (apparently) many people’s minds leading up to the February 13th CBSN GOP debate. Thanks to Google, in partnership with CBSN, we now know that Americans are asking the hard questions, like “What is Ted Cruz’s real name?” “Why did Ben Carson wait to go on stage?” and, of course, a real deal breaker for me as a voter, “How old is John Kasich’s wife?”  more...

Almost two years ago, Facebook waved the rainbow flag and metaphorically opened its doors to all of the folks who identify outside of the gender binary. Before Facebook announced this change in February of 2014, users were only able to select ‘male’ or ‘female.’ Suddenly, with this software modification, users could choose a ‘custom’ gender that offered 56 new options (including agender, gender non-conforming, genderqueer, non-binary, and transgender). Leaving aside the troubling, but predictable, transphobic reactions, many were quick to praise the company. These reactions could be summarized as: ‘Wow, Facebook, you are really in tune with the LGBTQ community and on the cutting edge of the trans rights movement. Bravo!’ Indeed, it is easy to acknowledge the progressive trajectory that this shift signifies, but we must also look beyond the optics to assess the specific programming decisions that led to this moment.

To be fair, many were also quick to point to the limitations of the custom gender solution. For example, why wasn’t a freeform text field used? Google+ also shifted to a custom solution 10 months after Facebook, but it did make use of a freeform text field, allowing users to enter any label they prefer. By February of 2015, Facebook followed suit (at least for those who select US-English).

There was also another set of responses with further critiques: more granular options for gender identification could entail increased vulnerability for groups who are already marginalized. Perfecting your company’s capacity to turn gender into data equates to a higher capacity for documenting and surveilling your users. Yet the beneficiaries of these data are not always visible. This is concerning, particularly when we recall that marginalization is closely associated with discriminatory treatment. Transgender women suffer disproportionate levels of hate violence from police, service providers, and members of the public, and murder is increasingly the fate of people who are both trans and women of color.

Alongside these horrific realities, there is more to the story – hidden in a deeper layer of Facebook’s software. When Facebook’s software was programmed to accept 56 gender identities beyond the binary, it was also programmed to misgender users when it translated those identities into data to be stored in the database. In my recent article in New Media & Society, ‘The gender binary will not be deprogrammed: Ten years of coding gender on Facebook,’ I expose this finding in the midst of a much broader examination of a decade’s worth of programming decisions that have been geared towards creating a binary set of users. more...
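
To make that finding easier to picture, here is a minimal, hypothetical sketch (my own illustration in Python, not Facebook’s actual schema or code) of how a profile field that offers dozens of gender options can still collapse every user into a binary value at the storage layer:

```python
# Hypothetical sketch only -- field names and the mapping are invented for
# illustration. The point: a user interface offering many gender options can
# still reduce every user to a binary value in the database.

BINARY_CODES = {"female": 1, "male": 2}   # the only values the stored column admits

def store_gender(ui_selection: str, legacy_binary: str) -> dict:
    """Translate a profile selection into a (hypothetical) database record.

    `legacy_binary` stands in for whatever binary value the system falls back
    on (for example, an earlier 'male'/'female' selection, or the pronoun a
    user picked when choosing a custom gender).
    """
    return {
        "display_gender": ui_selection,              # what the profile shows
        "gender_code": BINARY_CODES[legacy_binary],  # what actually gets stored
    }

# The interface looks inclusive...
print(store_gender("genderqueer", legacy_binary="female"))
# {'display_gender': 'genderqueer', 'gender_code': 1}
# ...but anything downstream that reads `gender_code` (analytics, ad targeting)
# still sees only a binary: the misgendering described above.
```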

In December, the Center for Data Innovation sent out an email titled “New in Data: Big Data’s Positive Impact on Underserved Communities is Finally Getting the Attention it Deserves,” containing an article by Joshua New. New recounts the remarks of Federal Trade Commissioner Terrell McSweeny at a Google Policy Forum in Washington, D.C. on data for social empowerment, proudly listing examples of data-doing-good. As I read through the examples of big data “serving the underserved,” I was first confused and then frustrated. Though to be fair, I went into it a little annoyed by the title itself.

The idea of big data for social good is not “new in data”: big data have been in the news and a major concern for research communities around the world since 2008. One of the primary justifications, whether spoken or implicit, is that data will solve all of humanity’s biggest crises. Big data do not “deserve” attention: they are composed of inanimate objects without needs or emotions. And, while big data are having an “impact” on underserved communities, it is certainly not the unshakably positive, utopian impact that the title promises. more...

Turn on your TV and I bet you can find a show about Alaska. A partial list of Alaska-themed reality shows airing between 2005 and today includes Deadliest Catch, Alaskan Bush People, Alaska the Last Frontier, Ice Road Truckers, Gold Rush, Edge of Alaska, Bering Sea Gold, The Last Alaskans, Mounting Alaska, Alaska State Troopers, Flying Wild Alaska, Alaskan Wing Men, and the latest, Alaska Proof, premiering last week on Animal Planet, a show that follows an Alaskan distillery team as they strive to “bottle the true Alaskan spirit.” And with Alaska Proof, I submit that we have saturated the Alaskan genre; we have reached Peak Alaska. We may see a few new Alaska shows, but the genre is likely on the decline. I don’t imagine we have many Alaskan activities left unexplored.

Television programming remains a staple of American leisure, even as the practice of television watching continues to change (e.g., it’s often done through a device that’s not a TV). As a leisure activity, consumers expect their TV to entertain, compel, and also provide comfort. Which content and forms will entertain, compel, and comfort shifts with cultural and historical developments. Our media products are therefore useful barometers for measuring the zeitgeist of the time. Marshall McLuhan argues in The Medium is the Message that something becomes most clearly visible at its peak, when it is on the way out. And so, with Alaska peaking in clear view, I ask: what does our Alaskan obsession tell us about ourselves? more...

In the past few years, a subgenre of curiously self-referential feature stories and opinion pieces has begun to appear in many prominent magazines and newspapers. The articles belonging to this subgenre all respond to the same phenomenon – the emergence of natural language generation (NLG) software that has been composing news articles, quarterly earnings reports, and sports play-by-plays – but what they really have in common is a question on the part of the writer: “Am I doing anything that an algorithm couldn’t be doing just as well?” In some instances, titles like “If an Algorithm Wrote This, How Would You Even Know?” and “Can an Algorithm Write a Better News Story Than a Human Reporter?” place the author’s uniqueness in question from the outset; sometimes the authors of these pieces also force their readers to wonder whether the text they are reading was written by human or machine. In a New York Times Sunday Review essay from last year, for instance, Shelley Podolny subjects her readers to a mini-Turing Test, presenting two descriptions of a sporting event, one written by a software program and one written by a human, and asking us to guess which is which. (On the Times website, readers can complete an interactive quiz in which they deduce whether a series of passages were composed by automated or human authors.)

The two major companies involved in the development of algorithmic writing, Automated Insights and Narrative Science, have been around since 2007 and 2010, respectively. Narrative Science’s flagship software product is called Quill, while Automated Insights’s is called Wordsmith: quaint, artisanal names for software that seems to complete the long process that has severed the act of writing from the human hand over the past century and a half. The two companies initially developed their programs to convert sports statistics into narrative texts, but quickly began offering similar services to companies and later started expanding into data-driven areas of journalism. Such data-based reporting is what NLG software does well: it translates numerical representations of information into language-based representations of the same information. And while NLG programs have existed for several decades, they were mostly limited to producing terse reports on a limited range of subjects, like weather events and seismic activity. According to Larry Birnbaum, one of Quill’s architects, “Computers have known how to write in English for years. The reason they haven’t done so in the past is they had nothing to say, lacking access to a sufficient volume of information.”
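
To give a sense of what that translation from numbers to prose can look like, here is a toy sketch in the spirit of template-based data-to-text generation (my own illustration with invented data and names, not the actual Quill or Wordsmith code):

```python
# Toy data-to-text sketch: turn a made-up box score into a one-sentence recap.
# Illustrates the general technique only, not any commercial NLG product.

game = {
    "home": "Sharks", "away": "Comets",
    "home_score": 5, "away_score": 3,
    "top_scorer": "J. Alvarez", "top_points": 3,
}

def recap(g: dict) -> str:
    winner, loser = ("home", "away") if g["home_score"] > g["away_score"] else ("away", "home")
    margin = abs(g["home_score"] - g["away_score"])
    # "Detect significance": pick a storyline based on the numbers.
    verb = "cruised past" if margin >= 4 else "edged"
    return (f"The {g[winner]} {verb} the {g[loser]} "
            f"{g[winner + '_score']}-{g[loser + '_score']}, "
            f"led by {g['top_scorer']} with {g['top_points']} goals.")

print(recap(game))
# -> The Sharks edged the Comets 5-3, led by J. Alvarez with 3 goals.
```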

As Birnbaum explains it, the new natural language generation software has been made possible – or rather, necessary – by the advent of Big Data. The prior limitations on the topics software programs could write about are disappearing, as all realms of human activity become subject to data processing. Joe Fassler notes in The Atlantic that “the underlying logic that drives [algorithmic writing] – scan a data set, detect significance, and tell a story based on facts – is powerful and vastly applicable. Wherever there is data . . . software can generate a prose analysis that’s robust, reliable, and readable.” Hence, automated journalism will continue to expand into less obviously data-driven realms of reporting as new sources of data become available for processing. Meanwhile, the Associated Press and Forbes, to name a few, are already publishing thousands of software-written articles. more...