Earlier this week, the New York Times ran yet another hilariously digital dualist piece on a new surveillance system that lets retailers follow customers’ every move. The systems, mainly through cameras tied into motion capture software, can detect how long you stared at a pair of jeans, or even the grossed-out face you made at this year’s crop of creepy, hyper-sexualized Halloween costumes. The New York Times describes this as an attempt by brick and mortar stores to compete with data-wealthy “e-commerce sites.” (Who says “e-commerce” anymore? Seriously, change your style guide.) Putting aside the fact that most major retailers are also major online retailers, making the implicit distinction in the article almost meaningless, the article completely misses the most important (and disturbing) part of the story: our built environment will be tuned to never-before-seen degrees of precision. We have absolutely no idea what such meticulously built spaces will do to our psyches. (more…)
(This is not the dive bar in question)
I’ve been thinking a lot over recent weeks about digital media, smartphones, and absence-vs.-presence, all of which was compounded by an interesting experience I had last weekend. On one particular night, 1:00 AM found me in a Lower East Side dive bar playing pinball with a friend from Brooklyn and a friend from D.C.; I was also chatting with a third friend (who was in D.C.) via text message and Snapchat between my pinball turns, and relaying parts of that conversation to our two mutual friends there with me in the bar. More people joined us shortly thereafter, madcap shenanigans ensued and, sometime around stupid o’clock in the morning, I started the drive back to where I was staying.
As I was getting up the next day, I recalled various scenes from the night before. One such scene was from the earlier end of being at the dive bar: Getting to hang out with three people I don’t see often was a nice surprise, and how neat was it that we’d all gotten to hang out together? A few seconds later, however, it hit me that my mental picture of that moment didn’t match my memory of it. What I remembered was being in the dive bar spending time with three friends, but I could only picture two friends lit by the flashing lights of so many pinball machines. I realized that Friend #3 had been so present to me through our digital conversation that my memory had spliced him into the dive bar scene as if he’d been physically co-present, even though he’d been more than 200 miles away.
I wasn’t entirely sure what to make of this. On the one hand, yay: My subconscious isn’t digital dualist? (more…)
What will happen if more apps start to play more important roles in more of our lives?
Last week I wrote about a pattern I’ve been seeing, one for which I wanted to create a new term. I’m still working on the terminology issue, but the pattern is basically this:
1) A new technology highlights something about our society (or ourselves) that makes us uncomfortable.
2) We don’t like seeing this Uncomfortable Thing, and would prefer not to confront it.
3) We blame the new technology for causing the Uncomfortable Thing rather than simply making it more visible, because doing so allows us to pretend that the Uncomfortable Thing is unique to practices surrounding the new technology and is not in fact out in the rest of the world (where it absolutely is, just in a less visible way).
The examples I sketched out last week were Klout and Facebook’s new “sponsored” status updates (which Jenny Davis has since explored in greater depth); this week, I’m going to take a look at ‘helpful’ devices and smartphone apps. (more…)
This post combines part 1 and part 2 of “Technocultures”. These posts are observations made during recent field work in the Ashanti region of Ghana, mostly in the city of Kumasi.
Part 1: Technology as Achievement and Corruption
An Ashanti enstooling ceremony, recorded (and presumably shared) through cell phone cameras (marked).
The “digital divide” is a surprisingly durable concept. It has evolved through the years to describe a myriad of economic, social, and technical disparities at various scales across different socioeconomic demographics. Originally it described how people of lower socioeconomic status were unable to access digital networks as readily or easily as more privileged groups. This may have been true a decade ago, but that gap has gotten much smaller. Now authors are cooking up a “new digital divide” based on usage patterns. Forming and maintaining social networks and informal ties, essential practices for those of limited means, is described as nothing more than shallow entertainment and a waste of time. The third kind of digital divide operates at a global scale; industrialized or “developed” nations have all the cool gadgets, and the global south is devoid of all digital infrastructures (both social and technological). The artifacts of digital technology are not only absent (so the myth goes), but the expertise necessary for fully utilizing these technologies is also nonexistent.

Attempts at solving all three kinds of digital divide (especially the third) usually take a deficit-model approach. The deficit model assumes that there are “haves” and “have nots” of technology and expertise, and that the solution lies in directing more resources to the have nots, thereby remediating the digital disparity. While this is partially grounded in fact, and most attempts are very well-intended, the deficit model is largely wrong. Mobile phones (which are becoming more and more like mobile computers) have put the internet in the hands of millions of people who do not have access to a “full-sized” computer. More importantly, computer science, new media literacy, and even the new aesthetic can be found throughout the world in contexts and arrangements that transcend or predate their western counterparts.
Ghana is an excellent case study for challenging the common assumptions of technology’s relationship to culture (part 1) and problematizing the historical origins of computer science and the digital aesthetic (part 2). (more…)
Reason #15,926 I love the Internet: it allows us to bypass our insane leaders israelovesiran.com
— allisonkilkenny (@allisonkilkenny) April 22, 2012
Sherry Turkle, Author of Alone Together and a New York Times opinion piece on our unhealthy relationship to technology.
Sherry Turkle published an op-ed in the Opinion Pages of the New York Times’ Sunday Review that decries our collective move from “conversation” to “connection.” It’s the same argument she made in her latest book Alone Together, and it has roots in her previous books Life on the Screen and The Second Self. Her argument is straightforward and can be summarized in a few bullet points:
- Our world has more “technology” in it than ever before and it is taking up more and more hours of our day.
- We use this technology to structure/control/select the kinds of conversations we have with certain people.
- These communication technologies compete with “the world around us” in a zero-sum game for our attention.
- We are substituting “real conversations” with shallower, “dumbed-down” connections that give us a false sense of security. Similarly, we are capable of presenting ourselves in a very particular way that hides our faults and exaggerates our better qualities.
Turkle is probably the longest-standing, most outspoken proponent of what we at Cyborgology call digital dualism. The separation of physical and virtual selves and the privileging of one over the other is not only theoretically contradictory, but also empirically unsubstantiated. (more…)
The tech world and consumers at large have been buzzing amid recent reports/leaks which indicate that Google will, in the next year, come out with smartphone-esque glasses. Apparently, these devices, often dubbed “Terminator” glasses after the cyborg technology portrayed in the 1980s classic film by the same name, will overlay the physical world with digital data—augmenting our practices of looking. (more…)
Everybody knows the story: Computers—which, a half century ago, were expensive, room-hogging behemoths—have developed into a broad range of portable devices that we now rely on constantly throughout the day. Futurist Ray Kurzweil famously observed:
progress in information technology is exponential, not linear. My cell phone is a billion times more powerful per dollar than the computer we all shared when I was an undergrad at MIT. And we will do it again in 25 years. What used to take up a building now fits in my pocket, and what now fits in my pocket will fit inside a blood cell in 25 years.
Beyond advances in miniaturization and processing, computers have become more versatile and, most importantly, more accessible. In the early days of computing, mainframes were owned and controlled by various public and private institutions (e.g., the US Census Bureau drove the development of punch card readers from the 1890s onward). When universities began to develop and house mainframes, users had to submit proposals to justify their access to the machine. They were given a short period in which to complete their task, then the machine was turned over to the next person. In short, computers were scarce, so access was limited. (more…)
The recent and popular Hipstamatic war photos depict contemporary soldiers, battlefields, and civilian turmoil as reminiscent of wars long past. War photos move us by depicting human drama taken to its extreme, and these images, shot with a smartphone and “filtered” to look old, create a sense of simulated nostalgia, further tugging at our collective heart strings. And I think that these photos reveal much more.
Hipstamatic war photographs ran on the front page of the New York Times [the full set] last November, and, of course, fake-vintage photos of everyday life are filling our Facebook, Tumblr and Twitter streams. I recently analyzed this trend in a long essay called The Faux-Vintage Photo, which is generating a terrific response. I argue that we like faux-vintage photographs because they provide a “nostalgia for the present”; our lives in the present can be seen as like the past: more important and real in a grasp for authenticity.
If faux-vintage photography is rooted in authenticity, then what is more real than war? If the proliferation of Hipstamatic photographs has anything to do with a reaction to our increasingly plastic, simulated, Disneyfied and McDonaldized worlds, then what is more gritty than Afghanistan in conflict? In a moment where there is a shortage of and a demand for authenticity (the gentrification of inner-cities, “decay porn” and so on), war may serve as the last and perhaps ultimate bastion of authenticity. However, as I will argue below, war itself is in a crisis of authenticity, creating rich potential for its faux-vintage documentation. (more…)
I am working on a dissertation about self-documentation and social media and have decided to take on theorizing the rise of faux-vintage photography (e.g., Hipstamatic, Instagram). From May 10-12, 2011, I posted a three part essay. This post combines all three together.
Part I: Instagram and Hipstamatic
Part II: Grasping for Authenticity
Part III: Nostalgia for the Present
a recent snowstorm in DC: taken with Instagram and reblogged by NPR on Tumblr
Part I: Instagram and Hipstamatic
This past winter, during an especially large snowfall, my Facebook and Twitter streams became inundated with grainy photos that shared a similarity beyond depicting massive amounts of snow: many of them appeared to have been taken with a cheap Polaroid or film camera 60 years ago. However, the photos were all taken recently using a popular set of new smartphone applications like Hipstamatic or Instagram. The photos (like the one above) immediately evoked a feeling of nostalgia and a sense of authenticity that digital photos posted on social media often lack. Indeed, there has been a recent explosion of retro/vintage photos. Those smartphone apps have made it so that one no longer needs the ravages of time or Photoshop skills to post a nicely aged photograph.
In this essay, I hope to show how faux-vintage photography, while seemingly banal, helps illustrate larger trends about social media in general. The faux-vintage photo, while getting a lot of attention in this essay, is merely an illustrative example of a larger trend whereby social media increasingly force us to view our present as always a potential documented past. But we have a ways to go before I can elaborate on that point. Some technological background is in order. (more…)
I am working on a dissertation about self-documentation and social media and have decided to take on theorizing the rise of faux-vintage photography (e.g., Hipstamatic, Instagram). To start fleshing out ideas, I am doing a three-part series on this blog: part one was posted Tuesday (“Hipstamatic and Instagram”) and part two yesterday (“Grasping for Authenticity”). This is the last installment.
taken recently, this is a simulated vintage image of a simulation
With more than two million users each, Hipstamatic and Instagram have ushered in a wave of simulated retro photographs that have populated our social media streams. Even a faux-vintage video application is gaining popularity. The first two posts in this series described what faux-vintage photography is and its technical facilitators, and attempted to explain at least one main reason behind its explosive popularity. When we create an instant “nostalgia for the present” by sharing digital photos that look old and often physical, we are trying to capture for our present the authenticity and importance vintage items possess. In this final post, I want to argue that faux-vintage photography, a seemingly mundane and perhaps passing trend, makes clear a larger point: social media, in its proliferation of self-documentation possibilities, increasingly positions our present as always a potential documented past.
Nostalgia for the Present
The rise of faux-vintage photography demonstrates a point that can be extrapolated to documentation on social media writ large: social media users have become always aware of the present as a potential document to be consumed by others. Facebook fixates the present as always a future past. Be it through status updates on Twitter, geographical check-ins on Foursquare, reviews on Yelp, those Instagram photos or all of the other self-documentation possibilities afforded to us by Facebook, we view our world more than ever before through what I like to call “documentary vision.” (more…)