In the first chapters of every Economics 101 textbook there’s a misleading hypothetical about the origins of money. David Graeber, in his book Debt: The First 5,000 Years, calls it “the founding myth of our system of economic relations.” The myth is so pervasive that even people who have never taken an Economics 101 class know it and believe it. We tend to assume that before money there was an awkward barter system in which you had to keep all your chickens and yams with you when you went to market to buy a calf. If the person selling the calf didn’t want chickens or yams, no transaction would take place. Money seems to fill a very important need: it lets us compare and exchange a wide variety of goods by establishing a common metric of value. The problem with this construction—of simple barter being replaced by cash economies—is that it never happened. That’s what makes Bondsy, an app that lets you effortlessly barter with a private set of friends, so interesting: it takes a modern myth and turns it into everyday reality. (more…)
The Cyborg project, as articulated by Haraway, is, at its core, a utopic project. It is the melding of mechanical and organic, digital and physical, human, machine, and animal in such a way that categorizations cease to hold meaning and, in turn, cyborg bodies break through repressive boundaries.
And yet here we are, at the pinnacle of a cyborg era, inundated with high tech, engaged simultaneously in digital and physical spaces, maintaining relationships with organic and mechanical beings, constituted with and through language, medicines—and increasingly—machines, and we STILL have to deal with bullshit like this (click below to view):
Watching the ideas materialize, disseminate, get knocked down and picked back up all in near real time is either the greatest advantage digital dualism theory has, or its biggest downfall—its best feature or worst flaw. Or both. Personally, I’m having a blast, even if it’s a bit of a distraction from my dissertation. It’s the spirit of this blog, a rare academic space to try ideas out, work on them, debate them, meet new people, and watch the idea, one hopes, get better and stronger. Or sometimes no one cares and we move on. This is what I love about my colleagues on Twitter (I’ll never type the word tweeps), this blog, and the Theorizing the Web conference.
The drawback is (more…)
My bad photo with lots of bokeh blur will get lots of Facebook likes
Stories In Focus, posted by Sarah Wahnecheck two days ago, is a brief exploration of bokeh that strikes me as a great start to something bigger. This is just a quick follow-up, asking Sarah and others to think more about the reality that amateur, documentary, and news footage is increasingly coming to look like art films, specifically through the effect of having one thing in sharp focus with the rest blurred. (more…)
The Sheriff says “Liking” isn’t speech, but he’ll fire you if you “Like” the wrong thing.
Ann Swidler argues that we operate using complex cultural repertoires. These are the propensities, scripts, frameworks, and logics—the tools with which to navigate everyday life. Our repertoires are vast, and often contradictory—and yet we deftly pull what we need, when we need it, easily ignoring contradictions. She illustrates these practices through narratives of romantic love, in which participants, within the same interview, draw seamlessly on logics of independence (e.g. we are separate people and we need our separate space), intertwinement (e.g. we have grown together over the years, our marriage is a true union of two souls), fate (e.g. we were meant to be) and rationality (e.g. marriage is a product of hard work and sacrifice).
With Swidler’s cultural tool kit as a framework, we can begin to make sense of the logical gymnastics that enabled a Virginia Sheriff to fire his subordinates for hitting a Facebook “Like” button in support of an opposing candidate and then argue successfully in court that this firing was not a violation of free speech. (more…)
An Ashanti enstooling ceremony, recorded (and presumably shared) through cell phone cameras (marked).
The “digital divide” is a surprisingly durable concept. It has evolved through the years to describe a myriad of economic, social, and technical disparities at various scales and across different socioeconomic demographics. Originally it described how people of lower socioeconomic status were unable to access digital networks as readily or easily as more privileged groups. This may have been true a decade ago, but that gap has gotten much smaller. Now authors are cooking up a “new digital divide” based on usage patterns: forming and maintaining social networks and informal ties, practices essential for those of limited means, are described as nothing more than shallow entertainment and a waste of time. The third kind of digital divide operates at a global scale; industrialized or “developed” nations have all the cool gadgets, while the global south is devoid of all digital infrastructures (both social and technological). The artifacts of digital technology are not only absent (so the myth goes), but the expertise necessary for fully utilizing these technologies is also nonexistent.

Attempts at solving all three kinds of digital divides (especially the third one) usually take a deficit model approach. The deficit model assumes that there are “haves” and “have nots” of technology and expertise, and that the solution lies in directing more resources to the have nots, thereby remediating the digital disparity. While this is partially grounded in fact, and most attempts are very well-intended, the deficit model is largely wrong. Mobile phones (which are becoming more and more like mobile computers) have put the internet in the hands of millions of people who do not have access to a “full sized” computer. More importantly, computer science, new media literacy, and even the new aesthetic can be found throughout the world in contexts and arrangements that transcend or predate their western counterparts. Ghana is an excellent case study for challenging the common assumptions of technology’s relationship to culture (part 1) and problematizing the historical origins of computer science and the digital aesthetic (part 2). (more…)
Google’s “Project Glass” is the Augmented Reality (AR) heads-up display (HUD) glasses offering that Google is designing for a near-future interactive Internet experience.
(Video credit: Google)
From watching their demonstration video, I certainly have some questions and observations. Google’s vision (no pun intended) of the future is a place where people ignore women except as witnesses to their achievements, talk with their mouths full, and put their live friends on hold to interact with a machine (oh wait, that’s what people do now); it is also a future without ads (wait…what?). Thankfully, rebelliouspixels mixed them in: (more…)
In an earlier post, I wrote about the intersections of gender, technology, and economy using Apple’s “personal assistant” Siri as an example. With the recent release of the Japanese version of Siri, I thought I would provide an update on the available languages and their use of a default masculine or feminine voice.
MVS Virtual Cable™ and Virtual Signs™
In early February, I attended a fascinating conference hosted by the Telecom Council of Silicon Valley. This is a first-rate organization, and the conference did not disappoint. Executives from various telecom, mobile, middleware, AR, audio, video, electronics, and computer companies were present to discuss the future of the “connected car.”
The car is apparently one of the next battlefields for ownership of our personal data and privacy. It is an intimate environment, and there will soon be enough sensors to document every human habit and behavior within it. While cars become the panoptic reporters of our every move, people will also be burdened with an overwhelming amount of data ostensibly aimed at “aiding” them in the driving task. There will be touch-activated windshields, Augmented Reality (AR) navigation lines projected onto the windshield to guide drivers along their route, and the blending of both scenarios with the addition of ads showing up on screen. Audio feedback based on sensor activity is already available as a service in certain commercial vehicles: installed sensors monitor driver behavior and provide immediate audio feedback if a driver changes lanes suddenly, is speeding, or engages in other unsafe behaviors. (more…)
Below is a three-part essay I presented at the 2012 Southwest Texas Popular Culture Association meetings in Albuquerque, New Mexico, on February 9th. It was presented as part of a series of panels titled “The Apocalypse in Popular Culture.” A (much) earlier version of this paper can be found on the Sociological Images sister blog.
THE ZOMBIE IN FILM: FROM HAITIAN FOLKLORE TO APOCALYPTIC ANXIETIES
If you are alive these days, and not already part of the undead masses yourself, you probably have noticed a staggering increase in zombie references in film, television, pop culture, videogames, and the internet. For instance, the big screen and small screen have both hosted a plethora of zombie films, including the more popular blockbusters 28 Days Later (2002), Shaun of the Dead, and I Am Legend. In television, we have seen the recent success of AMC’s The Walking Dead, based on the comic book series of the same name. In pop culture, we have seen the viral video of penitentiary inmates dancing to Michael Jackson’s “Thriller” and even the popular television sitcom Glee host its own rendition of the dance. And if you are on a college campus like myself, you have probably seen undergraduates playing “Zombies Vs. Humans,” a game of tag in which “human” players must defend against the horde of “zombie” players by “stunning” them with Nerf weapons and tube socks. In videogames, we have seen the success of the Resident Evil franchise, eventually culminating in a series of films starring Milla Jovovich, as well as more recent games like Left 4 Dead and Dead Rising. Finally, the internet is awash with zombie culture, from post-apocalyptic zombie societies to zombie fansites.
The Annual "Zombie Walk" in Pittsburgh, PA, birthplace of the famed zombie director George Romero.