
I’m posting to get some feedback on my initial thoughts in preparation for my chapter in a forthcoming gamification reader. I’d appreciate your thoughts and comments here or @pjrey.

My former prof Patricia Hill Collins taught me to begin inquiry into any new phenomenon with a simple question: Who benefits? And this, I am suggesting, is the approach we must take to the Silicon Valley buzzword du jour: “gamification.” Why does this idea now command so much attention that we feel compelled to write a book on it? Does a typical person really find aspects of his or her life becoming more gamelike? And who is promoting all this talk of gamification, anyway?

It’s telling that conferences like “For the Win: Serious Gamification” or “The Gamification of Everything – convergence conversation” are taking place in business (and not, say, sociology) departments or being run by CEOs and investment consultants. The Gamification Summit invites attendees to “tap into the latest and hottest business trend.” Searching Forbes turns up far more articles discussing gamification (156) than searching the New York Times (34) or even Wired (45). All this makes TIME contributor Gary Belsky seem a bit behind the times when he predicts “gamification will soon rule the business world.” In short, gamification is promoted and championed—not by game designers, those interested in game studies, sociologists of labor/play, or even human-computer interaction researchers—but by business folks. And, given that the market for videogames is already worth more than $25 billion, it shouldn’t come as a surprise that business folks are looking for new growth areas in gaming.

more...

Presidential debates might be the single political event where Marshall McLuhan’s famous phrase “the medium is the message” rings most true. Candidates know well that content takes the back seat, perhaps even stuffed in the trunk, during these hyper-performative news events. The video above of McLuhan on the Today show analyzing a Ford-Carter debate from 1976 is well worth a watch. The professor’s points still ring provocative this morning after the first Obama-Romney debate of 2012, a debate that treated the Twitter-prosumer as a television-consumer and thoroughly failed the social medium. more...

Nothing like a nice post-gentrification stroll!

As if we needed more examples to demonstrate that ‘the digital’ & ‘the physical’ are part of the same larger world, it seems there’s no end to the applicability of demographic metaphors to trends in social media. I wrote about App.net and “white flight” from Facebook and Twitter last month, so you can imagine how my head broke on Monday when I first heard about “New MySpace.” My first question—after, “wait, what?”—was, “Is this like when the white people start moving back into urban cores to live in pricey loft conversions?”

I didn’t do a detailed overview of danah boyd’s (@zephoria) work on MySpace, Facebook, and white flight last time, so I start with that below (though I recommend that anyone interested in this topic check out boyd’s very readable chapter in Race After the Internet, which you can download here [pdf]). I then look at some of the coverage of New MySpace this week to make the argument that there are some strong parallels between the site’s impending “makeover” and the “urban renewal” efforts sometimes called gentrification or regentrification.

more...

There are any number of ways to frame the apocalypse, I suppose. As one who spends a lot of time thinking about technology, mine is a phenomenon known as “technological autonomy.”

I’m convinced that technological autonomy may be the single most important problem ever to face our species and the planet as a whole. A huge statement, obviously, but there’s plenty of recent evidence to back it up.

Briefly stated, technological autonomy describes a condition in which we’re no longer in control of our technologies: they now function autonomously. This isn’t as absurd as it may sound. The idea isn’t that we can’t switch a given machine from “on” to “off.” Rather, technological autonomy acknowledges the fact that we have become so personally, economically, and socially committed to our devices that we couldn’t survive without them. more...

Giorgio Fontana (b. 1981) is an Italian writer, freelance contributor, and editor of Web Target (http://www.web-target.com/en/). His personal website is www.giorgiofontana.com. On Twitter: https://twitter.com/giorgiofontana.

In some very stimulating articles – mainly this one – Nathan Jurgenson has convincingly argued against what he calls digital dualism: that is, the belief that “the digital world is virtual and the physical world real”:

more...

Can we create quantitative data that will help us make sense of our emotions?

In preparation for the 2012 Quantified Self Conference on 15 and 16 September (#QS2012), I’m spending a couple of weeks writing about the “self knowledge through numbers” group Quantified Self. Last week, I focused on self-quantification in relation to my master’s work on what I’ve termed biomedicalization 2.0; this week, I focus on my upcoming dissertation project, which will look specifically at emotional self-quantification (or “mood tracking”).

more...

In preparation for the 2012 Quantified Self Conference on 15 and 16 September (#QS2012), I’ll be spending the next two weeks writing about the “self knowledge through numbers” group Quantified Self (@QuantifiedSelf). This week, I focus on self-quantification in relation to my master’s work on what I’ve termed biomedicalization 2.0; next week I’ll focus on my upcoming dissertation project, which will look specifically at emotional self-quantification (or “mood tracking”).

more...

We all know the trope: In The Future—near or distant—food will come in the form of a pill. The pill will offer optimal proportions of all necessary nutrients. It will be calorically dense, vitamin-infused, moderately fatted, protein-filled, fiber-enhanced, time-released, and highly precise. The consumer will be satiated. The body will be healthy. This is a pill of perfect consumptive efficiency. This is the predicted diet of the cyborg.

Indeed, as cyborgs, our practices of (literal) consumption are characterized by scientific engineering. Our food and food practices are more a product of laboratory and factory work than of the sweat of tilling farmers. And yet, we have not come up with a successful food-replacement pill. Instead, we’ve generally (though not ubiquitously) developed a market and a mindset that move away from efficiency, developing and utilizing technological advancements to maximally consume with minimal caloric absorption. I offer here a few examples: more...

This post combines parts 1 and 2 of “Technocultures.” These posts are observations made during recent fieldwork in the Ashanti region of Ghana, mostly in the city of Kumasi.

Part 1: Technology as Achievement and Corruption

An Ashanti enstooling ceremony, recorded (and presumably shared) through cell phone cameras (marked).

The “digital divide” is a surprisingly durable concept. It has evolved through the years to describe a myriad of economic, social, and technical disparities at various scales across different socioeconomic demographics. Originally it described how people of lower socioeconomic status were unable to access digital networks as readily or easily as more privileged groups. That may have been true a decade ago, but the gap has since narrowed considerably. Now authors are cooking up a “new digital divide” based on usage patterns: forming and maintaining social networks and informal ties, essential practices for those of limited means, are dismissed as nothing more than shallow entertainment and a waste of time.

A third kind of digital divide operates at a global scale: industrialized or “developed” nations have all the cool gadgets, while the global south is devoid of digital infrastructures, both social and technological. The artifacts of digital technology are not only absent (so the myth goes), but the expertise necessary for fully utilizing these technologies is also nonexistent.

Attempts at solving all three kinds of digital divide (especially the third) usually take a deficit-model approach. The deficit model assumes that there are “haves” and “have nots” of technology and expertise, and that the solution lies in directing more resources to the have nots, thereby remediating the digital disparity. While this is partially grounded in fact, and most attempts are very well-intended, the deficit model is largely wrong. Mobile phones (which are becoming more and more like mobile computers) have put the internet in the hands of millions of people who do not have access to a “full sized” computer. More importantly, computer science, new media literacy, and even the new aesthetic can be found throughout the world in contexts and arrangements that transcend or predate their western counterparts. Ghana is an excellent case study for challenging the common assumptions of technology’s relationship to culture (part 1) and problematizing the historical origins of computer science and the digital aesthetic (part 2). more...

Academic conferences: the model needs to change.

As the 2012 meeting of the American Sociological Association (#ASA2012) kicks into gear, I want to use this post to start a conversation about a somewhat-contentious topic: academics’ use of Twitter, particularly at conferences. I begin by extending some of what’s already been written on Cyborgology about Twitter use at conferences, and then consider why some people may find that use off-putting or intimidating. I close by considering what Twitter users in particular can do to ease the “Twitter tensions” at ASA by being more inclusive. The stakes here include far more than just “niceness”; they also include an opportunity to shape the shifting landscape of scholarly knowledge production.

more...