
Whether they’ve joined me on Twitter, sneakily coerced me into spending more time on Facebook, or just like to go on at length about how social networking sites are “stupid and a waste of time,” it seems my friends never tire of talking to me about social media. Given my line of work, this is pretty great: it means a never-ending stream of food for thought (or “networked field research,” if you will). This post’s analysis-cum-cautionary tale comes to you through my friend Otto (we’ll refer to him by his nom de plume), who got himself into some pseudonuptial trouble last week.

It started when Otto was invited to a “wedding party”— more...

via bodyart.blog

 

Back in October, Nathan Jurgenson (@nathanjurgenson) created a typology of digital dualism, which I followed by mapping this typology onto material conditions that vary in terms of physical-digital enmeshment. Today, I want to apply this typology and its material-mapping to discourses and conditions of embodiment in light of technological advancements. If you have been following the blog and are up-to-date with this line of discussion, feel free to scroll down past the review. more...

This and more #OverlyHonestMethods can be found here.

I really love putting things in order: Around my house you’ll find tiny, neat stacks of paper, alphabetized sub-folders, PDFs renamed via algorithm, and spices arranged to optimize usage patterns. I don’t call it life hacking or You+; it’s just the way I live. Material and digital objects need to stand in reserve for me, so that I may function on a daily basis. I’m a forgetful and absent-minded character and need to externalize my memory, so I typically augment my organizational skills with digital tools. My personal library is organized the same way Occupy Wall Street organized theirs, with a lifetime subscription to LibraryThing. I use Spotify for no other reason than that I don’t want to dedicate the necessary time to organize an MP3 library the way I know it needs to be organized. (Although, if you find yourself empathizing with me right now, I suggest you try TuneUp.) My tendency toward digitally augmented organization has also made me a bit of a connoisseur of citation management software. I find little joy in putting together reference lists and bibliographies, mainly because they can never reach the metaphysical perfection I demand. Citation management software, however, gets me close enough. When I got to grad school, I realized my old standby, ProQuest’s RefWorks, wasn’t available, and my old copy of EndNote X1 ran too slowly on my new computer. So there I was, in my first year of graduate school and jonesing heavily for some citation management. I had dozens of papers to write and no citation software. That’s when I fell into the waiting arms of Mendeley. more...

via durdom.in.ua

The past U.S. election season was exciting for social scientists for many reasons, but none excited the web theorist crowd so much as the amazing proliferation of election memes. In his essay “Speaking in Memes,” Nathan Jurgenson aptly dissects the phenomenon and its various facets: why and how election memes become viral, whether this virality is subject to campaign control, and how audiences and media conjure meaning by rebroadcasting and reporting these memes. There are many things I would love to further discuss in Jurgenson’s essay, but I will latch on to the issue of meme longevity and the possible reasons why some memes survive far longer than most. I will also attempt to speculate about the factors that afford memes the power to shift shape and adapt to new contexts, and about how and why their meaning might be transformed by the public in the process. more...

I’ve poked fun at these lazy op-eds before and, indeed, it must be tempting to retreat into the safe conceptual territory of “The Internet is fake!” when a juicy story of lies, deception, and computers makes headlines. The Te’o case is an almost unbelievable account of a football star allegedly tricked into falling for, and eventually mourning, a woman who didn’t exist. It’s the kind of fiction only non-fiction could invent. [More on the Te’o case]

What I’d like to point out is that people have incorrectly called this a cyber-deception, a digital-deception, an online-hoax, when this is not exactly right: it was a deception, and one that happened to involve digital tools in a significant way. This mistake is what I call “digital dualism”: conceptually dividing the digital and physical into separate realities. Dualists speak of “real” interaction as opposed to digital interaction, digital selves, and a digital life, like Neo jacking into The Matrix. [More. On. Digital. Dualism.] more...

The New Aesthetic and critiques of digital dualism have much in common: they emerged in the same year; the nature of their conclusions are (partly) formed by the method of their construction – that is to say, they originated in the digital and as such are collaborative, speculative, and ongoing; and they both seem to spring from the close attention paid to the enticing bangs, whoops, and crashes issuing from the overlap of our digital and physical worlds. But more than this, I would argue that the two concepts are expressions of one another: it sometimes seems that New Aesthetic artwork is an illustration of digital dualist critiques; and likewise it is possible to read digital dualist critiques as descriptions of New Aesthetic artefacts.

This may appear an undeservedly grand claim, and it is certainly extremely speculative (the concepts involved are too varied and inchoate ever to be proper ‘expressions’ of one another). Nevertheless, as a comparison it does produce some interesting conclusions, especially when we begin to consider what exactly it is that the New Aesthetic and ‘augmented reality’ theory have in common (I’m using the term ‘augmented reality’ here as a broad label for thought that runs counter to strict digital dualism). In my own analysis, the mechanism that links the pair is undoubtedly that of metaphor: the cognitive tool that provides us with mental purchase on abstract, complex concepts and systems. In this case the abstract system is none other than the digital world of the 21st century. more...

EDIT (17 January 2013): Please see update below.

The scene is San Francisco, late 2009, and a friend is explaining—animatedly, excitedly—“why there are so many poly [polyamorous] people on OkCupid.” I wasn’t paying much attention to online dating at the time, so the precise details are fuzzy, but it basically boiled down to the options that OkCupid offered for “relationship status.” In addition to the expected categories like “single,” “seeing someone,” or “married” offered by other social networking and dating sites, OkCupid offered the label “available.”

To know why this distinction would matter to someone who identifies as “poly” or “polyamorous” (as did this particular friend), you need to know a little bit about what polyamory is. more...

There’s a song on the 1997 Chemical Brothers album Dig Your Own Hole that reminds one of your authors of driving far too fast with a too-close friend through a flat summer nowhere on a teenage afternoon (windows down, volume up). It’s called “Where Do I Begin,” and the lyric that fades out repeating as digital sounds swell asks: Where do I start? Where do I begin?*

Where do we start, or begin–and also, where do we stop? What and where is the dividing line between “you” and “not you,” and how can you tell? This is the first of a series of posts in which we will try to answer these sorts of questions by developing a theory of subjectivity specific to life within augmented reality.

As a thought experiment, consider the following: Your hand is a part of “you,” but what if you had a prosthetic hand? Are your tattoos, piercings, braces, implants, or other modifications part of “you”? What about your Twitter feed, or your Facebook profile? If the words that come from your mouth in face-to-face conversation (or from your hands, if you sign) are “yours,” are the words you put on your Facebook profile equally yours? Does holding a smartphone in your hand change the nature of what you understand to be possible, or the nature of “you” yourself? Theorists such as Donna Haraway, N. Katherine Hayles, Bruno Latour, and others have asked similar questions with regard to a range of different technologies. Here, however, we want to think specifically about what it means to be a subject in an age of mobile computing and increasingly ubiquitous access to digital information. more...

Please excuse the Atlantic Magazine-worthy counterintuitive article title, but it’s true. The Consumer Electronics Show, more commonly referred to as CES, is cheesy, expensive, and outdated. I used to really love CES coverage. It was a guilty pleasure of mine: an unrequited week of fetishistic gadget worship. I savored it all: the cringe-worthy pep of the keynote addresses, the garbled and blurry product videos taken by tech blog contributors, the over-hyped promises that never come true. But this year, after watching the entire 90-minute Waiting for Godot-style keynote, I don’t see the point anymore. All the coolest stuff was made by indie developers, and they introduced their products months ago, through awkward in-house YouTube videos. CES might be convenient for gigantic multinational corporations, but what’s in it for the Kickstarter-fueled entrepreneurs? Is a Las Vegas trade show the best medium to show off your iPhone-controlled light bulb or e-ink wristwatch? Why does half of Maroon 5 need to half-heartedly churn out three songs at the end of an hour-long product description? The industry has matured, and CES is no longer sufficient. more...

Given that we’re not in the habit of thinking too much about where our technological passions might lead us, I’ve been heartened over the past year to see an unusual willingness to confront the potentially devastating impact of the robotics revolution on human employment.

It was a question that was hard to avoid, given the global recession and the widening gap between rich and poor. It’s obvious that rapid advances in automation are offering employers ever-increasing opportunities to drive up productivity and profits while keeping ever-fewer employees on the payroll. It’s obvious as well that those opportunities will continue to increase in the future. more...