I still want to be a cyborg

we’ll have a crack team of GIF artists cranking out instant animations of the best debate moments

And independent voters? The top term was “LOL,” short for laugh out loud

bad photos have found their apotheosis on social media, where everybody is a photographer

I am only as secure as the last time I was retweeted

everyone else seemed so natural in their tweeting. for me it was agony

the friction of the digital divide in academia requires only the slightest irritation to hit a rolling bubbling, um, boil

it’s strange to write a serious research proposal & have half of your bibliography be science fiction

let’s stop shaming teenagers for exploring sexual imagery through the cell phone shutter, instead of our own lens of 1960s nostalgia

Pinterest is now jammed with inspirational quotes, some of which could have been lifted from fortune cookies

the hate-blog phenomenon is basically anti-fandom

low-tech objects that are the paraphernalia of hipster culture

the public assumes that what is printed or pressed or somehow physically produced is of better quality

the only way to not be used by the Internet is either to not use it, which is ridiculous, or to make something out of it

is Klout trying to smack a glossy veneer of Science™ onto social ranking?

Follow Nathan on Twitter: @nathanjurgenson


Presidential debates might be the single political event where Marshall McLuhan’s infamous phrase “the medium is the message” rings most true. Candidates know well that content takes the back seat, perhaps even stuffed in the trunk, during these hyper-performative news events. The video above of McLuhan on the Today show analyzing a Ford-Carter debate from 1976 is well worth a watch. The professor’s points still ring provocative this morning after the first Obama-Romney debate of 2012, a debate that treated the Twitter-prosumer as a television-consumer and thoroughly failed the social medium.

Can we imagine a theorist of social media on the Today show this morning in 2012? Sigh. Moving on… Social media lit up during the debate last night, primarily on Twitter. This was the most tweeted American political event to date, and the campaigns continued to update Twitter during the event even while letting their Facebook and Google Plus pages go silent. Much of the discussion, from my anecdotal experience, was that the biggest loser was the very format of the debate itself.

As McLuhan states above about the Ford-Carter debate,

the medium finally rebelled against the most stupid arrangement of any debate in the history of debating

The sentiment was similar last night. While most agree that Obama lost to Romney, the biggest loser seemed to be the format, the lack of proper hosting from Lehrer, the banality of an event so focused on the television image. This was a television debate; neither candidate nor the host took social media as a medium seriously. “There were no zingers; no knockout blows; no major blunders”: Romney’s offense and Obama’s long-winded defense simply were not retweetable. There was a lack of sound bites, the content was astonishingly non-viral, and the performance was even surprisingly non-GIF-able (Tumblr live-GIF’d the debate, but seemed to find little to work with).

Indeed, the message that most proliferated across social media, especially Twitter, had to do with Big Bird. After Romney said that, while he liked Big Bird, he would cut funding to PBS, there was an immediate reaction from Twitter users, who collectively knew that this was something to tweet about. Finally, something we all could comment on, getting to the work of not just consuming the debate but also producing it. We then set about making the message ourselves: tweeting about Romney and Big Bird, making fake Big Bird Twitter handles that quickly garnered tens of thousands of followers, spinning off humorous images and GIFs, and even making Big Bird and Sesame Street “trending topics.”

It is conspicuous, however, that the Big Bird moment is the only one from the debate that garnered such attention. Social media users expect to prosume the debates, that is, to simultaneously produce and consume the experience, rather than merely consume it as with television alone. The debate failed the social medium last night because it treated millions of prosumers as if they were consumers. Almost nothing about Wall Street or Main Street was as shareable as Sesame Street.

Neither candidate seemed to have any awareness of virality. The twitterdrome was like dry kindling ready to come alight at the mere mention of, say, “the 47%” by Obama. That the 47% never came up shows just how misaligned the debate was with social media as a medium.

Many will argue about whether social media changes or simply amplifies mainstream media narratives, and I tend to think that it depends, from event to event, on the content. When the message succumbs to the social medium by becoming that which is most shareable, I believe that social media can indeed change the narrative. This was the case when Clint Eastwood “stole” the Republican National Convention while mainstream outlets were more likely to focus on Romney’s performance. Last night, Twitter users were very likely more concerned with Big Bird than their television-only counterparts. But Big Bird is minor, and neither candidate did much of anything that was shareable (positive or negative), and thus it is my opinion that social media did not set the overall media narrative. My guess is that, today, many will claim that this is always true, that social media always amplifies mainstream media, but I do not think that is correct. It depends on candidates doing and saying things that are retweetable, reblogable, GIFable, “likeable” and so on.

All this raises many questions: how might the debate format better account for social media? How might the format and the candidates perform their message subservient to the social medium, or, simply, how should candidates make their message most positively shareable? Or should they at all? Might presidential politics be better suited to television than to social media?

Follow Nathan on Twitter: @nathanjurgenson


The video above is a “funny” take on the role of Twitter in our everyday lives from this past summer (I think). I know: who cares about celebrities, and nothing is less funny than explaining why something is funny. But because the video isn’t really that funny to begin with, we’ve nothing to lose by quickly hitting on some of the points it makes. Humor is a decent barometer of shared cultural understanding for just about everything, indeed, often a better measure than the op-eds and blog posts we usually discuss in the quasi-academic-blogosphere. Those who made this video are themselves trying to tap into mainstream frustrations with smartphones and social media and their increasingly central role in many of our lives. So let’s look at the three main themes being poked at here, and I’m going to do my best to keep this short by linking out to where I’ve made these arguments before.

Twitter Triviality
This is a common trope deployed against all things digital: ebooks are shallow, digital activism is slacktivism, and Twitter usually gets the brunt of it. Whether it is because of its childlike name, the 140-character limit on each tweet, the fast-paced and often ephemeral discussions, or the fact that few Americans actually use the service, the very mention of Twitter in a room often induces laughter. I’d like to ask folks here why the digital is regarded as more trivial in general, and why Twitter draws this attack in particular.

My first reaction is that if someone is tweeting things you find trivial, don’t follow them. Social media is like a radio: it’s all in how you tune it. If your Twitter stream is a bunch of tweets about lunch, you’re doing it wrong. Stop following those people (unless you want to read those tweets). Don’t blame the site for this. Follow great comedians and you get a stream of some of the funniest people telling you jokes all day. Follow top-notch journalists and you get a stream of them breaking and discussing news. For me, it’s being able to read a bunch of smart folks sharing links and discussing them. Ultimately, this is so obvious that there must be more going on here, and I think it has to do with disqualifying certain ways of knowing and speaking. All claims to “triviality” are claims to knowledge and thus power. More on this here, where I take Noam Chomsky to task for missing this point: http://www.salon.com/2011/10/23/why_chomsky_is_wrong_about_twitter/

Stop Tweeting about Life and Live It
This is another popular anti-social-media trope: that we are all talking about life rather than experiencing it. I do think there is merit to this line of critique, but some nuance is needed in its deployment. Here, per usual, is the claim that social media is separate from life: time spent tweeting is time spent not-living. This is a fallacy, what I call “digital dualism,” the incorrect assumption that social media is some other, “cyber” space separate from “real” life. No, social media is real, and, further, research has shown that time spent on social media is often associated with more time spent face-to-face. More on digital dualism here: https://thesocietypages.org/cyborgology/2011/02/24/digital-dualism-versus-augmented-reality/

The IRL Fetish
This is classic “IRL Fetish,” whereby the offline is obsessed over as if it were disappearing, in order to make certain folks oh so very proud of themselves for their magic ability to put their phones away and live “real” life. This is not simply appreciating being logged off, but a fetishization, because it (1) neglects the fact that the offline and face-to-face are actually proliferating in part due to digital technologies, and (2) forgets that turning off the phone and getting away from the screen are not actually logging off at all. Much of what happens when we are not looking at the screen is what we’ll eventually post to social media. More on this here: http://thenewinquiry.com/essays/the-irl-fetish/

Sorry.

Follow Nathan on Twitter: @nathanjurgenson

40% of Twitter users who log in on a regular basis never tweet

Going viral was crippling

cyberpunk romanticisation of the ‘virtual’ plays a cultural role in propping up [digital dualism]

Drones will make traditional fences as obsolete as gunpowder & cannons made city walls

For the poor, there will be cyberspace

Percentage of folks living on a Native American reservation who have internet access: 10

Also, I find it important to make sure someone is real before meeting them, so hopefully you have a FB. This way you know that I am a real person and I know you are as well

“Gangnam Style” signals the emergence of irony in South Korea

The Enterprise crew was driving a misfiring IBM PC in the service of a quasi-neoliberal agenda

Data’s positronic brain doesn’t have Wi-Fi

here’s the order of what was important in my life: 1- Facebook 2- Myself 3- Food / Shelter 4- My gf 5- Family

Desired Skills: Klout Score of 35 or higher

Follow Nathan on Twitter: @nathanjurgenson


PJ Rey just posted a terrific reflection on hipsters and low-tech on this blog, and I want to briefly respond, prod, and disagree a little. This is a topic of great interest to me: I’ve written about low-tech “striving for authenticity” in my essay on The Faux-Vintage Photo, reflected on Instagrammed war photos, the presence of old-timey cameras at Occupy Wall Street, and the IRL Fetish that has people obsessing over “the real” in order to demonstrate just how special and unique they are.

While I appreciate PJ bringing terrific new theorists into this discussion, linking authenticity and agency with hipsters and technology, I think he focuses too much on the technologies themselves and not enough on the processes of identity; too much on the signified and not enough on where the real action is in our post-modern, consumer society: the signs and signifiers.

PJ argues that low-tech has risen in order to reclaim a mastery over things (technologies themselves) that has been lost in late-modernity. I would like to counter that it ain’t about the devices; the rise of retro-tech is about the need to demonstrate mastery and agency over one’s identity (à la Bourdieu’s Distinction, which is the topic PJ’s essay starts with but distances itself too far from, in my opinion). That people may or may not have more agency over their devices is of far less importance than demonstrating identity-authenticity: agency over who you are. I am not an automaton, I am a unique, special, creative, authentic, authentic, really authentic individual.

Identity-authenticity implies identity-agency, and while PJ is right to identify agency in his analysis, it is far more fruitfully applied to identity than to technologies. Perhaps the quickest way to weaken PJ’s argument that low-tech is about some hipster-culture need for ever-increased agency over devices is the parallel popularity of iPhones and iPads in precisely the same communities that embrace low-tech. It is not uncommon to see an iPad riding friendly with a fixie, a Macbook hanging with a typewriter, an Instagrameotype dribbling down an iPhone screen. Apple’s devices are popular precisely because they obscure their inner workings, their manufacture, and their operation. The hipster appreciation of both bikes and smartphones demonstrates techno-agency ambivalence. The want of knowable tech comfortably co-exists with the want for unknowable tech, so long as both the knowable and unknowable devices provide perceived identity-authenticity.*

Diverging from PJ’s essay a bit, note that consumer society has compelled each of us to demonstrate and express our own special uniqueness and authenticity but, at the same time, has given us few tools to actually live up to such creative, snowflake-like, irreplaceable identity-exceptionalism. Instead, we have limited choices for expressing how distinct we are. This is just that old-school paradox of hipster-identity, and perhaps one of the primary causes of hipster-hate: claiming authenticity and rejecting identity-definition while simultaneously, and disingenuously, replicating a pre-set aesthetic. Let’s call it The Urban Outfitters Contradiction: be unique just like everybody else!

Many hipsters grew up in a Disneyfied, McDonaldized, fake-plastic suburban America and have since seen their lives deeply infiltrated by the digital and the intangible. Bikes, especially those without derailleurs, let you travel distinct from those conformist car-drivers; faux-vintage Instagrams lend a special importance that regular digital shots lack; and so on. However, each of these distinctions from the mainstream is no real distinction at all, since everyone else is buying into the same pre-made authenticity pose. Propping that old camera on your bookshelf filled with real, printed, physical books and vinyl records resolves into just a new conformity, and, most importantly, furthers the intense, obsessive drive to find new ways of authenticity. The conundrum is never resolved; instead, we are left with a crisis of agency over our identities (sorry PJ, not over our devices), and we have little to do but squirm ahead trying to demonstrate our precious uniqueness…wallets open.

Follow Nathan on Twitter: @nathanjurgenson

*PJ does not see Apple as a counter-argument, since the devices are simple to use. However, simplicity and knowability are very different things. Often, ease-of-use is gained by making devices less knowable, diminishing your own agency with respect to their manufacture, operation, and use. Further, the valorization of film cameras and other retro-tech is often about making the process more complex and less simple. Is an old-school Kodak Brownie more or less simple? Well, both. Same with an iPhone. Again, focusing on agency over the technologies just makes all of this more muddled, with lots of trends pointing in different directions. The much cleaner argument is not about device-simplicity, because some hipster tech is more and less simple in different ways; instead, it is about the losing battle to assert identity-agency and authenticity.

we may risk, in being so concentrated in demolishing digital dualism, overestimating just how enmeshed the digital and analogue are

I’ve lost remaining tolerance of people who talk about Facebook as if it’s all trivia. Mine is full of death & pain. As well as the mundane

Just had lovely dinner for a friend’s birthday, met interesting people, had a perfect night. No one took any photos. What a waste of time

If there’s anything Americans love more than expensive outdoor recreation equipment, bacon, and wars of choice, it’s innovation

Google is acting like a court, deciding what content it keeps up and what it pulls  — all without the sort of democratic accountability or transparency we have come to expect

how do we build and teach a new form of civics that takes advantage of what seems to work best offline and online?

If TED took a turn to leftist (or any) critique, Žižek, the professor of “toilets and ideology,” would be the keynote speaker

Ten, 20 years from now, the legacy of [Facebook] should be, we have connected everyone in the world

Becoming yourself is largely a matter of becoming someone who is paid attention to

Human self-awareness is multiplying itself onto an altogether new plane

If the internet ideal inspired the protest movements of the past year, it’s little wonder they’re struggling

Instagram is the new go-to platform for saying “I live a full life and here is photographic proof”

technological autonomy may be the single most important problem ever to face our species and the planet as a whole

Facebook’s basic material is the paradox of identity, the principle of self-presentation that can be undone by others

an uncritical embrace of automation, for all the efficiency that it offers, is just a prelude to dystopia

Analog stuff is popular online

Follow Nathan on Twitter: @nathanjurgenson

 

From: Branded For Life – http://www.buzzfeed.com/jackstuef/branded-for-life

In light of the recent Newsweek magazine cover scandal, let’s think for a moment about what a “troll” is and when we should or should not call someone or something a troll. My first reaction to the Islamophobic cover was “trolling. ignore.” That was exactly the wrong reaction.

Trolls, of course, are those who deliberately post inflammatory material in order to disrupt or derail discourse. Declaring something or someone a “troll” is a way of saying that they just want attention. Trolls disrupt productive communication in an attempt to get noticed. The one thing you need to do when this happens: don’t feed the trolls. Don’t. Feed. The. Trolls. It’s good advice. However, because of its mainstream position, I do not think Newsweek is a “troll,” even if it sure as hell is acting like one.

Go read the better news outlets for more of the story, but the short of it is that Newsweek magazine, a once-trusted source of important journalism, and still a ubiquitous presence on magazine racks across the United States, has recently stepped up its efforts to be more provocative and confrontational. Just this year they ran a cover with a terrible photo of Michele Bachmann, declared Obama “The First Gay President,” and this week, worst of all, there is the controversial image proclaiming so-called “MUSLIM RAGE”:

The reaction from so many, including lots of very smart people, has been to say Newsweek is trolling and, thus, that the appropriate reaction is not to pay attention. Remember: we Don’t. Feed. The. Trolls. Newsweek seems desperate for attention, so ignore it, and we’ll all be better off. Salon ran a story earlier today titled “Newsweek trolls again with MUSLIM RAGE”, beginning with,

Look, we all know Tina Brown, editor of Newsweek, is simply trolling us, because trolling is the only way for a weekly newsmagazine to get any sort of attention anymore.

I disagree.

One important characteristic of trolls is that they are outliers trying to hijack the mainstream conversation. Newsweek can’t derail the central discourse because it is (unfortunately) part of the center of that discourse. Newsweek can’t troll this topic any more than I can troll my own blog, regardless of what I post. Those smart media pundits saying Newsweek is trolling forget the magazine is part of the mainstream conversation. Many forget that, even though they removed Newsweek from their finely tuned and abnormally selective news diets, the magazine continues to exist, racist cover and all, in front of millions paying for their groceries. Those of us following sharp journalists on Twitter and scanning political blogs forget that millions continue to either read Newsweek or at least scan its covers to get a sense of global politics. Newsweek‘s circulation is around 1.5 million according to Pew, to say nothing of its website and social media streams.

Simply put, Newsweek is mainstream. And we should not ignore mainstream news outlets.

Calling Newsweek a “troll” and saying not to “feed” it with attention, even negative attention, forfeits the bigger, mainstream news conversation to these reckless news magazines. A couple of days ago a popular post by Joel Gascoigne called “the power of ignoring mainstream news” received a lot of attention and touched a similar nerve in me. I agree much mainstream news, including Newsweek, is problematic, but that is exactly why we shouldn’t ignore it. Don’t overlook it; critique it. Work to make it better. Don’t concede involvement with the mainstream discussion much of the country will have just to be self-satisfied with your own personal news diet. We have a responsibility to pay attention, be critical, and call out powerful narratives when necessary. The influence these outlets have is real.

To be clear: never feed the trolls. The fact that we need to take on (that is, “feed” or give attention to) mainstream outlets like Newsweek is precisely why we should not call them “trolls,” even if their behavior looks a lot like trolling. Trolls derail the established discourse, and, like it or not, Newsweek is part of that discourse for much of the country. By calling Newsweek a “troll,” one excuses oneself from engaging mainstream conversations and thereby does nothing to make them better. It excuses us from taking on (“feeding”) these voices of power. No, pay attention to Newsweek, beat it up, and perhaps then it’ll be knocked so far from the mainstream that we can finally stop feeding it.

Number Of Users Who Actually Enjoy Facebook Down To 4

In order to be profitable, it is highly likely that Twitter can only get more annoying, Pandora can only get more interrupt-y, Tumblr can only get more cluttered, Facebook can only get more devious

Grindr officially announces its plan to mobilize gay men as a political bloc in the 2012 elections

I can’t put Twitter or the little blue bird in jail, so the only way to punish is monetarily

About four grams of DNA theoretically could store the digital data humankind creates in one year

Google Glass is changing the implicit social contract with everyone in his or her field of view

Having opened up a chasm between the informational and material, we’re rapidly trying to close it

Imagine being excited to see what the Internet looks and feels like in a new town

remote sensing and screen culture might displace today’s commonplace demand for airbuses

[Academics] quickly devolve into a game of Who’s The Best Luddite. And it is most definitely about hierarchy & power

The site, just a few weeks old and still in beta, consists entirely of videos uploaded by real people having what might be called nonperformance-like sex

human beings have not always tried to make sense of emotions through numbers

the hate-mongers who made this video and those who use the provocation as a pretext to kill are in a symbiotic, mutually reinforcing relationship

it appears that identity-based search results could be nothing more than old bigotry packaged in new media [pdf]

Follow Nathan on Twitter: @nathanjurgenson

via The Onion

Giorgio Fontana (b. 1981) is an Italian writer, freelance contributor, and editor of Web Target (http://www.web-target.com/en/). His personal website is www.giorgiofontana.com. On Twitter: https://twitter.com/giorgiofontana.

In some very stimulating articles – mainly this one – Nathan Jurgenson has convincingly argued against what he calls digital dualism: that is, the idea that “the digital world is virtual and the physical world real”:

I fundamentally think this digital dualism is a fallacy. Instead, I want to argue that the digital and physical are increasingly meshed, and want to call this opposite perspective that implodes atoms and bits rather than holding them conceptually separate augmented reality.

I’m with him: this is one of the most productive critiques I’ve read of the way we look at the digital, and I also consider this dualism untenable. But the more I focused on the issue, the more I heard an alarm bell ringing: what exactly is Jurgenson suggesting here? Is he speaking from an ontological point of view or a sociological one? What exactly does he mean by saying that atoms and bits mesh together to “create reality”? Etc.

In a recent post, Whitney Erin Boesel did a kind of mind reading, summing up all my perplexities very nicely. That is, she asked everybody to stop for a moment and reflect on a serious hole in the whole program:

while Team Augmented Reality does a great job of explaining the enmeshment of ‘online’ and ‘offline’, and what the difference between ‘online’ and ‘offline’ isn’t, we need to do a much better job of explaining clearly what the difference between ‘online’ and ‘offline’ actually is. While the precise nature of the difference may not need to be spelled out for those of us who already embrace an augmented reality framework, not spelling it out leaves too much room for misreadings and misinterpretations of our work. If we want to make a dent in pervasive digital dualism, we need to address this theoretical hole.

I very much agree: this is a crucial step. Luckily, I think the confusion is mainly terminological, and it just requires a closer examination of what is generally left as “understood” – but is not, as Boesel states. My aim here is precisely to take a first step in this direction.

***

The main problem, as noted, is that in this discussion we lack a settled definition of both digital and online. Let’s begin with the first. Again following Jurgenson, the counterpart of “digital” is not “real” (they’re not opposed, as reality includes what’s digital). So it will probably be “analogue.” But what exactly do we mean by the digital/analogue dichotomy? In my opinion, the first step is to carefully distinguish the ontological stance from the phenomenological and sociological one.

We – human, epistemic agents – experience the world as analogue. We do not perceive an array of numbered colors, but an indefinite set of shades: in our Lebenswelt there is no place for discrete units, only for continuous things and actions. But that is the phenomenological level: what reality is in itself is rather a different question.

According to Ed Fredkin’s digital philosophy, for instance, the ultimate nature of reality is actually made of bits (John Wheeler once summed this vision up with a very nice phrase: “It from bit”). Everything in the world derives from or is made of bits – a binary choice between this or that, 0 or 1, black or white. However, Luciano Floridi (whose work on information I highly recommend reading, perhaps starting from this short introduction) argues that it’s impossible to decide whether reality is ultimately digital or analogue. In his opinion, with which I agree, this is nothing but a new edition of the old Kantian discrete/continuous antinomy of Reason: the answer just depends on the level of abstraction from which we look at it.

Thus we can leave aside the ontological issue, and concentrate more on the sociological one. I am going to propose two working definitions for the two hottest terms.

Online is the less problematic of the two. It refers to any activity or entity that needs a connection to the internet to exist. A tree can exist without the net; a tweet simply can’t.

Digital is a bit more complicated. From a technical point of view, “digital” describes a data-based system – that is, a system whose basic level is discrete (made of single units that cannot be divided any further). According to the classical theory of information, this basic level is a difference between 0 and 1, with nothing in between: bits (or informational atoms, if you like). For digital ontologists, this is the basic core of reality. But from a sociological point of view, “digital” generally refers to an activity, or a part of reality, which is directly connected to some digital device. I acknowledge that this is a very loose and broad-brush definition, but it may work as a starting point for further refinements.
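To make the technical sense of “discrete” concrete, here is a minimal sketch – my own illustration, not part of the argument above, with an arbitrary function name and an arbitrary choice of 16 levels: a continuous quantity is forced onto a fixed set of steps, each expressible as a string of 0s and 1s, and whatever falls between two steps simply cannot be represented.

    # Hypothetical illustration of "discrete": an analogue value (a continuous
    # number) is mapped onto one of a fixed number of steps, each of which can
    # be written as bits. Shades falling between two steps collapse into the
    # same code.
    def digitize(value, levels=16, lo=0.0, hi=1.0):
        """Map a continuous value in [lo, hi] onto one of `levels` discrete steps."""
        clamped = min(max(value, lo), hi)
        step = round((clamped - lo) / (hi - lo) * (levels - 1))
        return step, format(step, "04b")  # the step and its 4-bit representation

    print(digitize(0.3337))  # (5, '0101')
    print(digitize(0.3338))  # (5, '0101') – the in-between difference is lost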

In general, it’s crucial to understand that “digital” – and “online” as well – is not opposed to “real” in an ontological sense, nor is it opposed to “real” in an axiological sense (“digital/online relationships are less real,” that is, less authentic).

But now I want to focus on an aspect that has perhaps been a bit underestimated: surely the distinctions online/offline and digital/analogue are getting very blurry, and they will only get blurrier in the future, as our technology carries on pervading the world at an astounding pace. But, in my opinion, it is very important to keep these distinctions – and, most of all, to make crystal clear the way and the standpoint (sociological, ontological, etc.) from which we are studying them. Why?

Luciano Floridi

Kill Digital Dualism, But Don’t Erase Differences

Well, if only for methodological reasons: we may risk, in being so concentrated on demolishing digital dualism, overestimating just how enmeshed the digital and analogue are, and assuming uncritically that this dichotomy is already over.

But it’s not.

There are still things which are only analogue – a flower, a death, a book, a night with a friend are analogue in themselves. And there are things which are only offline: a person who has never been on the web, or a text that has never been transmitted over the internet. Sure, these kinds of things can encounter the digital world – for example, they can easily be narrated or augmented: a video about the night with my friend, a blog post about that sad death, a status about the book I’m reading, etc.

And maybe in the future it will be harder to distinguish analogue origin from digital augmentation; with time, we may gradually and happily dismiss the distinction. But this is not a good reason to think that all things are already digital. The very copy of Rilke’s Notebooks of Malte Laurids Brigge owned by my grandfather and published in 1940 is not digital. And my grandfather himself is not digital either. (Sure, you can observe that the fact that I’m mentioning them here makes them somehow digital or online, but that seems a very slippery argument to me.)

Jurgenson is well aware of this point (see his rejoinder to Alessandro Caliandro, for instance): so this is not a critique, but rather advice to keep this difference firmly in mind, in order not to become slaves to a brand new prejudice. (According to Luciano Floridi again, “we are probably the last generation to experience a clear difference between offline and online”. I agree. But it will take a little more time, and while we’re waiting for a complete entwining of digital and analogue, we should always try to keep them separate.)

Digital, from a looser point of view, may mean all the activities and parts of one’s identity which are primarily based on some silicon resource: computers, tablets, the internet, etc. Yes, it’s increasingly difficult to make a clear distinction between what’s digital and what’s not in our daily life (especially for the younger generations), but it’s still important to do so, if for no other reason than to clarify what dichotomy we’re rejecting.

Digital and analogue, online and offline, are only going to become more and more entwined. But let’s remember that a lot of people in the world don’t even own a computer or any other digital device. Many cannot access the internet. A proper theory of the interaction between online/offline and digital/analogue shouldn’t be so Western-centric and naive; it should work just as well in a remote village in Nigeria (or in Italy, for that matter).

 

Augmented Reality?

Finally, a word about the term “augmented reality.” I think it can be a very good label if used to reject digital dualism, making clear that the digital and the physical can be enmeshed to create a different and more powerful narration. But I also suggest going a bit further: reality is reality – it’s just one – and maybe the adjective “augmented” should be considered just a temporary placeholder. In fact, some misunderstanding can arise if we focus on the noun+adjective construction of the term: it looks like there is “the real reality” and then we somehow “augment” or “enhance” it with digital stuff. A digital dualist would definitely agree!

Actually, Jurgenson has already faced and responded to this kind of criticism in his answer to Sang-Hyoun Pahk:

we will continue to describe how reality is differently augmented by digital social media than by other technologies. This does not create a dualism of reality versus augmented reality, but instead a view of reality as always a multiplicity of augmented realities coming in many flavors. The important task is not describing if, but instead how and why augmentation occurs the way it does.

Again, I agree with him. But I still think that the term “augmented reality” may convey a kind of semantic discomfort: in my opinion, with time, we will need a new vocabulary able to preserve both the uniqueness of reality and the variety of its representations and augmentations. Or, at least, to ease this discomfort we could make a little clearer the methodology we’re using to analyze reality – which is, as I think we all agree, one and “external” to us, but also dependent on our conceptual schemes – and, most of all, the kind of question we’re answering when we talk of “bits vs. atoms” or “augmentation”: ontological, phenomenological, or sociological. (Again, in my opinion we can profitably use Floridi’s method of levels of abstraction here.)

 

Open Problems

To sum up: while the distinction between “digital” and “real” is wrong – and Jurgenson has excellent arguments against it – I think that the dichotomy between “digital” and “analogue,” as well as the one between “online” and “offline,” is tenable, granted that we take it as an epistemological one. “Real” can mean “belonging to reality” (ontology) but also “authentic” (a real friend – sociology, folk psychology, etc.). Jurgenson is right when he says that digital is not opposed to real in either sense: a conversation on Facebook or on Skype is no less real and no less authentic than a face-to-face one. It’s just different; really different: while rejecting the naive idea that it is inauthentic or unreal, we should also consider carefully what changes between these two ways of interacting.

The exciting thing is that this implies a wide range of open problems: philosophical (what is the epistemic status of search engines? what would a web-extended mind be like?), psychological (are we sure that being hyper-connected does not affect our attention or create a culture of distraction?), sociological (what kind of aggregation are social networks?), cultural (the new Kindle serialized fiction is “part Dickens, part TV”: what’s at stake?), political (I myself have argued against the ideal of a “digital democracy,” starting from the idea that blog comments are a poor design for fueling good conversation). Etc.

It’s a great time to be a thinker.

“we are probably the last generation to experience a clear difference between offline and online”

technologically-mediated storytelling is every bit as world-destroying as it is world-creating

75 percent of all [Wikipedia] articles score below the desired [Flesch] readability score

We all participate in this strange authorship of the now

Anonymous is reminding you that their fight will soon be your fight, if governments & corporations get their way

the answer doesn’t lie in getting paid to blog, but in relearning how to circulate our food and water as freely as our .gifs

to really understand “the Internet” we need to forget it as a unified “it” altogether

The porn industry is on the same trajectory as all media: content itself no longer holds value

Furby actually makes you want to hurt it somehow—if only it had feelings—so that you can punish it for existing

the internet hive mind might begin producing a new kind of anti-gonzo journalism

personal relationships seem to be the blurry edge of a quantified field of vision

All physical spaces already are also informational spaces

Follow Nathan on Twitter: @nathanjurgenson


via http://marlomeekins.tumblr.com/post/30482634505