
From Haley Morris-Cafiero’s Wait Watchers project

Last week, Haley Morris-Cafiero, a photographer and college professor, wrote an article for Salon.com about an ongoing project, five years in the making. Morris-Cafiero’s project is to document those who mock her because of her body size. She selects a public venue, sets up a camera in full view, and has her assistant snap photos as Morris-Cafiero engages with the world under the derisive gaze of fatphobic publics. One image shows a teenage girl slapping her own belly while intently staring at Morris-Cafiero eating gelato on a sidewalk in Barcelona; another shows two police officers laughing as one stands behind her holding his hat above her head; a third shows her sitting on bleachers in Times Square, a man a few rows back openly laughing at her as her picture is taken. The project is called “Wait Watchers.” more...

Cable news is dead, but something keeps animating the corpse

Human genes do not augment the body, they are the body

memes circulate us rather than vice versa

many of the declarations whizzing around Boston look like sympathy but smell like attention-seeking

social networking sites are not a separate realm of political activity

We need a multitude of what I call “Denial of Positivism (DoP)” attacks from various directions

the Google car was treated with deference no matter how recklessly we drove

iPad painting: just one of the many similarities between George W. Bush and Churchill more...

Digital Divide

1. The digital divide is so over that it’s passé

This is a common trope I hear at conferences, whether academic or otherwise. Before presenting at the American Sociological Association annual meeting last year, I got feedback from colleagues that I should explain what in the heck the digital divide is before launching into its connection to online activism. Huh? We are sociologists – we have all read Marx. Inequality is one of the pillars that hold up our discipline. We wouldn’t know what to do without gender, class, and race gaps. Why should the Internet be any different from the rest of society?

But I’ve been told to always listen to my audience, who need a gentle reminder that digital inequality is alive and kickin’. But what is it, exactly? more...

(This is not the dive bar in question)

I’ve been thinking a lot over recent weeks about digital media, smartphones, and absence-vs.-presence, all of which was compounded by an interesting experience I had last weekend. On one particular night, 1:00 AM found me in a Lower East Side dive bar playing pinball with a friend from Brooklyn and a friend from D.C.; I was also chatting with a third friend (who was in D.C.) via text message and Snapchat between my pinball turns, and relaying parts of that conversation to our two mutual friends there with me in the bar. More people joined us shortly thereafter, madcap shenanigans ensued and, sometime around stupid o’clock in the morning, I started the drive back to where I was staying.

As I was getting up the next day, I recalled various scenes from the night before. One such scene was from earlier in the night at the dive bar: getting to hang out with three people I don’t see often was a nice surprise, and how neat was it that we’d all gotten to hang out together? A few seconds later, however, it hit me that my mental picture of that moment didn’t match my memory of it. What I remembered was being in the dive bar spending time with three friends, but I could only picture two friends lit by the flashing lights of so many pinball machines. I realized that Friend #3 had been so present to me through our digital conversation that my memory had spliced him into the dive bar scene as if he’d been physically co-present, even though he’d been more than 200 miles away.

I wasn’t entirely sure what to make of this. On the one hand, yay: My subconscious isn’t digital dualist? more...

Facebook and Twitter, like any other form of communication, can be used to forge solidarity. As philosopher Richard Rorty reminds us in Method, Social Science, and Social Hope, one of the boundless powers of the humanities and of storytelling—novels, journalism, ethnographies, photography, documentaries—is to grow our imaginations so that the norms which would exclude foreigners, or the poor, or minorities, are replaced with a solidarity against suffering. In stories like Native Son, The Diary of Anne Frank and Brokeback Mountain, the cruelties suffered by those who are not familiar to us are described in astonishing, bright detail. The humans who populate Dirty Pretty Things, Sin Nombre and How to Survive A Plague become less distant, more familiar. Through imagination, their suffering becomes ours. In many instances, networked media facilitate this kind of sensitivity building, this form of democratic attunement. But under the ceaseless pressure of shareability and virality, tragedy on social media often resembles disaster porn: a ghastly vine, a sappy post, attention-seeking hashtags, mistaking the spread of symbolic images for enduring political achievement.

That grief is best endured in groups was not lost on those involved in the Boston Marathon or on those who experienced it through networked media. more...

blogging

In this post I attempt to tackle a complex but increasingly important question: Should writers cite blog posts in formal academic writing (i.e. journal articles and books)? Unfortunately, rather than actually tackle this question, I find myself running sporadically around it. At best, I bump into the question a few times, but never come close to pinning down an answer.

To begin with full disclosure: I cite blog posts in my own formal academic writing. But not just any blog posts. I am highly discriminating in what I cite, but my discriminations are not of the cleanly methodical type which can be written, shared, and handed out as even a suggested guide. Mostly, I cite Cyborgology and a select few blogs that I know really really well. I have done so in my last three formally published works (two of which are encyclopedia entries), and successfully suggested blog posts to others via peer review. When pressed for a rationale (as I have been in conversations with colleagues), I less-than-confidently ramble something like Well I mean, I know these bloggers to be good theorists, and I find their work useful for my own. Some of their work is published only in blog form, and I need those ideas to build my argument. I also don’t want to ignore something good that I know is out there. But I mean, I know there are other good things out there that I don’t know about, or don’t know enough to trust. And I know I’ve written bad ideas on Cyborgology, or ideas that I further developed later, so I guess quality is not a sure thing, but reviewers and editors have accepted it so…[insert sheepish grin]. more...

Original picture of control room from Flickr user llee_wu, edited and used by the author under Creative Commons

The very fact that your eyes rolled (just a little bit) at the title tells you that it is absolutely true. So true it’s obnoxious to proclaim it. Perhaps cable news died when CNN made a hologram of Jessica Yellin and beamed her into the “Situation Room” just to talk horse race bullshit during the 2008 election. Or maybe it was as far back as 2004, when Jon Stewart went on Crossfire and shattered the fourth wall by excoriating the dual hosts for destroying public discourse. The beginning of the end might be hard to pinpoint, but the end is certainly coming. Fox News had its lowest ratings since 2001 this year, but still has more viewers than CNN & MSNBCNEWSWHATEVERITSCALLEDNOW combined. Even if ratings weren’t a problem, credibility certainly is. Imagine if CNN stopped calling themselves the “Most Trusted Name In News” and used the more accurate, “A Little Over Half of Our Viewers Think We’re Believable.” By now it is clear that the zombified talking heads of cable news are either bought and sold, or just irrelevant. Cable news channels’ hulking, telepresent bodies have been run through and left to rot on the cynical barbs of political bloggers and just about anyone at a comedy shop’s open-mic night. This last series of screw-ups in Boston (here, here, here and, unless it was avant-garde electronic literature, here) raises the question of whether cable news channels can even tell us what’s going on anymore. Cable news is dead, but something keeps animating the corpse. more...

On the whole, academia is quite anti-popular writing

Mainstream sociology & history have a bias towards thinking that nothing is new, ever, & thus ignored the internet

Use the emoticons & gift-wrap your message for data-miners or stick to plain English & limit yr audience to humans

I was hoping one of my cat vines was popular, not this tragedy

Think very carefully about whether tragedies belong on Vine, and about whether you should put them there

Goatse was the perfect totem for a burgeoning web culture that prized free speech and unpredictability

Jenna Marbles already embodies the future of celebrity

Digital dualism can blind us to the real and serious problems of online vigilantism

Facebook invites us to forget we even had a self before Timeline was there to organize it

the emoticon scheme makes us shoppers for new, bonus feelings à la carte

Nathan is on Twitter [@nathanjurgenson] and Tumblr [nathanjurgenson.com]. more...

genes

Are human genes patentable? At the beginning of this week, the Supreme Court (SCOTUS) heard arguments to answer just that question. Specifically, the biotechnology company Myriad Genetics, Inc. wants to defend its patent on the isolation of BRCA1 and BRCA2—two genes related to hereditary breast and ovarian cancers. Such a patent grants the company 20 years of monopoly control over the genes for research, diagnostic, and treatment purposes. A group of medical professionals, scientists, and patients is challenging the patent.

The criteria for a medical patent are such that while tools, medications, laboratory-produced chemicals, etc. can be patented, “Nature” cannot be patented. That which is patented must therefore be created, not merely discovered (regardless of how costly or effortful the discovery). Opponents of the BRCA patent often invoke Jonas Salk, who famously said in response to the potential patenting of his polio vaccine: “There is no patent. Could you patent the sun?” more...

There’s something surreal about Vine. There’s something surreal about repetition, about the quality of looping. Short loops are the halfway point between still image and image in motion; they are also the spaces in which the distinction between the two breaks down. Watch a vine and watch shards, fragments of time yanked out of time and endlessly circling back on themselves, an aesthetic Ouroboros. The bland and innocuous: food, laughing friends, concerts, cats doing stupid things. People doing stupid things. You know, stuff. On endless repeat.

Explosions on endless repeat.

more...