When Sarah Palin endorsed Donald Trump, it provided political commentators with a goldmine of analytic fodder. There is a lot to untangle in the Palin-Trump team-up.
For instance, how do we make sense of a political climate in which the 2008 vice presidential candidate, who so damaged the presidential campaign of her running mate that he could barely mask his contempt for her on election night, is now a desirable connection?
Or what dynamics were in play that pushed Palin to Trump rather than Cruz, especially given Palin’s support of Cruz in his Senate bid?
Or could her endorsement backfire, finally impressing upon moderate Republicans the urgency of nominating Rubio or Bush? And relatedly, what’s up with Rubio falling into the mainstream/moderate category?
While commentators touched on a few of these things, largely, the conversation was dominated by another topic entirely: Sarah Palin’s sweater.
To be clear, I’m not Ewwwing her sweater. In fact, I refuse to comment on her sweater. Because I’m not a sexist asshole. Rather, I’m expressing my disgust that mainstream Republicans and self-proclaimed progressives alike coalesced on social media to discuss a politician’s wardrobe—its attractiveness and appropriateness—and that this story then became the content of broadcast news. I heard CNN news anchor and political correspondent Dana Bash spend about 3 minutes on the sweater, its other appearances (CBS Sunday in November), and how the sweater fits into Palin’s larger repertoire of wardrobe choices. Mashable ran a story on the price of the sweater. And the Washington Post gave the sweater an in-depth analysis. Ewww.
Here’s the thing. Social media are heralded as a democratizing force. The voices of the people flow into the voices of broadcasters, spreading into national and international conversations in ways previously unavailable. Social media are a mechanism for the people to speak in ways that project out into public life. This is powerful. This has been a major tool in important revolutionary movements from the Arab Spring, to #Occupy, to #BlackLivesMatter. It’s how the public insisted upon a serious discussion of interpersonal violence following the infamous Ray and Janay Rice incident. It’s where Hillary Clinton’s evocation of 9/11 to account for her ties to Wall Street was noticed, critiqued, and given back to her with striking immediacy.
But we don’t live in a media democracy. Broadcast media still have disproportionate control over what does and does not become news. Social media provide content pools, which broadcast outlets sift through to select what they will or will not address. This means the relationship between social and broadcast media is of a curatorial nature, with broadcasters maintaining more power than those from whom they pull stories. This makes broadcast outlets responsible parties. When they don’t pick up an important story, they are accountable. When they do pick up a story that should have remained in the ether, they are also accountable. Sarah Palin’s sweater was a story that should have withered and died.
Commenting on professional women’s clothing is sexist. It highlights fashion and beauty while backgrounding substance. It wasn’t okay with Hillary Clinton’s pantsuits and it’s not okay with Sarah Palin’s sweater. It is disappointing, if not surprising, that sexist commentary arises among the populace. It is unacceptable, however, when that sexism translates into a headlining story.
Turn on your TV and I bet you can find a show about Alaska. A partial list of Alaska-themed reality shows airing between 2005 and today includes Deadliest Catch, Alaskan Bush People, Alaska: The Last Frontier, Ice Road Truckers, Gold Rush, Edge of Alaska, Bering Sea Gold, The Last Alaskans, Mounted in Alaska, Alaska State Troopers, Flying Wild Alaska, Alaska Wing Men, and the latest, Alaska Proof, premiering last week on Animal Planet, a show that follows an Alaskan distillery team as they strive to “bottle the true Alaskan spirit.” And with Alaska Proof, I submit that we have saturated the Alaskan genre; we have reached Peak Alaska. We may see a few new Alaska shows, but the genre is likely on the decline. I don’t imagine we have many Alaskan activities left unexplored.
Television programming remains a staple of American leisure, even as the practice of television watching continues to change (e.g., it’s often done through a device that’s not a TV). As a leisure activity, television is expected to entertain, compel, and also provide comfort. Which content and forms entertain, compel, and comfort shifts with cultural and historical developments. Our media products are therefore useful barometers of the zeitgeist. Marshall McLuhan argues in The Medium is the Message that upon something’s peak, when it is on the way out, that thing becomes most clearly visible. And so, with Alaska peaking in clear view, I ask: what does our Alaskan obsession tell us about ourselves?
In the 1980s and early ’90s, the family sitcom reigned. These years held the ideological pinnacle of neoliberal individualism. Materialism ruled, regulations waned, and shoulder pads helped each American take up a little more space. By my count, Time’s 2006 designation of “You” as the person of the year was about two decades late. Around this time, we also saw a quickly changing family structure. Divorce was on the rise, family size on the decline, and dual incomes increasingly necessary and normative.
With anxieties surrounding shifting family life, it’s no surprise that the Full House/Family Matters genre rose to prominence. People wanted to sink into their couches to enjoy 30 open-and-shut minutes of close-knit characters who cared for each other and solved disputes with a quiet knock on the bedroom door, some short self-reflexive dialogue, and a warm hug, finally relieving the tension with a shucks-worthy joke. It was a dose of wholesome. Warm and safe like a diagonally cut grilled cheese and a tall glass of milk.
Today, we’ve moved beyond the formulaic family comedy. We want complex characters and believability. We expect continuity and semiotic ambiguity, the kind of programming that spurs debate and post-show discussions with fans, creators, and actors. Or alternatively, we want voyeuristic satisfaction, long-form documentaries in the form of 50-minute segments spread across 12 to 16 episodes. But that doesn’t mean we’ve stopped seeking comfort. We still need our entertainment to offer reprieve from the troubles that worry us in everyday life, the concerns that accompany societal change. We still want fantasy and escape, even while demanding a realist lens.
The most obvious of contemporary shifts comes in the form of digitization and automation. We are living through the digital revolution, and people aren’t sure what that means, but they know things have changed and will never again be the same.
Our conversations needn’t require saliva. A day of work may elicit tears, but rarely blood or sweat. Dirt under the fingernails is more likely to originate in a community garden than on a factory floor or family farm. Our muscles may be sore, but mostly from yoga, and we can soothe ourselves with a scented bath and monthly massage membership. The gritty physicality of Alaska shines brightly against the sterile experience of everyday life here on the mainland. We are clean and soft—at least those of us with the disposable time to binge watch and the disposable income to buy the commercial goods that drive programming decisions—and we are pretty sure this means something has been lost.
To be clear, our use of technology is far from clean. It is, in fact, incredibly dirty, requiring intense physical labor and causing extreme environmental strife. But we don’t want to know about the dirt and we don’t feel it in our everyday lives. And so we watch Alaskans. Or at least our fantasy of Alaskans. Those majestic creatures unsoiled by streaming News Feeds or celebrity gossip.
We watch them toil on the land. Hunt for their food and can jars of honey. We watch them barter instead of buy, and fashion Christmas gifts out of beaver parts. We watch them do things that most of us have not the skill, opportunity, or need to practice.
Alaskans are a specimen of anthropological fascination. They give us back to ourselves as we never were, and know we never will be. They are a vestige of the rugged individualism that drives the American value system. They ease us with their living-off-the-land, even as we livetweet their experiences. We watch with reverence, our heads tilted in slight confusion. And we watch desperately, for fear that these relics, these strange exemplars of the simple life, where a hard day’s work is its own reward, will remain only as part of the historical record.
As a rule, parents tend to experience concern about their children’s wellbeing. With all of the decisions that parents have to make, I imagine it’s near impossible not to worry that you are making the wrong ones, with consequences that may not reveal themselves for years to come. This is why recommendations from professional organizations are so important. They offer the comfort of a guiding word, based presumably in expertise, to help parents make the best decisions possible.
The American Academy of Pediatrics (AAP) is one such organization, and it has some things to say about children and screen time. Those recommendations will be revised in the next year, and no doubt many parents will listen intently. NPR interviewed David Hill, chairman of the AAP Council on Communications and Media and a member of the AAP Children, Adolescents and Media Leadership Working Group. Although Hill did not reveal what the new recommendations would advise, the way he talked about the relationship between screens and kids revealed a lot about the logic that drives them. That logic makes clear, once again, the need for theory to inform practice. More specifically, those who make practical recommendations about technology should consult with technology theorists.
The sneaky thing about assumptions is that they inform our way of thinking without letting on that they are doing so. A theorist is trained to identify underlying assumptions and question how they shape subsequent action. From Hill’s interview, three assumptions stand out: screens are a singular category; “screen time” is inherently harmful; and digitally mediated play is distinct from analog forms of play, the latter more “real” than the former.
The most glaring (and easiest to problematize) assumption is that screens occupy a singular category. Recommendations don’t apply differentially to iPads, televisions, or phones, let alone to the immense diversity of media content that each piece of hardware hosts. Of course, condensing screens into a singular category is likely done for reasons of parsimony—busy parents don’t have time to read nuanced recommendations about the full variety of hardware, software, and content available. But the sheer volume of different kinds of screens, and ways of using them, should perhaps give pause to anyone attempting to give recommendations about them as a categorical unit. Maybe sweeping recommendations aren’t going to be particularly useful.
A second assumption is that screen time is inherently harmful. In the interview, Hill sets up a debate between control over screen time and wholesale screen abstinence, using food and tobacco as the competing metaphors:
The question before us is whether electronic media use in children is more akin to diet or to tobacco use. With diet, harm reduction measures seem to be turning the tide of the obesity epidemic. With tobacco, on the other hand, there really is no safe level of exposure at any age.
Although Hill takes the more moderate “food” approach to technology, this still presumes that technology is a dangerous thing to be managed. The assumption of compulsory harm constructs a debate between avoidance and minimization; it depicts technological advancement as an unfortunate juggernaut with which we are forced to contend. Such a depiction robs technology of the opportunity to be good, and forecloses research into, and recommendations about, the complex conditions under which various media are harmful or beneficial, how, and for whom. Further, it constricts the measure of harm and benefit to metrics rooted in analog styles of learning and development. Studying the effects of new media upon old ways of thinking makes little sense, but it does not appear that the AAP is considering the dynamism of the human mind as it adapts to changing cultural realities. A new kind of world requires different kinds of thinking and different kinds of skills. Spelling is less important; information sorting is more important. Intense focus on a single task is needed sometimes; other times it pays to spread attention among multiple, fast-moving targets.
Finally, the AAP seems to operate under the assumption that digital is distinct from physical, and that digital has less substance. In hazarding a preliminary recommendation for parents, Hill advises:
If [children] color or read or play basketball or ride their bikes, take some time to ask them about what they’ve done and why they enjoyed it. These conversations will help them focus on the joys of the “real” world, and they will notice that their activity attracts your attention.
This advice is a clean and clear example of digital dualism, a fallacy we regularly point out and critique on this blog. Concisely, digitally mediated play is just as real as non-digitally mediated play, and digital and analog play can co-occur.
Hill and the AAP offer an important service. This is why I implore them, and others who help people navigate digitally infused terrains, to talk with people who think about these issues more broadly. A theorist of technology brings a critical eye that can be of use to those with particular empirical expertise who wish to act on matters of practical concern. In this case, a technology theorist might offer some solace to parents: your kids are probably going to be okay. Don’t fret. Pay attention to them and pay attention to what they do. Screens aren’t the enemy.
Marvel’s Jessica Jones is a dark and reluctant hero. An alcoholic private detective, Jones largely underutilizes her super-human physical strength when we meet her in the opening episode of the Netflix series. As the story unfolds, we learn that Jessica self-medicates to deal with a traumatic past in which a man named Kilgrave, who controls people with the use of his voice, held Jessica captive as his lover while forcing her to engage in violence and even murder. Their relationship ended when Jessica was finally able to resist his control—a quality unique to her—and Kilgrave was hit by a bus, leaving him presumably dead. The storyline of the first season is premised on Jessica learning that Kilgrave is still alive, has captured another victim, and is coming to reclaim her. In turn, Jones hunts for Kilgrave to ensure that he dies, once and for all.
About halfway through the season Jessica realizes that Kilgrave is tracking her whereabouts by controlling her friend and neighbor Malcolm Ducasse. To wrest Malcolm from Kilgrave’s control, Jessica strikes a deal. She agrees to send Kilgrave a selfie at precisely 10am each day. At his direction, Jones even includes a smile.
Jessica Jones’ selfie is a significant cultural artifact. With super-human physical brawn and impenetrable emotional toughness, Jessica Jones is an icon of strength. Jones’ image—how it looks, who it’s for, and how it’s produced—represents the potential of feminist self-documentation. It therefore sheds light on what a selfie can do given the tangled relationship between feminism and patriarchy in self-documentation.
The selfie has become a key battleground for gender politics in a digital age. Although front-facing cameras are for everyone, cultural tropes most often place them in the hands of women. The selfie then becomes a vehicle for the critique of femininity. The selfie-posting woman is vapid, needy, and hungry for Likes. As Anne Burns explains:
Beyond a critique of photographic form or content, the online discussion of selfies reflects contemporary social norms and anxieties, particularly relating to the behavior of young women. The knowledge discursively produced in relation to selfie taking supports patriarchal authority and maintains gendered power relations by perpetuating negative feminine stereotypes that legitimize the discipline of women’s behaviors and identities.
Combating the haters, feminist media commentators and scholars (like Burns) offer alternative readings of the selfie as an expressive cultural form. Counter readings of selfies generally take two tracks: Concern (We’re Fucked) and Confidence (Fuck You).
Concerned feminists worry about the meaning of selfies. Selfies are not an indictment of those who post them, but of a culture in which the worth of women and girls continues to hinge on sexual desirability. The selfie then signifies complicity in patriarchal reproduction. That is, selfies are a consequence of women’s pervasive subordination. In this vein, Erin Gloria Ryan at Jezebel calls selfies a “cry for help” and rejects any notion that selfies are good for women:
Selfies aren’t empowering; they’re a high tech reflection of the fucked up way society teaches women that their most important quality is their physical attractiveness.
The Confidence crew, however, insist that turning the camera on the self is a way to take control of one’s own image, lest that image remain captured and configured by an amorphous, patriarchal, controlling, Other. The selfie is a source of Gurl-Power. For Amy McCarthy, selfies are a political act of both feminist strength and personal confidence, especially for those who don’t fit normative body ideals:
When you look in the media… there are very few examples of fat, trans, or dark-skinned women. As such, selfies present themselves as a way to make our bodies visible, and in a radical way. So you take your selfies, peeps — they’re one way to say “fuck you” to the body standards that have made us miserable for so long.
Jessica Jones’ selfie, at once an expression of agency and subservience, is a microcosm of the selfie phenomenon more generally. So tell us, Jessica, what does it mean to take a selfie? Is it empowering or is it self-inflicted oppression? The answer, of course, is “yes.” It is both.
By turning the camera on herself, Jones freed herself from external surveillance, freed Malcolm from Kilgrave’s service, and took power over her own image. When watching is ubiquitous, showing becomes the agentic option. While surveilled through Malcolm, Jessica could be photographed in any moment. The surveillance was potentially everywhere, all the time. As Foucault so clearly illustrates, potential surveillance is a powerful mechanism of control. When the surveilling eye remains hidden, all moments are documentable and therefore never entirely one’s own. With the selfie, Jessica purchased privacy. All of the non-selfie moments were once again hers. When she did self-document, Jones selected the timing of this documentation and configured her body and face in a manner of her liking. She could then review the images and select which to send. In a word, the selfie enabled Jessica to document with intention. We see this intentionality manifest in Jones’ masterfully accomplished Fuck You smile.
Yet, despite its Fuck You quality, Jones still smiles, as per Kilgrave’s request. She still sends him pictures, at the time he instructs (10am). She still operates, ultimately, under Kilgrave’s gaze. Jones’ life, and the lives of those she cares about, depend on her compliance. The selfie buys Jones freedom, but within heavy confines.
It is not until the final episode of Season 1 that Jessica untangles herself from Kilgrave entirely. This disentanglement comes when she kills him. And killing Kilgrave is perhaps the perfect metaphor. The selfie is empowering, given a persistently oppressive arrangement. The selfie as a cultural artifact is both a product of and response to gender relations. As long as women are objectified, turning the camera on the self is a means of intentionality. It takes the power from the other and places it within the self. Artist and subject become one. But only by killing patriarchy—and the sexual confines, normative beauty standards, and persistent microaggressions patriarchy entails—are women and girls truly free. Within patriarchy, the selfie will always carry the weight of feminism upon its shoulders.
This is the year of #BlackLivesMatter. In response, it is also becoming the year of White Supremacy. It’s not that Black Lives didn’t matter before, nor that Whiteness didn’t reign supreme. Rather, dramatic and highly publicized incidents of violence against Black citizens by those charged with protecting them have created a cultural dynamic in which the value of Black Lives and the responding assertion of White Supremacy have reached a point of articulation.
When you clean house, the roaches emerge. As a nation, we are cleaning house, finding and scrubbing out the glaring and hidden spots of racism, many of which have seeped deep into the layers of our social fabric. A White Supremacist presence is therefore unsurprising. The Supremacists wriggle out in defense of their comfortable home that the elbow grease of mobilization threatens to upend. They are gross but expected. However, their pervasiveness and seeming capacity to garner sympathy are less expected.
The affordances and dynamics of social media tell an important part of the story…
Counterclaims about both Black Lives and White Supremacy are facilitated by social media platforms that afford the formation of issue-driven groups with the capacity to commiserate, strategize, and spread a unified message. Such was the process by which the successful movement at Mizzou, in which Black students and allies mobilized to achieve administrative resignations along with policy and curricular changes, translated with near immediacy into similar movements across college campuses in the U.S. #StandwithMizzou became a rallying cry for students who wished to effect real change in their own schools’ racial climate.
These very same processes are also those that currently facilitate the fast formation of White Student Unions, reactionary groups created by and for “White students and allies” who fear the loss of White voices and the decimation of White culture. Although administrations are quick to denounce the groups as unassociated with and unsanctioned by the universities to which they are connected, the groups are nonetheless collectivities of people, most likely students, who gather to assert White Power. The group that popped up at my university is illustrative.
Rather than identifying as White Supremacists, the profiles operate under the gauze-thin cover of European identity. These are White European students, emboldened to speak their (racist and ethnocentric) truth.
The question, then, is whence such boldness arises. Given the widespread demonization of White identity groups in the U.S.—especially the KKK—how do a bunch of 20-year-olds come to think it’s viable and acceptable to form a group around White heritage? And while we’re asking, how do four adult men, in 2015, identify as White Supremacists, scream racial slurs, and shoot into a crowd of protestors? This is the part of the story that social media don’t fully capture.
Racist collectives, with their communities, identities, and calls for action, form and flourish with the symbolic aid of highly visible, highly powerful, and highly influential figures, given voice through America’s political institutions.
White Supremacist messaging, though animated by grassroots social media groups, is undergirded by the rhetoric of those in the highest positions of power. For instance, FBI Director James Comey blamed #BlackLivesMatter protestors for creating a hostile environment in which police officers are disinclined to intervene, lest their actions be filmed and critiqued. Or the list of governors who (are trying to) refuse Syrian refugees entry into their states. And of course, Donald Trump, who not only represents Whites who are concerned with the slippage of their power but, like Comey and the governors, legitimates the White Supremacy position.
Comey and the anti-refugee governors cloak their racism in security concerns. This breeds hate and exclusion, but can ostensibly diminish with time and data, or at least migrate to new groups as new moral panics emerge. Trump’s hate is stickier. It plays more to the long game. It’s based on a feeling, an intuition of discomfort, and its expression is attractively packaged as authenticity.
Authenticity is an ingrained American value, all the more valuable among politicians, from whom we are accustomed to, but sick of, precisely calculated performances. Trump is the candidate who purportedly “tells it like it is,” who isn’t enchained by political correctness. He is the folk hero who promises to “make the country great again!”
With the valor of authenticity, Trump says things like “The wall will go up and Mexico will start behaving.” He insists that Muslim Americans in New Jersey were celebrating after 9/11—a claim he vehemently defended while mocking a reporter with a congenital joint condition. He also tweeted statistics about race and murder that are entirely fabricated, imply that Blacks are disproportionately violent, and rely on “thug” iconography. And he continues to do well in the polls.
By wrapping their hate in safety and “truthfulness,” these leaders gift racists their righteousness. Trump’s inflammatory rhetoric does more than collect support; it also provides a moral position and an accompanying narrative on which to carry messages that would otherwise be unpalatable. Messages like those from White Student Union organizers, or anti-refugee protestors, or Men’s Rights Activists, or assholes on Yik Yak.
Openly racist leaders make it acceptable for everyday citizens to hold racist views and viable for everyday citizens to engage in racist acts. When the FBI director worries about police safety and effectiveness and a presidential candidate shouts about Mexicans, racism becomes an expression of morality, one of safety and truth, an expression that spreads and takes hold on the digital platforms of everyday life.
Thanksgiving brings with it the compulsory advice and opinion pieces about how to manage the uncomfortable conversations, audacious behavior, and embarrassing reminiscences that so often come with large family gatherings. However, these columns leave out a highly effective and likely widespread coping mechanism: the snarky text.
The snarky text is a surreptitiously crafted message in which the sender factually reports on the goings-on around them and accompanies the report with pithy commentary in written or pictorial form. Gifs are particularly effective. Snarky texts work best when the recipient is adequately primed about the dynamics in which the sender is ensconced. Recipients may be remote, or for an added layer of complexity, may be in the same location as the sender.
For example, when your grandmother’s “friend” makes erotically suggestive statements while deciding which cut of the turkey he wants, you can send your favorite cousin, sitting across the table, this:
Or when your family coalesces to comment on your weight and food choices, you can send this to your best friend across the country:
“Oh good. I hoped everyone would spend 45 minutes noticing my vegetarianism and its ill effects upon my body.”
The snarky text is a unique communicative tool. It not only provides an outlet for in-the-moment frustrations, thus neutralizing the experience of your great aunt reminding you of your withering reproductive system, but also turns bad experiences into good ones. Those uncomfortable or painful moments become fodder for your account. The worse the better, really. When a relative proclaims we must “make our country great again!!” the moment becomes a joy in its potential for documentation. Whom should I tell, and in what expressive form!? you ask yourself.
Of course, families are not always terrible. For the record, my family is pretty unterrible. But in general, people are annoying and life can be trying. The snarky text puts a giggle into those difficult moments.
Today is a big day in Columbia, Missouri, where University of Missouri system president Tim Wolfe resigned amidst protests over his longstanding failure to address racial issues. Led by #ConcernedStudent1950, named for the first year Black students were accepted into the university, campus protests have been ongoing for several months. Early last week, graduate student Jonathan Butler went on a hunger strike, followed by football players boycotting their athletic labor, catapulting the story into public discourse. Things came to a head this morning with Wolfe’s announcement. Of course, I went immediately to the Columbia, Missouri Yik Yak, where I refreshed compulsively.
Major themes are represented below. They include claims to reverse racism, colorblind inspired claims that racial protests create racial divides, allusions to South Park (a LOT of them, and I don’t think nostalgically…so hey there, 2001), attempts to discredit protestors as bullies or babies, attempts to discredit protests as unjust disruptions of the University’s academic function, and for good measure, panda facts and other tidbits from people for whom this is just another day.
In contrast, Twitter has been largely (though not exclusively) celebratory, containing messages of solidarity and momentum building:
We also see people on Twitter citing Yik Yak as proof of the racial problem:
The difference in feel and content between Twitter and Yik Yak likely hinges on platform affordances and, relatedly, the user base. The two platforms differ in geographic scope, level of anonymity, and algorithms of visibility. Twitter is international in scope, while Yik Yak is locally tied. Twitter users are often identifiable, whereas Yik Yak users are not. Twitter provides both time- and popularity-based feeds, as does Yik Yak. However, a score of -5 will delete a Yak, whereas there is no way to downvote a tweet and one needs to lodge a formal complaint to have a tweet removed. Twitter therefore represents a broader swath of the population, held accountable for their content. In contrast, Yik Yak represents the local community, who can speak without identification and the repercussions that identification entails. Where racism exists, Yik Yak affords its expression. This is what we see in Columbia, where a field of highly racist content and the affordance of up and down voting make it unlikely that protest supporters on Yik Yak will rise to “Hot” status, and exceedingly likely that their voices will dissipate from the feed altogether.
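The voting mechanics described above can be made concrete with a toy sketch. The only rule taken from the text is the -5 deletion threshold; the post names, vote tallies, and function are invented for illustration, not a description of Yik Yak’s actual implementation:

```python
# Toy model of Yik Yak-style visibility: posts accumulate up/down votes,
# and any post whose score falls to -5 is removed from the feed entirely.

DELETE_THRESHOLD = -5  # per the text, a score of -5 deletes a Yak

def apply_votes(feed, votes):
    """Apply (post_id, +1/-1) votes in order, dropping posts that hit -5."""
    scores = {post_id: 0 for post_id in feed}
    for post_id, vote in votes:
        if post_id not in scores:
            continue  # post was already deleted; later votes are moot
        scores[post_id] += vote
        if scores[post_id] <= DELETE_THRESHOLD:
            del scores[post_id]  # the Yak disappears from the feed
    return scores

# Hypothetical local feed: a protest-support post in a hostile field.
feed = ["protest_support", "panda_fact"]
votes = [("protest_support", -1)] * 5 + [("panda_fact", +1)] * 3
remaining = apply_votes(feed, votes)
# Only the innocuous post survives; the minority voice is gone.
```

The point of the sketch is that five hostile readers suffice to erase a post for everyone, which is why a local majority can make dissenting voices "dissipate from the feed altogether" in a way that Twitter’s complaint-based removal cannot.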
I moved to rural Kansas over a year ago. I live beyond Lawrence city limits, on the outskirts of Stull (where local legend places one of the gateways to hell), and a 50-minute drive from the nearest Google Fiber connection. It’s a liminal space in terms of broadband connection – the fastest network in the country is being built in the neighboring metropolitan area, but when I talked to my neighbors about internet service providers in our area, they were confused by my quest for speeds higher than 1 Mbps. As this collection of essays on “small town internet” suggests, there’s an awareness that internet in rural, small town, and “remote” places exists, but we need to understand more about how digital connection is incorporated (or not) into small town and rural life: how it’s used, and what it feels like to use it.
One of my ongoing projects involves researching digital divides and digital inclusion efforts in Kansas City. The arrival of Google Fiber in Kansas City, KS and Kansas City, MO has provided increased momentum and renewed impetus for recognition of digital divides based on cost, access, education and computer literacy, relevance, mobility, and more discussion and visibility for organizations and activists hoping to alleviate some of these divides and emphasize internet access as a utility. I’ve argued that by reading digital media in relationship to experiences of “place,” we gain a more holistic and nuanced understanding of digital media use and non-use, processes and decisions around implementation and adoption, and our relationships to digital artifacts and infrastructures. In other words, one’s location and sense of place become important factors in shaping practices, decisions, and experiences of digital infrastructure and digital media.
The irony is not lost on me that while studying digital divides in a metropolitan area, I had chosen to live in a location with its own, unique series of inequities in terms of internet connection. These inequities have nothing to do with socio-economic instability or lack of digital literacy, as I had funds and willingness to pay a significant amount for internet service (comparable to the prices charged by urban-based, corporate ISPs), and everything to do with the fact that I lived in an area that felt as if it had been forgotten or intentionally bypassed by the internet service providers (ISPs) I had come to know living in other US cities and towns.
In this essay, I want to recount a few of the ways that my relationship to internet infrastructure and ISPs has changed since moving out to the country. (My relationship to social media and my social and economic dependence on internet connection have shifted as well, which I plan to write about elsewhere.) I’m speaking to my experience of digital connection and digital practices “after access,” from within a certain type of digital connectivity. I don’t claim that these interpretations or experiences are generalizable or representative, but they’re some of my initial observations, having been a ubiquitously connected, digitally literate urban dweller for the majority of my life and now a year and a half into living in a rural place.
After moving in, I realized that although our house was advertised as having “high speed internet,” this didn’t mean a wired, cable broadband connection or even DSL, as we weren’t in either of these coverage areas. An internet connection meant that we could connect via three strictly data-capped options: satellite, 4G provided by a cell phone company, or a pay-as-you-go 4G connection. Various blogs and forums hosting threads on ISP options overflowed with warnings about the high prices, data caps, and unreliability of satellite internet connections in rural environments and elsewhere.
I posted on social media outlets and contacted friends about my frustrations with my internet access options. I received suggestions to contact the cable company and ask them to expand their service to our area, offers to come to friends’ houses to use the internet, and empathy from people who grew up in rural areas, who sent condolences for the fact that I would never binge-watch anything again. It might sound frivolous to some, but I admit that the thought of never being able to stream anything, Skype, or share photos with friends and family, and of struggling to download large files, did make me panic. I’d rather not fall victim to varieties of information, participation, and culture gaps, and I regularly need to stream, upload, and download large files in order to do my job.
The local cable monopoly first offered us service over an old Motorola Canopy network at a maximum of 1 Mbps upload and download speeds. I had never consciously thought about the sheer number of emails I received that included or requested attachments until I was unable to send one consistently from my home computer. Before the end of the first week, the sound of an email arriving in my inbox while I was at home made me anxious. It meant that I would have to wait until the next time I was in town to respond with a comment other than, “I can’t send the attachment until tomorrow, I have limited access to the internet right now,” a euphemism which frustrated me and, I thought, made me sound like a slacker. I cancelled the service after the two-week trial.
Now, I love my internet service provider, which is something I never thought I’d say. I have feelings of gratitude for them. They’re a local company that, according to their mission statement, saw “a lack of adequate Internet service options available to rural Northeast Kansas communities” and decided to build their own point-to-multipoint, line-of-sight network to serve our area. In 2008, they acquired another local ISP owned and operated by an area high school, and later migrated their network from Motorola Canopy to 4G. They retrofitted the Canopy network antenna that the previous owner of our house had left, installed a six-foot pole antenna on the roof of our house, and located a direct line of sight to one of their towers. We now average around 5 Mbps upload and download speeds. Although we experience noticeable lag compared to our workplace connections, and Skype, VoIP, and streaming often crash due to poor connection, we have a generally reliable connection with no data caps at less than half the cost of any service provider in town.
This type of internet connectivity looks and feels different as well. The equipment that powers my connection demands more conscious and haptic attention. The pole and antenna mounted to my roof are taller than the rest of the house and are the first things you see from the driveway. I can see part of the tower that powers my internet, as well as two others that use the canopy network, across the prairie. I have to tend to my equipment. I often have to touch the antenna and pole to adjust them after being blown by strong winds and I’m regularly unplugging and pushing buttons to reset the router. The “seamfulness” of the experience makes me think about the “wires” and wireless frequencies, how they work or don’t work and why, in a way I never did while living in cities. For me, the infrastructure is very tangible and visible, which makes me think of myself as a digitally connected person more than ever before. I feel more connected to my connection, and more responsible for making it work.
I’ve wondered about the potential for mesh networks in my rural area. Mesh networks are decentralized, redundant, often inexpensive networks powered by antennae that act as both access points and routers, repeating wireless signals in a mesh-like configuration. In conversations with digital inclusion activists and community network organizations in urban areas, mesh networks are often suggested or already serve as a powerful alternative to more traditional ISPs and the networks they provide. However, the technical problem of distance persists, as houses, barns, silos, garages, and other structures where antennae might be mounted can sit several miles apart. More complicated is the fact that the pre-existing social structures and norms around proximity and sharing are also very different from cities or more densely populated areas. People who live out here tend to live “alone together.” I live closer to and encounter my neighbors’ cows, dogs, goats, and chickens more than the people who own them, and minimal (albeit friendly) interaction between people is the norm. There’s not much we share in terms of services and utilities: we pay for utilities individually, often from different service providers. The area is purely residential for miles, and the commercial and family farms and orchards don’t have direct sales on premises. In many ways each household feels like a self-sustaining unit with its individual tanks of propane, tornado shelters, livestock, and food crops. I often wonder how introducing an infrastructure built on shared internet connection would mesh with these pre-existing social networks. But at the same time, I wish someone would propose a network like that out here, or finally send up those balloons.
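The distance problem can be made concrete with a toy Python sketch. It rests on a hypothetical assumption (the fixed relay range, the coordinates, and all the names here are mine, not any real network's specs): two antennae can repeat each other's signal only within a set range, so a mesh is usable only if every household can reach every other through intermediate hops.

```python
# Toy model of mesh reachability: households are points on a plane, two
# antennae link if within an assumed radio range, and a breadth-first
# search checks whether the resulting mesh is one connected network.

from collections import deque
from math import dist

RANGE_MILES = 1.5  # assumed point-to-point relay range (illustrative)

def mesh_links(nodes):
    """Link every pair of nodes close enough to relay to each other."""
    links = {name: set() for name in nodes}
    names = list(nodes)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if dist(nodes[a], nodes[b]) <= RANGE_MILES:
                links[a].add(b)
                links[b].add(a)
    return links

def is_connected(nodes):
    """Breadth-first search: can one household's signal reach them all?"""
    links = mesh_links(nodes)
    start = next(iter(nodes))
    seen, queue = {start}, deque([start])
    while queue:
        for neighbor in links[queue.popleft()] - seen:
            seen.add(neighbor)
            queue.append(neighbor)
    return seen == set(nodes)

# Densely spaced urban block: every hop is short, the mesh holds together.
urban = {"a": (0, 0), "b": (0.5, 0), "c": (1.0, 0.3), "d": (1.4, 0.8)}
# Rural households miles apart: the relay chain breaks.
rural = {"a": (0, 0), "b": (1.2, 0), "c": (4.0, 0), "d": (7.5, 0)}
```

Running `is_connected` on the urban layout succeeds while the rural layout fails, which is the whole point: the topology that makes mesh networking cheap in a city block is exactly what miles of prairie take away.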
Germaine R. Halegoua is an Assistant Professor at the University of Kansas interested in the ways we experience place through digital media and vice versa. She tweets occasionally @grhalegoua and also contributes to Antenna and Social Media Collective.
I am an invisible man. No, I am not a spook like those who haunted Edgar Allan Poe; nor am I one of your Hollywood-movie ectoplasms. I am a man of substance, of flesh and bone, fiber and liquids — and I might even be said to possess a mind. I am invisible, understand, simply because people refuse to see me. Like the bodiless heads you see sometimes in circus sideshows, it is as though I have been surrounded by mirrors of hard, distorting glass. When they approach me they see only my surroundings, themselves, or figments of their imagination — indeed, everything and anything except me…It is sometimes advantageous to be unseen, although it is most often rather wearing on the nerves. Then too, you’re constantly being bumped against by those of poor vision…It’s when you feel like this that, out of resentment, you begin to bump people back. And, let me confess, you feel that way most of the time. You ache with the need to convince yourself that you do exist in the real world, that you’re a part of all the sound and anguish, and you strike out with your fists, you curse and you swear to make them recognize you. And, alas, it’s seldom successful… ~Ralph Ellison (1952), Invisible Man
In what follows, I argue that the Black Lives Matter movement is a hacker group, glitching the social program in ways that disrupt white supremacy with glimpses of race consciousness. It is a group that combats black Americans’ invisibility; that “bumps back” until finally, they are recognized. As Ellison continues:
Invisibility, let me explain, gives one a slightly different sense of time, you’re never quite on the beat. Sometimes you’re ahead and sometimes behind. Instead of the swift and imperceptible flowing of time, you are aware of its nodes, those points where time stands still or from which it leaps ahead. And you slip into the breaks and look around.
The Black Lives Matter movement brings us, forcefully, into the “breaks,” and invites us to look around, too.
To hack is to find and exploit the weaknesses in a system. Once found, hackers can gain access to what’s inside, and, if desired, change the programming. The Black Lives Matter movement is working to accomplish the latter. They expose racism among America’s most established institutions, and then disrupt the fabric of everyday life to bring these weaknesses to the attention of the masses. This disruption or “glitch” that activists—especially activists of color— present is in many cases, simply themselves. They are black bodies taking up space; black bodies making demands; black bodies resisting invisibility.
Earlier this year, Black Lives Matter activists took over Baltimore: sitting peacefully, marching the streets, and, at times, breaking windows and setting things on fire in protest of the deadly police brutality inflicted upon Freddie Gray. Police deployed tanks. Officials closed schools. Businesses were unable to operate. Glitch: Look at us.
In Ferguson earlier this week, Black Lives Matter activists blocked the entrance to the St. Louis Federal Courthouse and traffic on a major highway in protest and remembrance of Michael Brown, the unarmed black teen killed by a white police officer one year ago. The city declared a state of emergency and arrested close to 60 protesters, including high profile activists like philosopher Cornel West. Glitch: We are still here.
In Seattle last week, two black women activists stormed the stage at the Social Security Works rally in Westlake Park, prohibiting white presidential candidate Bernie Sanders from speaking. Lamenting Sanders’ failure to address contemporary racial issues, the women were booed by the crowd but refused to give the microphone back. They invited Sanders to respond to their criticisms. He declined. Following the event, Black Lives Matter Seattle released a press statement in which they proclaim: “…we honor Black lives lost by doing the unthinkable, the unapologetic, and the unrespectable.”
The choice of Sanders as a target is of particular relevance. Sanders is a self-described ally with a strong record of civil rights activism. In fact, just hours after his failed attempt to speak at Westlake, Sanders addressed a crowd of 15,000 at the University of Washington, calling for an end to institutional racism and reform of the criminal justice system. In contrast, Donald Trump claims there will be no “black presidents for a while” following what he considers a botched job by Barack Obama, and Ben Carson believes we needn’t think of race because he knows deep down that brains, not skin, make us who we are.
Bernie isn’t perfect, but he’s far better than the rest. And that’s just it. His work, his almost anti-racist position, his good intentions and barely missed marks make him the lowest common denominator within the existing political system. This is a system that puts black lives alongside a suite of issues—environment, economy, tax policy, military funding. This is a system that hides race issues amongst the crowded tabs of candidates’ official web pages. The Black Lives Matter movement rejects this model. Instead, it insists that in this moment, Black Lives take center stage. Anywhere but the center is unacceptable. No more hiding in plain sight. Glitch: We are taking over the platform.
Because of this insistence upon centrality, Black Lives Matter refuses to be Anonymous. They do not disrupt the system quietly. The hack is their presence. The hack is their voices. The hack is their faces. It’s not about discourse or even policy, but an insistence upon visibility; a refusal to remain unseen.
Like any good systems maintenance crew, however, the U.S. social system has workers diligently laboring to quiet the glitches, to restore the program, to punish the hackers and reinstate their invisibility. In Ferguson last year, these workers made up the grand jury that chose not to indict police officer Darren Wilson, the man who killed Michael Brown. This week, the workers are the “Oath Keepers,” made up of five white men with weapons, patrolling the streets of Ferguson to maintain “order” and “peace.” In the media, these are the news stations that label protestors “rioters” and highlight the destruction of property while marginalizing the historical and systemic destruction of black lives. It is Bernie Sanders, who pouts at his lost stage time rather than stepping aside to graciously acknowledge that this moment is not for him.
But the Black Lives Matter hack is powerful in its persistence. The system has been weakened by cameras on cops, fires in the streets, citizens demanding answers, and feet stomping on the ground, day after day, month after month. And because of this persistence, it is a hack that the system can only fight for so long. Each protest-induced glimpse makes invisibility more difficult to restore. At some point, we will have all seen too much, even those who try to close their eyes. This war of glitches creates a tumultuous moment, but provides the code with which to write an alternative future.
Authenticity is a tricky animal, and social media complicate the matter. Authenticity is that which seems natural, uncalculated, indifferent to external forces or popular opinion. This sits in tension with the performativity of everyday life, in which people follow social scripts and social decorum, strive to be likeable—or at least interesting—and constantly negotiate the expectations of ever expanding networks. The problem of performance is therefore to pull it off as though unperformed. The nature of social media, with their built-in pauses and editing tools, throws the semblance of authenticity into a harsh light. Hence, the widespread social panics about a society whose inhabitants are disconnected from each other and disconnected from their “true selves.”
For political campaigns, the problem of authenticity is especially sharp. Politicians are brands, but brands that have to make themselves relatable on a very human level. This involves intense engagement with all forms of available media, from phone calls, to newspaper ads and editorials, to talk show appearances, television interviews and now, a social media presence. The addition of social media, along with the larger culture of personalization it has helped usher in, means that political performances must include regular backstage access. Media consumers expect politicians to be celebrities, expect celebrities to be reality stars, and expect reality stars to approximate ordinary people, but with an extra dab of panache. The authentic politician, then, must be untouchable and accessible, exquisite and mundane, polished yet unrehearsed. Over the last couple of elections, social media have been the primary platform for political authenticity. Candidates give voters access to themselves as humans—not just candidates—but work to do so in a way that makes them optimally electable. It’s a lot of work to be so meticulously authentic.
This is why political authenticity requires robust PR teams. Political campaigns are hyperperformative, making the politician’s image of authenticity spectacularly calculated. The Clinton campaign includes marketing experts from Coca-Cola and the advertising agency GSD&M; Jeb Bush’s team includes a full media staff, including a communications director, press secretary, and head of media relations; Bernie Sanders, whose brand is arguably the “un-brand,” has the firm Revolution Messaging behind his social media image. Interestingly (but not surprisingly), I had to dig for this information in ways I didn’t have to with other candidates. When your brand is the un-brand, your team quickly deletes information about PR on your Wikipedia page. And Donald Trump, in an ironic and oddly brilliant move, maintains authenticity by owning his brand status. Perhaps this strategy was suggested by his ever-present media handler, Hope Hicks.
Importantly, political hyperauthenticity relies upon a compliant audience. The performativity of political campaigns is an open secret, balancing between cynical recognition and practical denial. We know the performance for what it is, but allow the performance to go off as though spontaneous. Indeed, we insist upon it. This is what sociologist Erving Goffman calls “tact” and it’s a practice that, for good reason, pervades everyday life. Tact facilitates smooth interaction and helps us fumbling social actors avoid embarrassment. It’s how we manage to carry on, even after someone trips, farts, misspeaks, or leaves spinach in their teeth. The maintenance of political theater requires audiences to dig deep into their reservoirs of civil inattention. Political theater requires hypertact.
The masterful craft of political campaigns is common knowledge. We all realize that Bernie Sanders entered last week’s debate with “sick and tired of hearing about your damn emails!” ready-made in his quote bag. Likewise, we didn’t expect Rand Paul’s 24-hour livestream to contain illegal, immoral, or even revealing content. We expected it to contain coffees, bad jokes, small talk, and mussed hair. We expected it to be boring with spurts of amusement, and the Paul team delivered. The most dramatic moment was Paul’s reference to the “dumbass livestream,” which quickly became a t-shirt slogan and press release from his team. “Look at how authentic Rand Paul is!! He said an unpolished thing!!”
However, despite extensive teams and apparent audience complicity, candidates’ social media use sometimes allows nuggets of ill-crafted content to seep through, and well-crafted resistance to break in. People didn’t find it cute when Clinton asked them to tweet emojis about student debt. People did find it funny, however, to watch Ted Cruz botch his first attempted response to Obama’s State of the Union address. And the people running Carlyfiorina.org took advantage of an unregistered domain name to highlight the number of workers laid off at HP under Fiorina’s leadership. Social media are therefore a tool of the powerful, bolstered by citizens’ tact, but also a tool that contains unique vulnerabilities. The opportunity to screw up is ever available, and when it happens, it doesn’t go away. It can loop on Vine, spread through retweets, and become a meme. In candidates’ digitally mediated quest for hyperauthenticity, social media can also, despite themselves, occasionally pull back the curtain. And when the curtain pulls back, we pounce, lest the sham of the entirety be revealed.
We live in a cyborg society. Technology has infiltrated the most fundamental aspects of our lives: social organization, the body, even our self-concepts. This blog chronicles our new, augmented reality.