
Almost all of the representations of breasts we encounter in the mass media are filtered through the hypothetical heterosexual male gaze. Breasts are objects, things that people desire. Women’s personal, subjective experiences of having breasts are almost never discussed in pop culture. I mean, yes, occasionally two female characters might talk about their breasts, but usually in reference to whether and how they do or fail to attract male attention (e.g., “Is this too much cleavage?” and “I wish I had more cleavage!”). What it feels like to have breasts outside of the context of being a sex object isn’t talked about. There’s a void, a black hole of experience.

The only other common discourse about breasts that comes to mind centers around breastfeeding. In that discourse, the idea that breasts are for men is challenged, but only in favor of the idea that breasts are for babies. In neither discursive context does anyone make the case that breasts are primarily for the people who have them. That the pleasure (and pain) and comfort (and discomfort) that comes with breasts belongs — first and foremost — to female-bodied people.

Last week, I saw something different. Crazy Ex-Girlfriend is an odd little TV show with a couple musical numbers in each episode and one of the numbers last week was called “Heavy Boobs.” It’s safe for work but… maybe not safe for work.

Rachel Bloom’s song names and describes one subjective experience of breasts. Breasts are “heavy boobs,” she sings, just “sacks of yellow fat” that can weigh on women. In the song, the breast-haver’s experience is centered to the exclusion of what men or babies might want or think or experience. I can’t ever remember seeing that on TV before.

And that’s plenty, but what she and her fellow dancers do with their bodies is even more extraordinary. They defy the rules of sexiness. Their movements are about embodying heavy boobs and that’s it. It’s as if they don’t care one iota about whether a hypothetical heterosexual male will see them. The dance is unapologetically unsexy. No, it’s more than unsexy; it’s asexy. It’s danced neither to repulse nor to attract men; instead, it’s danced as if sexiness is entirely and completely irrelevant. There’s no male gaze because, in that two minutes, there’s not a man in sight.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Jay Livingston is our regular baby name analyst, but I’m gonna give it a go just this once. Over at Baby Name Wizard, Laura Wattenberg published a chart showing that vowels are on the rise. Both girls’ and boys’ names have more vowels in them relative to consonants than they have had at any point in the last 150 or so years, and more vowels than the English language overall.

[Chart from Baby Name Wizard: vowel-to-consonant ratio in girls’ and boys’ names over time.]

Based on the yellow line alone, it’s clear that people think that names with more vowels are more appropriate for girls than boys. So, how to explain the uptick, especially among boys?

For boys, the uptick begins during the revolutions of the 1960s and ’70s. Feminists at that time wanted women to be able to embrace the masculine in themselves, but they wanted men to embrace their feminine sides, too. They got the former but not the latter and, ever since, the personalities of both men and women have measured as more masculine, with women changing more than men.

But then both men’s and women’s names should be becoming more masculine. So, maybe baby names are a special case. I googled around and found a survey (of uninterrogated quality) that found that dads have substantially less influence over a baby’s name than moms do. Accordingly, perhaps baby-naming resists some of the stronger influences toward masculinization that come from men. Maybe mothers, especially in that warm moment of naming their babies, are holding out for the half of the feminist revolution that has thus far proven elusive: the valuing of the feminine in all of us.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Hey, they did a study.

Psychologist Paul Thibodeau and three colleagues decided that it was time to take a closer look at the word “moist,” writing:

The word “moist” … has been the subject of a Facebook page (called “I HATE the word MOIST”) with over 3,000 followers and was rated as the least liked word in the English language by a Mississippi State Poll … ; feature articles have been written in Slate Magazine … and The New Yorker … ; and popular TV shows like “How I Met Your Mother” (“Stuff”) and “The New Girl” (“Birthday”) have devoted entire plot-lines to the comic consequences of word aversion.

Now it’s not just anecdotal. Thibodeau found that between 13 and 21% of people have an aversion to the word.

But why?


Is it just a gross-sounding word? If so, then people who hate moist should also hate foist and rejoiced. Verdict: No. Hating the sound of “moist” is independent of one’s appreciation for words that rhyme with it.

Is it because it makes people think of sex? Verdict: Yes! Priming people to think of sex rather than, say, cake makes them dislike the word more. Bonus: people who scored higher on a measure of disgust for bodily functions were more likely than those who scored lower to claim an aversion to the word.

So, if you don’t like the word moist, get your mind out of the gutter. And, if your aversion is severely hampering your life, just think about cake!

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

This November, a wave of student activism drew attention to the problem of racism at colleges and universities in the US. Sparked by protests at the University of Missouri, nicknamed Mizzou, actions spread to dozens of colleges. It was a spectacular show of strength and solidarity, and activists won many concessions, including new funding, resignations, and promises to rename buildings.

Activists’ grievances are structural — aimed at how colleges are organized and who is in charge, what colleges teach and who does the teaching, and what values are centered and where they come from — but they are also interpersonal. Student activists of color talked about being subjected to overtly racist behavior and being on the receiving end of microaggressions, seemingly innocuous comments that remind them that they do not, as a Claremont McKenna dean so poorly put it, “fit the mold.” That dean lost her job after that comment. Many student activists seem to embrace the policing of offensive speech, both the hateful and the ignorant kind.

Negative reactions to this activism were immediate and widespread. Much of the backlash served only to affirm the students’ claims: that we are still a racist society and that we, at best, tolerate our young people of color only if they stay “in their place.” Other times, the reaction was confusion about the kind of world these young people seemed to want to live in. Why, some people asked, would anyone — especially a member of a marginalized population — want to shut down free speech?

Well, it may be that the American love of free speech is waning. The Pew Research Center released data measuring attitudes about censorship. They asked Americans whether they thought the government should be able to prevent people from saying things that are “offensive to minorities.” Millennials — that is, today’s college students — are significantly more likely than any other generation to say that it should.

In fact, the data show a steady decrease in the proportion of Americans who are eager to defend speech that is offensive to minorities. Only 12% of the Silent generation is in favor of censorship, compared to 24% of the Baby Boomers, 27% of Gen X, and 40% of Millennials. Notably, women, Democrats, and non-whites are all more likely than their counterparts to be willing to tolerate government control of speech.

[Chart from the Pew Research Center: percent in each generation saying the government should be able to prevent speech that is offensive to minorities.]

Americans still stand out among their international peers. Among European Union countries, 49% of citizens are in favor of censorship, compared to 28% of Americans. If the Millennials have anything to say about it, though, that might be changing. Assuming this is a cohort effect and not an age effect (that is, assuming they won’t change their minds as they age), and given the demographic changes this country will see in the next few decades, we may very soon look more like Europe on this issue than we do now.

Re-posted at Pacific Standard.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

The police do not shoot people. Not anymore. Apparently, the word “shoot” has been deleted from the cop-speak dictionary.

A recently released video shows a Chicago cop doing what most people would describe as shooting a kid. Sixteen times. That’s not the way the Chicago Police Department puts it.

Chicago Tribune: A “preliminary statement” from the police News Affairs division, sent to the media early the next morning, said that after he had refused orders to drop the knife, McDonald “continued to approach the officers” and that as a result “the officer discharged his weapon, striking the offender.”

In Minneapolis, Black Lives Matter is protesting what they think is the shooting of Jamar Clark by a police officer. How wrong they are. The police did not shoot Clark. Instead, according to the Minnesota Bureau of Criminal Apprehension:

MPR News: At some point during an altercation that ensued between the officers and the individual, an officer discharged his weapon, striking the individual.

The police don’t shoot people. They discharge their weapons striking individuals, usually suspects or offenders. A Google search for “officer discharge weapon striking” returns 3.6 million hits.

Worse, the press often doesn’t even bother to translate but instead prints the insipid bureaucratic language of the police department verbatim.

Fearing for their safety and the safety of the public, they fired their guns, striking the suspect.

(Other sources on these stories do put the press-release prose in quotes. Also, in California, officers who discharge their weapons usually “fear for their safety and the safety of the public.” I would guess that the phrase is part of some statute about police discharging their weapons.)

Here’s another example, from the Wilkes-Barre area:

[Screenshot of a local news story and its headline.]

The writer nailed the lede: a police officer shot a suspect. But whoever wrote the headline had majored in Technical Language and Obfuscation rather than Journalism.

Does the language make a difference? I don’t know. Suppose the headlines two weeks ago had said, “In Paris, some people discharged their weapons, striking individuals.”

Originally posted at Montclair SocioBlog; re-posted at Pacific Standard.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

The practice of pairing the word “men” (which refers to adults) with “girls” (which does not) reinforces a gender hierarchy by mapping it onto age.  Jason S. discovered an example of this tendency at Halloween Adventure (East Village, NYC) and snapped a picture to send in:

[Photo from Halloween Adventure: costume signage pairing “men” with “girls.”]

Sara P. found another example, this time from iparty.  The flyer puts a girl and a boy side-by-side in police officer costumes.  The boy’s is labeled “policeman” and the girl’s is labeled “police girl.”

[iparty flyer: matching police costumes labeled “policeman” for the boy and “police girl” for the girl.]

This type of language often goes unnoticed, but it sends a ubiquitous gender message about how seriously we should take men and women.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Daniel Drezner once wrote about how international relations scholars would react to a zombie epidemic. Aside from the sheer fun of talking about something as silly as zombies, it had much the same illuminating satiric purpose as “how many X does it take to screw in a lightbulb” jokes. If you have even a cursory familiarity with the field, it is well worth reading.

Here’s my humble attempt to do the same for several schools within sociology.

Public Opinion. Consider the statement that “Zombies are a growing problem in society.” Would you:

  1. Strongly disagree
  2. Somewhat disagree
  3. Neither agree nor disagree
  4. Somewhat agree
  5. Strongly agree
  6. Um, how do I know you’re really with NORC and not just here to eat my brain?

Criminology. In some areas (e.g., Pittsburgh, Raccoon City), zombification is now more common than attending college or serving in the military and must be understood as a modal life course event. Furthermore, as seen in audit studies, employers are unwilling to hire zombies, and so the mark of zombification has persistent and reverberating effects throughout undeath (at least until complete decomposition and putrefaction). However, race trumps humanity, as most employers prefer to hire a white zombie over a black human.

Cultural toolkit. Being mindless, zombies have no cultural toolkit. Rather, the great interest is in understanding how the cultural toolkits of the living develop and are invoked during unsettled times of uncertainty, such as an onslaught of walking corpses. The human being besieged by zombies is not constrained by culture but draws upon it. Actors can draw upon such culturally informed tools as boarding up the windows of a farmhouse, shotgunning the undead, or simply falling into panicked blubbering.

Categorization. There’s a kind of categorical legitimacy problem with zombies. Initially, zombies were supernaturally animated dead: they were sluggish but relentless, and they sought to eat human brains. In contrast, more recent zombies tend to be infected with a virus that leaves them still living in a biological sense but alters their behavior so as to be savage, oblivious to pain, and nimble. Furthermore, even supernatural zombies are not a homogeneous set but encompass varying degrees of decomposition. Thus the first issue with zombies is defining what a zombie is and whether it is commensurable with similar categories (like an inferius in Harry Potter). This categorical uncertainty has real effects: insurance underwriters systematically undervalue life insurance policies against monsters that are ambiguous to categorize (zombies) as compared to those that fall into a clearly delineated category (vampires).

Neo-institutionalism. Saving humanity from the hordes of the undead is a broad goal that is easily decoupled from the means used to achieve it. Especially given that human survivors need legitimacy in order to command access to scarce resources (e.g., shotgun shells, gasoline), it is more important to use strategies that are perceived as legitimate by trading partners (i.e., other terrified humans you’re trying to recruit into your improvised human survival cooperative) than to develop technically efficient means of dispatching the living dead. Although early on strategies for dealing with the undead (panic, “hole up here until help arrives,” “we have to get out of the city,” developing a vaccine, etc.) are practiced where they are most technically efficient, once a strategy achieves legitimacy it spreads via isomorphism to technically inappropriate contexts.

Population ecology. Improvised human survival cooperatives (IHSC) demonstrate the liability of newness in that many are overwhelmed and devoured immediately after formation. Furthermore, IHSC demonstrate the essentially fixed nature of organizations, as those IHSC that attempt to change core strategy (e.g., from “let’s hole up here until help arrives” to “we have to get out of the city”) show a greatly increased hazard of being overwhelmed and devoured.

Diffusion. Viral zombieism (e.g., Resident Evil, 28 Days Later) tends to start with a single patient zero whereas supernatural zombieism (e.g., Night of the Living Dead, the “Thriller” video) tends to start with all recently deceased bodies rising from the grave. By seeing whether the diffusion curve for zombieism more closely approximates a Bass mixed-influence model or a classic s-curve we can estimate whether zombieism is supernatural or viral, and therefore whether policy-makers should direct grants towards biomedical labs to develop a zombie vaccine or the Catholic Church to give priests a crash course in the neglected art of exorcism. Furthermore, marketers can plug plausible assumptions into the Bass model so as to make projections of the size of the zombie market over time, and thus how quickly to start manufacturing such products as brain-flavored Doritos.
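For readers who want to see what that comparison looks like, here is a minimal Python sketch (mine, not from the original post). The closed-form Bass cumulative-adoption curve and the logistic s-curve are standard; the parameter values (p, q, the seed fraction, and the 30-day horizon) are purely illustrative assumptions.

```python
import numpy as np

def bass_cumulative(t, p, q):
    """Bass mixed-influence model: cumulative fraction zombified by time t,
    with external ("innovation") coefficient p and internal ("imitation")
    coefficient q."""
    return (1 - np.exp(-(p + q) * t)) / (1 + (q / p) * np.exp(-(p + q) * t))

def logistic_cumulative(t, q, f0=0.001):
    """Classic logistic s-curve: pure internal influence, seeded by a tiny
    initial fraction f0 (the lone patient zero of a viral outbreak)."""
    return f0 / (f0 + (1 - f0) * np.exp(-q * t))

t = np.arange(0, 31)                        # days since the outbreak (assumed horizon)
mixed = bass_cumulative(t, p=0.05, q=0.4)   # some external onset (p > 0)
scurve = logistic_cumulative(t, q=0.4)      # pure contagion from a patient zero

# The early period is the diagnostic: the mixed-influence curve jumps
# immediately, while the s-curve creeps along before taking off.
for day in (1, 5, 10, 20, 30):
    print(f"day {day:2d}: mixed={mixed[day]:.3f}  s-curve={scurve[day]:.3f}")
```

Fitting observed adoption data to each functional form and comparing fits is the hypothetical test the paragraph has in mind; everything above is simulated for illustration.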

Social movements. The dominant debate is the extent to which anti-zombie mobilization represents changes in the political opportunity structure brought on by complete societal collapse as compared to an essentially expressive act related to cultural dislocation and contested space. Supporting the latter interpretation is that zombie hunting militias are especially likely to form in counties that have seen recent increases in immigration. (The finding holds even when controlling for such variables as gun registrations, log distance to the nearest army administered “safe zone,” etc.).

Family. Zombieism doesn’t just affect individuals, but families. Having a zombie in the family involves an average of 25 hours of care work per week, including such tasks as going to the butcher to buy pig brains, repairing the boarding that keeps the zombie securely in the basement and away from the rest of the family, and washing a variety of stains out of the zombie’s tattered clothing. Almost all of this care work is performed by women and very little of it is done by paid care workers as no care worker in her right mind is willing to be in a house with a zombie.

Applied micro-economics. We combine two unique datasets, the first being military satellite imagery of zombie mobs and the second records salvaged from the wreckage of Exxon/Mobil headquarters showing which gas stations were due to be refueled just before the start of the zombie epidemic. Since humans can use salvaged gasoline either to set the undead on fire or to power vehicles, chainsaws, etc., we have a source of plausibly exogenous heterogeneity in showing which neighborhoods were more or less hospitable environments for zombies. We show that zombies tended to shuffle towards neighborhoods with low stocks of gasoline. Hence, we find that zombies respond to incentives (just like school teachers, and sumo wrestlers, and crack dealers, and realtors, and hookers, …).
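To make the spoofed research design concrete, here is a small Python sketch (mine, not from the post) using simulated stand-ins for the two fictional datasets. It runs a plain OLS of log zombie counts on pre-outbreak gasoline stocks rather than the instrumental-variables setup the econ-speak gestures at; every variable name and parameter is an assumption for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500  # hypothetical neighborhoods

# Simulated stand-ins for the two datasets described above:
# pre-outbreak gasoline stocks and later zombie counts per neighborhood.
gasoline = rng.gamma(shape=2.0, scale=1.0, size=n)        # salvageable fuel, arbitrary units
zombies = rng.poisson(lam=np.exp(2.0 - 0.5 * gasoline))   # fewer zombies where fuel is plentiful

# "Zombies respond to incentives": regress log zombie density on gasoline stocks.
X = sm.add_constant(gasoline)
fit = sm.OLS(np.log1p(zombies), X).fit()
print(fit.params)  # a negative coefficient on gasoline is the claimed finding
```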

Grounded theory. One cannot fully appreciate zombies by imposing a pre-existing theoretical framework on zombies. Only participant observation can allow one to provide a thick description of the mindless zombie perspective. Unfortunately scientistic institutions tend to be unsupportive of this kind of research. Major research funders reject as “too vague and insufficiently theory-driven” proposals that describe the intention to see what findings emerge from roaming about feasting on the living. Likewise IRB panels raise issues about whether a zombie can give informed consent and whether it is ethical to kill the living and eat their brains.

Ethnomethodology. Zombieism is not so much a state of being as a set of practices and cultural scripts. It is not that one is a zombie but that one does being a zombie such that zombieism is created and enacted through interaction. Even if one is “objectively” a mindless animated corpse, one cannot really be said to be fulfilling one’s cultural role as a zombie unless one shuffles across the landscape in search of brains.

Conversation Analysis. [image]

Cross-posted at Code and Culture.

Gabriel Rossman is a professor of sociology at UCLA. His research addresses culture and mass media, especially pop music radio and Hollywood films, with the aim of understanding diffusion processes. You can follow him at Code and Culture.

Opponents of government aid to the poor often argue that the poor are not really poor. The evidence they are fond of is often an inappropriate comparison, usually with people in other countries: “Thus we can say that by global standards there are no poor people in the US at all: the entire country is at least middle class or better” (Tim Worstall in Forbes).  Sometimes the comparison is with earlier times, as in this quote from Heritage’s Robert Rector: “‘Poor’ Americans today are better housed, better fed, and own more property than did the average US citizen throughout much of the 20th Century.”

I parodied this approach in a post a few years ago by using the ridiculous argument that poor people in the US are not really poor and are in fact “better off than Louis XIV because the Sun King didn’t have indoor plumbing.” I mean, I thought the toilet argument was ridiculous. But sure enough, Richard Rahn of the Cato Institute used it in an article last year in the Washington Times, complete with a 17th-century portrait of the king:

Common Folk Live Better Now than Royalty Did in Earlier Times

Louis XIV lived in constant fear of dying from smallpox and many other diseases that are now cured quickly by antibiotics. His palace at Versailles had 700 rooms but no bathrooms…

Barry Ritholtz at Bloomberg has an ingenious way of showing how meaningless this line of thinking is. He compares today not with centuries past but with centuries to come. Consider our hedge-fund billionaires, with private jets whisking them to their several mansions in different states and countries. Are they well off? Not at all. They are worse off than the poor of 2215.

Think about what the poor will enjoy a few centuries from now that even the 0.01 percent lack today. … “Imagine, they died of cancer and heart disease, had to birth their own babies, and even drove their own cars. How primitive can you get!”

Comparisons with times past or future tell us about progress. They can’t tell us who’s poor today. What makes people rich or poor is what they can buy compared with other people in their own society. To extrapolate a line from Mel Brooks’s Louis XVI, “It’s good to be the king . . . even if flush toilets haven’t been invented yet.”

And you needn’t sweep your gaze to distant centuries to find inappropriate comparisons. When Marty McFly in “Back to the Future” goes from the ’80s to the ’50s, he feels pretty cool, even though the only great advances he has over kids there seem to be skateboards, Stratocasters, and designer underpants. How would he have felt if in 1985 he could have looked forward thirty years to see the Internet, laptops, and smartphones?

People below the poverty line today do not feel well off just because they have indoor plumbing or color TVs or Internet connections. In the same way, our 1% do not feel poor even though they lack consumer goods that people a few decades from now will take for granted.

Originally posted at Montclair SocioBlog. Re-posted at Pacific Standard.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.