
Recently there’s been heightened attention to calling out microaggressions and giving trigger warnings. I speculated that the loudest voices making these demands come from people in categories that have gained in power but are still not dominant, notably women at elite universities. What they’re saying, in part, is: “We don’t have to take this shit anymore.” Or as Bradley Campbell and Jason Manning put it recently in The Chronicle:

…offenses against historically disadvantaged social groups have become more taboo precisely because different groups are now more equal than in the past.

It’s nice to have one’s hunches seconded by scholars who have given the issue much more thought.

Campbell and Manning make the context even broader. The new “plague of hypersensitivity” (as sociologist Todd Gitlin called it) isn’t just about a shift in power; it’s part of a wider cultural transformation from a “culture of dignity” to a “culture of victimhood.” More specifically, the aspect of culture they are talking about is social control: how do you get other people to stop doing things you don’t want them to do – or not do them in the first place?

In a “culture of honor,” you take direct action against the offender.  Where you stand in society – the rights and privileges that others accord you – is all about personal reputation (at least for men). “One must respond aggressively to insults, aggressions, and challenges or lose honor.” The culture of honor arises where the state is weak or is concerned with justice only for some (the elite). So the person whose reputation and honor are at stake must rely on his own devices (devices like duelling pistols).  Or in his pursuit of personal justice, he may enlist the aid of kin or a personalized state-substitute like Don Corleone.

In more evolved societies with a more extensive state, honor gives way to “dignity.”

The prevailing culture in the modern West is one whose moral code is nearly the exact opposite of that of an honor culture. Rather than honor, a status based primarily on public opinion, people are said to have dignity, a kind of inherent worth that cannot be alienated by others. Dignity exists independently of what others think, so a culture of dignity is one in which public reputation is less important. Insults might provoke offense, but they no longer have the same importance as a way of establishing or destroying a reputation for bravery. It is even commendable to have “thick skin” that allows one to shrug off slights and even serious insults, and in a dignity-based society parents might teach children some version of “sticks and stones may break my bones, but words will never hurt me” – an idea that would be alien in a culture of honor.

The new “culture of victimhood” has a different goal – cultural change. Culture is, after all, a set of ideas that is shared, usually so widely shared as to be taken for granted. The microaggression debate is about insult, and one of the crucial cultural ideas at stake is how the insulted person should react. In the culture of honor, he must seek personal retribution. In doing so, of course, he is admitting that the insult did in fact sting. The culture of dignity also focuses on the character of offended people, but here they must pretend that the insult had no personal impact. They must maintain a Jackie-Robinson-like stoicism even in the face of gross insults and hope that others will rise to their defense. For smaller insults, say Campbell and Manning, the dignity culture “would likely counsel either confronting the offender directly to discuss the issue,” which still keeps things at a personal level, “or better yet, ignoring the remarks altogether.”

In the culture of victimhood, the victim’s goal is to make the personal political.  “It’s not just about me…”  Victims and their supporters are moral entrepreneurs. They want to change the norms so that insults and injustices once deemed minor are now seen as deviant. They want to define deviance up.  That, for example, is the primary point of efforts like the Microaggressions Project, which describes microaggressions in exactly these terms, saying that microaggression “reminds us of the ways in which we and people like us continue to be excluded and oppressed” (my emphasis).


So, what we are seeing may be a conflict between two cultures of social control: dignity and victimhood. It’s not clear how it will develop. I would expect that those who enjoy the benefits of the status quo and none of its drawbacks will be most likely to resist the change demanded by a culture of victimhood. It may depend on whether shifts in the distribution of social power continue to give previously more marginalized groups a louder and louder voice.

Cross-posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

I was on jury duty this week, and the greatest challenge for me was the “David Brooks temptation” to use the experience to expound on the differences in generations and the great changes in culture and character that technology and history have brought.

I did my first tour of duty in the 1970s. Back then you were called for two weeks. Even if you served on a jury, after that trial ended, you went back to the main jury room. If you were lucky, you might be released after a week and a half. Now it’s two days.

What struck me most this time was the atmosphere in the main room. Now, nobody talks. You’re in a large room with maybe two hundred people, and it’s quieter than a library. Some are reading newspapers or books, but most are on their laptops, tablets, and phones. In the 1970s, it wasn’t just that there was no wi-fi; there was no air conditioning. Remember “12 Angry Men”? We’re in the same building. Then, you tried to find others to talk to. Now you try to find a seat near an electric outlet to connect your charger.


I started to feel nostalgic for the old system. People nowadays – all in their own narrow, solipsistic worlds, nearly incapable of ordinary face-to-face sociability. And so on.

But the explanation was much simpler. It was the two-day hitch. In the old system, social ties didn’t grow from strangers seeking out others in the main jury room. They grew when you went to a courtroom for voir dire. You were called down in groups of forty. The judge sketched out the case, and the lawyers interviewed the prospective jurors. From their questions, you learned more about the case, and you learned about your fellow jurors – neighborhood, occupation, family, education, hobbies. You heard what crimes they’d been victims of. When the judge called a break for bathroom or lunch or some legal matter, you could find the people you had something in common with. And you could talk with anyone about the case, trying to guess what the trial would bring. If you weren’t selected for the jury, you went back to the main jury room, and you continued the conversations there. You formed a social circle that others could join.

This time, on my first day, there were only two calls for voir dire, the clerk as bingo-master spinning the drum with the name cards and calling out the names one by one. My second day, there were no calls. And that was it. I went home having had no conversations at all with any of my fellow jurors. (A woman seated behind me did say, “Can you watch my laptop for a second?” when she went to the bathroom, but I don’t count that as a conversation.)

I would love to have written 800 words here on how New York character had changed since the 1970s. No more schmoozing. Instead we have iPads and iPhones and MacBooks destroying New York jury room culture – Apple taking over the Big Apple. People unable or afraid to talk to one another because of some subtle shift in our morals and manners. Maybe I’d even go for the full Brooks and add a few paragraphs telling you what’s really important in life.

But it was really a change in the structure. New York expanded the jury pool by eliminating most exemptions. Doctors, lawyers, politicians, judges – they all have to show up. As a result, jury service is two days instead of two weeks, and if you actually are called to a trial, once you are rejected for the jury or after the trial is over, you go home.

The old system was sort of like the pre-all-volunteer army. You get called up, and you’re thrown together with many kinds of people you’d never otherwise meet. It takes a chunk of time out of your life, but you wind up with some good stories to tell. Maybe we’ve lost something. But if we have lost valuable experiences, it’s because of a change in the rules, in the structure of how the institution is run, not because of a change in our culture and character.

Cross-posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

In the aftermath of Dylann Roof’s racist murder, some cities in the South are reconsidering their relationship to the Confederate Flag. Should it fly? Be in a museum? Burn? The discussion raises larger questions of how to move forward from ugly histories without simultaneously whitewashing a city’s past. And, as well, how do we know when something is truly in our past?

I was thinking about just these questions a couple weeks ago when a friend of mine walked me by the monument to the Crescent City White League in New Orleans. The conical stone was erected to commemorate the return of white supremacist government two years after a lethal 1874 insurrection against the Reconstruction state government. In that insurrection, thousands of former Confederate soldiers attacked the city police and state military. They killed 11 members of the NOPD and held city government buildings for three days before federal troops arrived and they fled.

Two years later, the white supremacist politicians were back in power and they placed the monument in a prominent place where Canal St. meets the Mississippi. The monument, to be clear, is in honor of cop-killing white supremacists.

Here it is in 1906 (source, photographer unknown):

So, what to do with the thing?

In 1974 — 100 years after the insurrection and 98 years after its erection — the city added a marker nearby distancing itself from the message of white supremacy. It read:

Although the “battle of Liberty Place” and this monument are important parts of the New Orleans history, the sentiments in favor of white supremacy expressed thereon are contrary to the philosophy and beliefs of present-day New Orleans.

In 1993, some of the original inscriptions were removed and replaced with this slightly more politically correct comment:

In honor of those Americans on both sides who died in the Battle of Liberty Place. … A conflict of the past that should teach us lessons for the future.

It was also moved to a new location. Today it sits between a flood wall, a parking lot, and an electrical substation. If you wanted to give a monument the finger, this is one way to do it. Here’s how it looks on Google Maps streetview:


So, the question is: What to do with these things?

I’ll admit that seeing the monument tucked into an unpleasant corner of New Orleans was somehow satisfying. But I was also uneasy about its displacement. Is this an example of New Orleans trying to repress knowledge of its racist history? (And present?) Or is it a sign that the city actively rejects the values represented by the monument? Conversely, if the city had left the monument at the foot of Canal St. would this be a sign that it took history seriously? And, thus, responsibility for its past? Or a sign that it didn’t take an anti-racist stance seriously enough?

This seems like an obviously difficult call to make, but I’m glad that we’re using the horror of Roof’s massacre to begin a discussion about how to handle symbols like these and, maybe, truly make them a part of our past.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

One of the important conversations that has begun in the wake of Dylann Roof’s racist murder in South Carolina has to do with racism among members of the Millennial generation. We’ve placed a lot of faith in this generation to pull us off our racist path, but Roof’s actions may help remind us that racism will not go away simply with the passing of time.

In fact, data from the General Social Survey — one of the most trusted social science data sets — suggests that Millennials are failing to make dramatic strides toward a non-racist utopia. Scott Clement, at the Washington Post, shows us the data. Attitudes among white millennials (in green below) are statistically identical to those of whites in Generation X (yellow) and hardly different from those of Baby Boomers (orange) on most measures. White millennials are about as likely as whites in Generation X:

  • to think that blacks are lazier or less hardworking than whites
  • to think that blacks have less motivation than whites to do well
  • to oppose living in a neighborhood that is 50% or more black
  • to object if a relative marries a black person

And they’re slightly more likely than white members of Generation X to think that blacks are less intelligent than whites. So much for a Millennial rescue from racism.


All in all, white millennial attitudes are much more similar to those of older whites than they are to those of their peers of color.

***

At PBS, Mychal Denzel Smith argues that we are reaping the colorblindness lessons that we’ve sown. Millennials today may think of themselves as “post-racial,” but they’ve learned none of the skills that would allow them to get there. Smith writes:

Millennials are fluent in colorblindness and diversity, while remaining illiterate in the language of anti-racism.

They know how to claim that they’re not racist, but they don’t know how to recognize when they are and they’re clueless as to how to actually change our society for the better.

So, thanks to the colorblindness discourse, white Millennials are quick to see racism as race-neutral. In one study, for example, 58% of white millennials said they thought that “reverse racism” was as big a problem as racism.

Smith summarizes the problem:

For Millennials, racism is a relic of the past, but what vestiges may still exist are only obstacles if the people affected decide they are. Everyone is equal, they’ve been taught, and therefore everyone has equal opportunity for success. This is the deficiency found in the language of diversity. … Armed with this impotent analysis, Millennials perpetuate false equivalencies, such as affirmative action as a form of discrimination on par with Jim Crow segregation. And they can do so while not believing themselves racist or supportive of racism.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Saturday night, I went to the 7:30 showing of “Me and Earl and the Dying Girl.” The movie had just opened, so I went early. I didn’t want the local teens to grab all the good seats – you know, that thing where maybe four people from the group are in the theater but they’ve put coats, backpacks, and other place markers over two dozen seats for their friends, who eventually come in five minutes after the feature has started.

That didn’t happen. The theater (the AMC on Broadway at 68th St.) was two-thirds empty (one-third full if you’re an optimist), and there were no teenagers. Fox Searchlight, I thought, is going to have to do a lot of searching to find an audience big enough to cover the $6 million it paid for the film at Sundance. The box office for the first weekend was $196,000, which put it behind 19 other movies.

But don’t write off “Me and Earl” as a bad investment. Not yet. According to a story in Variety, Searchlight is betting that “Me and Earl” will be to the summer of 2015 what “Napoleon Dynamite” was to the summer of 2004. Like “Napoleon Dynamite,” “Me and Earl” was a festival hit with no established stars and a debut director (though Gomez-Rejon has done television – several episodes of “Glee” and “American Horror Story”). “Napoleon” grossed only $210,000 its first week, but its popularity kept growing – slowly at first, then more rapidly as word spread – eventually becoming a cult classic. Searchlight is hoping that “Me and Earl” follows a similar path.

The other important similarity between “Napoleon” and “Earl” is that both were released in the same week as a Very Big Movie – “Harry Potter and the Prisoner of Azkaban” in 2004, “Jurassic World” last weekend. That too plays a part in how a film catches on (or doesn’t).

In an earlier post I graphed the growth in cumulative box office receipts for two movies – “My Big Fat Greek Wedding” and “Twilight.” The shapes of the curves illustrated two different models of the diffusion of ideas. In one (“Greek Wedding”), the influence came from within the audience of potential moviegoers, spreading by word of mouth. In the other (“Twilight”), the impetus came from outside – highly publicized news of the film’s release hitting everyone at the same time. I was working from a description of these models in sociologist Gabriel Rossman’s Climbing the Charts.

You can see these patterns again in the box office charts for the two movies from the summer of 2004 – “Harry Potter/Azkaban” and “Napoleon Dynamite.” (I had to use separate Y-axes in order to get David and Goliath on the same chart; data from BoxOfficeMojo.)


“Harry Potter” starts huge, but after the fifth week the increase in total box office tapers off quickly. “Napoleon Dynamite” starts slowly. But in its fifth or sixth week, its box office numbers are still growing, and they continue to increase for another two months before finally dissipating. The convex curve for “Harry Potter” is typical where the forces of influence are “exogenous.” The more S-shaped curve of “Napoleon Dynamite” usually indicates that an idea is spreading within the system.
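To make the two shapes concrete, here’s a minimal simulation sketch in Python (my own illustration, not Rossman’s code; the pool size and rate parameters are invented for the example, not fitted to the actual box office data). Exogenous influence works like a constant-hazard process: each week some fixed fraction of the people who haven’t yet seen the movie go see it, so the cumulative curve is steep at first and then flattens. Endogenous word of mouth makes the weekly chance of going proportional to how many people have already gone, which produces logistic growth and the S-curve.

```python
import numpy as np
import matplotlib.pyplot as plt

N = 10_000                # pool of potential moviegoers (illustrative)
weeks = np.arange(1, 21)  # weeks since release

# Exogenous ("Harry Potter"-style): publicity hits everyone at once,
# so each remaining non-viewer has the same chance p of going each week.
p = 0.35
exogenous = N * (1 - (1 - p) ** weeks)            # concave cumulative curve

# Endogenous ("Napoleon Dynamite"-style): word of mouth, so adoption
# follows logistic growth with rate q and an inflection at week x0.
q, x0 = 0.9, 10
endogenous = N / (1 + np.exp(-q * (weeks - x0)))  # S-shaped cumulative curve

plt.plot(weeks, exogenous, label="exogenous (publicity)")
plt.plot(weeks, endogenous, label="endogenous (word of mouth)")
plt.xlabel("week")
plt.ylabel("cumulative audience")
plt.legend()
plt.show()
```

Run it and the contrast reappears: the exogenous curve rises fastest in week one and levels off; the endogenous curve starts flat, accelerates, then saturates.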

But the Napoleon curve is not purely the work of the internal dynamics of word-of-mouth diffusion. The movie distributor plays an important part through its decisions about how to market the film – especially when and where to release it. The same is true of “Harry Potter.”

The Warner Bros. strategy for “Harry Potter” was to open big – in theaters all over the country. In some places, two or more of the screens at the multiplex would be running the film. After three weeks, the movie began to disappear from theaters, going from 3,855 screens in week #3 to 605 screens in week #9.


“Napoleon Dynamite” opened in only a small number of theaters – six, to be exact. But that number increased steadily until, by week #17, it was showing in more than 1,000 theaters.


It’s hard to separate the exogenous forces of the movie business from the endogenous word-of-mouth – the biz from the buzz.  Were the distributor and theater owners responding to an increased interest in the movie generated from person to person? Or were they, through their strategic timing of advertising and distribution, helping to create the film’s popularity? We can’t know for sure, but probably both kinds of influence were happening. It might be clearer when the economic desires of the business side and the preferences of the audience don’t match up, for example, when a distributor tries to nudge a small film along, getting it into more theaters and spending more money on advertising, but nobody’s going to see it. This discrepancy would clearly show the influence of word-of-mouth; it’s just that the word would be, “don’t bother.”

Cross-posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

Flashback Friday.

Back when I was in high school and college, I learned that one of the major things that separated humans from other species was culture. The ability to develop distinct ways of living that include an understanding of symbols, language, and customs unique to the group was a specifically human trait.

And, ok, so it turned out that other species had more complex communication systems than we thought they did, but still, other animals were assumed to behave according to instinct, not community-specific cultures.

But as with so many things humans have been convinced we alone possess, it’s turning out that other species have cultures, too. One of the clearest examples is the division of orcas into two groups with distinct customs and eating habits; one eats mammals while the other is pescetarian, eating only fish. Though the two groups regularly come in contact with each other in the wild, they do not choose to intermingle or mate with one another. Here’s a video:


Aside from the obvious implications for our understanding of culture, this brings up an issue in terms of conservation. Take the case of orcas. Some are suggesting that they should be on the endangered species list because the population has declined. What do we do if it turns out at some point that, while the overall orca population is not fully endangered, one of the distinct orca cultural groups is? Is it enough that killer whales still exist, or do we need to think of the cultures separately and try to preserve sufficient numbers of each? In addition to being culturally different, they are functionally non-interchangeable: each group has a different effect on food chains and ecosystems.

Should conservation efforts address not just keeping the overall population alive and functioning, but ensure that the range of cultural diversity within a species is protected? If this situation occurred, should we declare one orca culture as endangered but not the other? Are both ecological niches important?

I love these questions. If we recognize that creatures can have cultures, it challenges our sense of self, but also brings significantly more complexity to the idea of wildlife preservation.

Originally posted in 2010.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

In the working- and middle-class neighborhoods of many Southern cities, you will find rows of “shotgun” houses. These houses are long and narrow, consisting of three or more rooms in a row. Originally there would have been no indoor plumbing — the houses date back to the early 1800s in the U.S. — and so no bathroom or kitchen.

Here’s a photograph of a shotgun house I took in the 7th ward of New Orleans. It gives you an idea of just how skinny they are.


In a traditional shotgun house, there are no hallways, just doors that take a person from one room to the next. Here’s my rendition of a shotgun floor plan; doors are usually all in a row:


At nola.com, Richard Campanella describes the possible origins and sociological significance of this housing form. He follows folklorist John Michael Vlach, who has argued that shotgun houses are indigenous to Western and Central Africa, arriving in the American South via Haiti. Campanella writes:

Vlach hypothesizes that the 1809 exodus of Haitians to New Orleans after the St. Domingue slave insurrection of 1791 to 1803 brought this vernacular house type to the banks of the Mississippi.

In New Orleans, shotgun houses are found in the parts of town originally settled by free people of color, people who would have identified as Creole, and a variety of immigrants. Outside of New Orleans, we tend to see shotgun houses in places with large black populations.

The house, though, doesn’t just represent a building technique, it tells a story about how families were expected to interact. Shotgun houses offer essentially zero privacy. Everyone has to tromp through everyone’s room to get around the house. There’s no expectation that a child won’t just walk into their parents’ room at literally any time, or vice versa. There’s no way around it.

“According to some theories,” then, Campanella says:

…cultures that produced shotgun houses… tended to be more gregarious, or at least unwilling to sacrifice valuable living space for the purpose of occasional passage.

Cultures that valued privacy, on the other hand, were willing to make this trade-off.

Sure enough, in the part of New Orleans settled by people of Anglo-Saxon descent, shotgun houses are much less common and, instead, homes are more “privacy-conscious.”

Over time, as even New Orleans became more and more culturally Anglo-Saxon — and as the housing form increasingly became associated with poverty — shotguns fell out of favor.  They’re enjoying a renaissance today but, as Campanella notes, many renovations of these historic buildings include a fancy, new hallway.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

It’s always a treat to find a good candidate for our series on babies-who-totally-learn-how-to-do-things. In previous editions, we’ve featured a baby rapper, a baby preacher, a baby worshipper, and two babies mimicking a conversation.

These videos are entertaining because they’re babies, but the message their actions send is more than just adorable. They remind us of how deeply cultural we are as human beings.

In this edition, baby gives CPR:

There’s nothing natural about giving CPR. There’s no gene, no evolutionary push for that behavior, no particular brain organization, and no special mix of hormones that can explain why that baby can mimic the steps of cardiopulmonary resuscitation. Instead, that baby is learning.

Learning is coded in our genes. We’re deeply and naturally flexible that way, able to learn whatever our particular culture needs and values. Many people make biologically deterministic arguments — ones that draw a causal arrow from our biology to our behavior — but that’s usually wrong. More often, we are biologically designed to be contingent: our behavior naturally depends on whatever we encounter in the world.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.