
There’s been heightened attention recently to calling out microaggressions and giving trigger warnings. I speculated that the loudest voices making these demands come from people in categories that have gained in power but are still not dominant, notably women at elite universities. What they’re saying in part is, “We don’t have to take this shit anymore.” Or as Bradley Campbell and Jason Manning put it recently in The Chronicle:

…offenses against historically disadvantaged social groups have become more taboo precisely because different groups are now more equal than in the past.

It’s nice to have one’s hunches seconded by scholars who have given the issue much more thought.

Campbell and Manning make the context even broader. The new “plague of hypersensitivity” (as sociologist Todd Gitlin called it) isn’t just about a shift in power, but a wider cultural transformation from a “culture of dignity” to a “culture of victimhood.” More specifically, the aspect of culture they are talking about is social control. How do you get other people to stop doing things you don’t want them to do – or not do them in the first place?

In a “culture of honor,” you take direct action against the offender.  Where you stand in society – the rights and privileges that others accord you – is all about personal reputation (at least for men). “One must respond aggressively to insults, aggressions, and challenges or lose honor.” The culture of honor arises where the state is weak or is concerned with justice only for some (the elite). So the person whose reputation and honor are at stake must rely on his own devices (devices like duelling pistols).  Or in his pursuit of personal justice, he may enlist the aid of kin or a personalized state-substitute like Don Corleone.

In more evolved societies with a more extensive state, honor gives way to “dignity.”

The prevailing culture in the modern West is one whose moral code is nearly the exact opposite of that of an honor culture. Rather than honor, a status based primarily on public opinion, people are said to have dignity, a kind of inherent worth that cannot be alienated by others. Dignity exists independently of what others think, so a culture of dignity is one in which public reputation is less important. Insults might provoke offense, but they no longer have the same importance as a way of establishing or destroying a reputation for bravery. It is even commendable to have “thick skin” that allows one to shrug off slights and even serious insults, and in a dignity-based society parents might teach children some version of “sticks and stones may break my bones, but words will never hurt me” – an idea that would be alien in a culture of honor.

The new “culture of victimhood” has a different goal – cultural change. Culture is, after all, a set of ideas that is shared, usually so widely shared as to be taken for granted. The microaggression debate is about insult, and one of the crucial cultural ideas at stake is how the insulted person should react. In the culture of honor, he must seek personal retribution. In doing so, of course, he is admitting that the insult did in fact sting. The culture of dignity also focuses on the character of offended people, but here they must pretend that the insult had no personal impact. They must maintain a Jackie-Robinson-like stoicism even in the face of gross insults and hope that others will rise to their defense. For smaller insults, say Campbell and Manning, the dignity culture “would likely counsel either confronting the offender directly to discuss the issue,” which still keeps things at a personal level, “or better yet, ignoring the remarks altogether.”

In the culture of victimhood, the victim’s goal is to make the personal political.  “It’s not just about me…”  Victims and their supporters are moral entrepreneurs. They want to change the norms so that insults and injustices once deemed minor are now seen as deviant. They want to define deviance up.  That, for example, is the primary point of efforts like the Microaggressions Project, which describes microaggressions in exactly these terms, saying that microaggression “reminds us of the ways in which we and people like us continue to be excluded and oppressed” (my emphasis).


So, what we are seeing may be a conflict between two cultures of social control: dignity and victimhood. It’s not clear how it will develop. I would expect that those who enjoy the benefits of the status quo and none of its drawbacks will be most likely to resist the change demanded by a culture of victimhood. It may depend on whether shifts in the distribution of social power continue to give previously more marginalized groups a louder and louder voice.

Cross-posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

There is a whole social science to the optimal balance of victory and defeat in social movements and social change. Consider two political cartoons by Mike Luckovich. One, from June 21, counterposes a person carrying a sign saying “black lives matter” and a Confederate flag subtitled “S. Carolina rebuttal.” Another, from June 25, features a black man weighted down by chains and padlocks saying “voter ID laws,” “inequality,” “police brutality,” and “mass jailings.” A white man in front of him jumps up high and lifts his arm, saying “The Confederate flag’s coming down, high five!”

Did he really just demand the removal of the Confederate flag and then mock people who would celebrate its removal? Is that how much things change in a week? But in periods of social change, moving the goal posts is what it’s all about. And there’s nothing wrong with that.

The Charleston massacre was a horrific reminder of how it seems some things never change. But they do change. Dylann Roof was caught and may be put to death, legally. And it turned out that, not only had the Confederate flag only been flying at the South Carolina capitol for a few decades, but it actually could be taken down in response to public outrage. And yet, that’s not the end of racism.

Anthea Butler, a religion and Africana studies professor at Penn who wrote an op-ed in the Washington Post, appeared on the On Point radio show. Talking with host Tom Ashbrook, she got this:

Tom Ashbrook: If you ask me, I understand that feeling and that vivid response. At the same time, I, and maybe you, Anthea Butler, Dr. Butler, don’t want to lose, or not recognize, or lose the progress that has been made. And this is nowhere near paradise…

Anthea Butler: But what kind of progress? What kind of progress? This is what we keep talking about. And I don’t understand, when you say, “We’ve made progress.” How have we made progress when the president of the United States has been constantly questioned because he is partially a Black man? And so you talk progress — and this is the kind of talk we’re going to hear all week long after this.

TA: But he’s president, madam.

AB: He is president.

TA: Well, that’s a pretty big deal…

AB: That is a big deal, but to some people in this country, like Dylann Roof, that is the end of this country. That’s why you had the kind of phrase that he said, that all your politicians, the right Republican politicians have been saying, “Take our country back.” And so, I want to talk about the rhetoric that’s happened…

Ashbrook has a point about progress, of course, but it’s just the wrong time to say it, days after a racist massacre that seems as timeless as the burning of Black churches. At that moment there could be no progress.

For whatever reason, Ashbrook turned to progress on the interpersonal level:

TA: We did see White people in South Carolina, in Charleston, pour into the churches alongside African Americans over this weekend.

AB: Yes we did. But you need to understand the distinction here. I don’t doubt that there are well-meaning, good White people, good White Christians, who are appalled at this. I understand that. But when you have a structural system that continues to do this kind of racial profiling, the kinds of things that are going on with the police in this country, the kinds of issues that we’ve had. The problem becomes this: you can talk about progress all you want, but reality is another thing altogether.

Again, it’s progress, but focusing on it at that moment is basically #AllLivesMatter. President Obama also tried to keep his eyes on the prize, in his appearance on the WTF podcast:

Racism, we are not cured of it. And it’s not just a matter of it not being polite to say “nigger” in public. That’s not the measure of whether racism still exists or not. It’s not just a matter of overt discrimination. Societies don’t, overnight, completely erase everything that happened 200 to 300 years prior.

Outrage ensued about his use of “nigger,” but White House Press Secretary Josh “earnest non-racist white guy” Earnest doubled down:

The President’s use of the word and the reason that he used the word could not be more apparent from the context of his discussion on the podcast.  The President made clear that it’s not possible to judge the nation’s progress on race issues based solely on an evaluation of our country’s manners.  The fact is that we’ve made undeniable progress in this country over the last several decades, and as the President himself has often said, anyone who lived in this country through the ‘50s and the ‘60s and the ‘70s and the ‘80s notes the tremendous progress that we’ve made.  That progress is undeniable. But what’s also undeniable is that there is more work that needs to be done, and there’s more that we can do.  And the fact is everyone in this country should take some inspiration from the progress that was made in the previous generation and use that as a motivation and an inspiration to try to make further progress toward a more perfect union.

Now is no time to talk about progress, some say. With Black church members being gunned down and churches burning, and one appalling, outrageous video after another showing the abuse of Black citizens by police, having a Black president is not a victory. So much so that maybe he’s not really Black at all. Frank Roberts writes of Obama’s “Amazing Grace” moment:

With Obama … blackness has been reduced to a theatrical prop; a shuck-and-jive entertainment device that keeps (black) audiences believing that the President “feels their pain” — at precisely the same time that he fails to provide a substantive policy response to black unemployment, over-incarceration, and/or racialized state violence.

The social scientist in me objects, because the rate of progress is not determined by the victory or tragedy of the moment, or by the blackness of a man. And Obama probably has done more than any other president (at least recently) to address Black unemployment, incarceration, and racialized state violence. That’s not a moral or political statement — and it doesn’t imply “enough” — it’s an empirical one.

Movements use good news for legitimacy and bad news for urgency.  When something goes well, they need to claim credit and also make sure their supporters know there is more work to be done. When something awful happens they place the troubles in the context of a narrative of struggle, but they don’t want to appear powerless because that saps support and undermines morale.

An extended version of this post is at Family Inequality.

Philip N. Cohen is a professor of sociology at the University of Maryland, College Park. He writes the blog Family Inequality and is the author of The Family: Diversity, Inequality, and Social Change. You can follow him on Twitter or Facebook.

If Mexicans celebrated the 4th like Americans celebrate Cinco de Mayo:

From Flama.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

I was on jury duty this week, and the greatest challenge for me was the “David Brooks temptation” to use the experience to expound on the differences in generations and the great changes in culture and character that technology and history have brought.

I did my first tour of duty in the 1970s. Back then you were called for two weeks. Even if you served on a jury, after that trial ended, you went back to the main jury room. If you were lucky, you might be released after a week and a half. Now it’s two days.

What struck me most this time was the atmosphere in the main room. Now, nobody talks. You’re in a large room with maybe two hundred people, and it’s quieter than a library. Some are reading newspapers or books, but most are on their laptops, tablets, and phones. In the 1970s, it wasn’t just that there was no wi-fi; there was no air conditioning. Remember “12 Angry Men”? We’re in the same building. Then, you tried to find others to talk to. Now you try to find a seat near an electric outlet to connect your charger.


I started to feel nostalgic for the old system. People nowadays – all in their own narrow, solipsistic worlds, nearly incapable of ordinary face-to-face sociability. And so on.

But the explanation was much simpler. It was the two-day hitch. In the old system, social ties didn’t grow from strangers seeking out others in the main jury room. It happened when you went to a courtroom for voir dire. You were called down in groups of forty. The judge sketched out the case, and the lawyers interviewed the prospective jurors. From their questions, you learned more about the case, and you learned about your fellow jurors – neighborhood, occupation, family, education, hobbies. You heard what crimes they’d been a victim of. When the judge called a break for bathroom or lunch or some legal matter, you could find the people you had something in common with. And you could talk with anyone about the case, trying to guess what the trial would bring. If you weren’t selected for the jury, you went back to the main jury room, and you continued the conversations there. You formed a social circle that others could join.

This time, on my first day, there were only two calls for voir dire, the clerk as bingo-master spinning the drum with the name cards and calling out the names one by one. My second day, there were no calls. And that was it. I went home having had no conversations at all with any of my fellow jurors. (A woman seated behind me did say, “Can you watch my laptop for a second?” when she went to the bathroom, but I don’t count that as a conversation.)

I would love to have written 800 words here on how New York character had changed since the 1970s.  No more schmoozing. Instead we have iPads and iPhones and MacBooks destroying New York jury room culture – Apple taking over the Apple. People unable or afraid to talk to one another because of some subtle shift in our morals and manners. Maybe I’d even go for the full Brooks and add a few paragraphs telling you what’s really important in life.

But it was really a change in the structure. New York expanded the jury pool by eliminating most exemptions. Doctors, lawyers, politicians, judges – they all have to show up. As a result, jury service is two days instead of two weeks, and if you actually are called to a trial, once you are rejected for the jury or after the trial is over, you go home.

The old system was sort of like the pre-all-volunteer army. You get called up, and you’re thrown together with many kinds of people you’d never otherwise meet. It takes a chunk of time out of your life, but you wind up with some good stories to tell. Maybe we’ve lost something. But if we have lost valuable experiences, it’s because of a change in the rules, in the structure of how the institution is run, not because of a change in our culture and character.

Cross-posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

All eyes are on the Confederate flag, but let’s not forget what enabled Roof to turn his ideology into death with such efficiency, as illustrated in this comic from Jonathan Schmock. Visit Schmock’s website here.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Flashback Friday.

My great-grandma would put a few drops of turpentine on a sugar cube as a cure-all for any type of cough or respiratory ailment. Nobody in the family ever had any obvious negative effects from it as far as I know. And once when I had a sinus infection my grandma suggested that I try gargling kerosene. I decided to go to the doctor for antibiotics instead, but most of my relatives thought that was a perfectly legitimate suggestion.

In the not-so-recent history, lots of substances we consider unhealthy today were marketed and sold for their supposed health benefits. Joe A. of Human Rights Watch sent in these images of vintage products that openly advertised that they contained cocaine or heroin. Perhaps you would like some Bayer Heroin?

Flickr Creative Commons, dog 97209

The Vapor-ol alcohol and opium concoction was for treating asthma:

Cocaine drops for the kids:

A reader named Louise sent in a recipe from her great-grandma’s cookbook. Her great-grandmother was a cook at a country house in England. The recipe is dated 1891 and calls for “tincture of opium”. The recipe (with original spellings):

Hethys recipe for cough mixture

1 pennyworth of each
Antimonial Wine
Acetic Acid
Tincture of opium
Oil of aniseed
Essence of peppermint
1/2lb best treacle

Well mix and make up to Pint with water.

As Joe says, it’s no secret that products with cocaine, marijuana, opium, and other now-banned substances were at one time sold openly, often as medicines. The change in attitudes toward these products, from entirely acceptable and even beneficial to inherently harmful and addictive, is a great example of social construction. While certainly opium and cocaine have negative effects on some people, so do other substances that remained legal (or were re-legalized, in the case of alcohol).

Often racist and anti-immigrant sentiment played a role in changing views of what are now illegal controlled substances; for instance, the association of opium with Chinese immigrants contributed to increasingly negative attitudes toward it, as anything associated with Chinese immigrants was stigmatized, particularly in the western U.S. This combined with a push by social reformers to prohibit a variety of substances, leading to the Harrison Narcotic Act. The act, passed in 1914, regulated production and distribution of opium but, in its application, eventually criminalized it.

Reformers pushing for cocaine to be banned suggested that its effects led Black men to rape White women, and that it gave them nearly super-human strength that allowed them to kill Whites more effectively. A similar argument was made about Mexicans and marijuana:

A Texas police captain summed up the problem: under marijuana, Mexicans became “very violent, especially when they become angry and will attack an officer even if a gun is drawn on him. They seem to have no fear, I have also noted that under the influence of this weed they have enormous strength and that it will take several men to handle one man while under ordinary circumstances one man could handle him with ease.”

So the story of the criminalization of some substances in the U.S. is inextricably tied to various waves of anti-immigrant and racist sentiment. Some of the same discourse–the “super criminal” who is impervious to pain and therefore especially violent and dangerous, the addicted mother who harms and even abandons her child to prostitute herself as a way to get drugs–resurfaced as crack cocaine emerged in the 1980s and was perceived as the drug of choice of African Americans.

Originally posted in 2010.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

The governors of Virginia and South Carolina have now taken stands against the Confederate battle flag. So have honchos at Wal*Mart, Sears, Target, and NASCAR.

NASCAR! How could this cascade of reversals have happened so rapidly? Did these important people wake up one morning this week and say to themselves, “Gee, I never realized that there was anything racist about the Confederacy, and never realized that there was anything wrong with racism, till that kid killed nine Black people in a church”?

My guess is that what’s going on is not a sudden enlightenment or even much of a change in views about the flag. To me it looks more like the process of “pluralistic ignorance.” What these people changed was not their ideas about the Confederacy or racism but their ideas about other people’s ideas about these matters. With pluralistic ignorance (a term coined by Floyd Allport nearly a century ago) everyone wants X but thinks that nobody else does. Then some outside factor makes it possible for people to choose X, and everyone does. Everyone is surprised – “Gee, I thought all you guys wanted Y, not X .” It looks like a rapid change in opinion, but it’s not.

A few years ago in Ireland and elsewhere in Europe, people were surprised at the success of new laws banning smoking in pubs and restaurants. “Oh, the smokers will never stand for it.” But it turned out that the smokers, too, were quite happy to have rooms with breathable air. It’s just that before the laws were passed, nobody knew that’s how other people felt because those people kept smoking.

The same thing happened when New York City passed a pooper-scooper law. “The law is unenforceable,” people said. “Cops will never see the actual violation, only its aftermath. And do you really think that those selfish New Yorkers will sacrifice their own convenience for some vague public good?” But the law was remarkably effective. As I said in this post from 2009:

Even before the new law, dog owners had probably thought that cleaning up after their dogs was the right thing to do, but since everyone else was leaving the stuff on the sidewalk, nobody wanted to be the only schmuck in New York to be picking up dog shit. In the same way that the no-smoking laws worked because smokers wanted to quit, the dog law in New York worked because dog owners really did agree that they should be cleaning up after their dogs. But prior to the law, none of them would speak or act on that idea.

In South Carolina and Georgia and Bentonville, Arkansas and elsewhere, the governors and the CEOs surely knew that the Confederacy was based on racist slavery; they just rarely thought about it. And if the matter did come up, as with the recent Supreme Court decision about license plates, they probably assumed that most of their constituents and customers were happy with the flag and that the anti-flaggers were a cranky minority.

Given the support for letting that flag fade into history, it looks as though many Southerners may, for a while now, have been uncomfortable with the blatant racism of the Confederacy and the post-Reconstruction era. But because nobody voiced that discomfort, everyone thought that other Southerners still clung to the old mentality. The murders in the Charleston church and the subsequent discussions about retiring the flag may have allowed Southerners to discover that their neighbors shared their misgivings about the old racism. And it allowed the retail giants to see that they weren’t going to lose a lot of money by not stocking the flag.

Cross-posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

In the aftermath of Dylann Roof’s racist murder, some cities in the South are reconsidering their relationship to the Confederate Flag. Should it fly? Be in a museum? Burn? The discussion raises larger questions of how to move forward from ugly histories without simultaneously whitewashing a city’s past. And, as well, how do we know when something is truly in our past?

I was thinking about just these questions a couple weeks ago when a friend of mine walked me by the monument to the Crescent City White League in New Orleans. The conical stone was erected to commemorate the return of white supremacist government two years after a lethal insurrection against the Reconstruction state government in 1874. In that insurrection, thousands of former Confederate soldiers attacked the city police and state military. They killed 11 members of the NOPD and held city government buildings for three days before federal troops arrived and they fled.

Two years later, the white supremacist politicians were back in power and they placed the monument in a prominent place where Canal St. meets the Mississippi. The monument, to be clear, is in honor of cop-killing white supremacists.

Here it is in 1906 (source, photographer unknown):

So, what to do with the thing?

In 1974 — one hundred years after the insurrection and 98 years after its erection — the city added a marker nearby distancing itself from the message of white supremacy. It read:

Although the “battle of Liberty Place” and this monument are important parts of the New Orleans history, the sentiments in favor of white supremacy expressed thereon are contrary to the philosophy and beliefs of present-day New Orleans.

In 1993, some of the original inscriptions were removed and replaced with this slightly more politically correct comment:

In honor of those Americans on both sides who died in the Battle of Liberty Place. … A conflict of the past that should teach us lessons for the future.

It was also moved to a new location. Today it sits between a flood wall, a parking lot, and an electrical substation. If you wanted to give a monument the finger, this is one way to do it. Here’s how it looks on Google Maps streetview:


So, the question is: What to do with these things?

I’ll admit that seeing the monument tucked into an unpleasant corner of New Orleans was somehow satisfying. But I was also uneasy about its displacement. Is this an example of New Orleans trying to repress knowledge of its racist history? (And present?) Or is it a sign that the city actively rejects the values represented by the monument? Conversely, if the city had left the monument at the foot of Canal St. would this be a sign that it took history seriously? And, thus, responsibility for its past? Or a sign that it didn’t take an anti-racist stance seriously enough?

This seems like an obviously difficult call to make, but I’m glad that we’re using the horror of Roof’s massacre to begin a discussion about how to handle symbols like these and, maybe, truly make them a part of our past.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.