This video was making the rounds last spring. The video maker wants to make two points:

1. Cops are racist. They are respectful of the White guy carrying the AR-15. The Black guy gets less comfortable treatment.

2. The police treatment of the White guy is the proper way for police to deal with someone carrying an assault rifle.

I had two somewhat different reactions.

1. This video was made in Oregon. Under Oregon’s open-carry law, what both the White and Black guy are doing is perfectly legal. And when the White guy refuses to provide ID, that’s legal too. If this had happened in Roseburg, and the carrier had been strolling to Umpqua Community College, there was nothing the police could have legally done, other than what is shown in the video, until the guy walked onto campus, opened fire, and started killing people.

2.  Guns are dangerous, and the police know it. In the second video, the cop assumes that the person carrying an AR-15 is potentially dangerous – very dangerous. The officer’s fear is palpable. He prefers to err on the side of caution – the false positive of thinking someone is dangerous when he is really OK.  The false negative – assuming an armed person is harmless when he is in fact dangerous – could well be the last mistake a cop ever makes.

But the default setting for gun laws in the US is just the opposite – better a false negative. This is especially true in Oregon and states with similar gun laws. These laws assume that people with guns are harmless. In fact, they assume that all people, with a few exceptions, are harmless. Let them buy and carry as much weaponry and ammunition as they like.

Most of the time, that assumption is valid. Most gun owners, at least those who got their guns legitimately, are responsible people. The trouble is that the cost of the rare false negative is very, very high. Lawmakers in these states and in Congress are saying in effect that they are willing to pay that price. Or rather, they are willing to have other people – the students at Umpqua, or Newtown, or Santa Monica, or scores of other places, and their parents – pay that price.

UPDATE, October 6: You have to forgive the hyperbole in that last paragraph, written so shortly after the massacre at Umpqua. I mean, those politicians don’t really think that it’s better to have dead bodies than to pass regulations on guns, do they?

Or was it hyperbole? Today, Dr. Ben Carson, the surgeon who wants to be the next president of the US, stated this preference for guns, even at the price of death, more clearly still: “I never saw a body with bullet holes that was more devastating than taking the right to arm ourselves away.” (The story is in the New York Times and elsewhere.)

Originally posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

In a previous post, I wrote about a University of Illinois football coach forcing injured players to go out on the field even at the risk of turning those injuries into lifelong debilitating and career-ending injuries. The coach and the athletic director both stayed on script and insisted that they put the health and well-being of the scholar athletes “above all else.” Right.

My point was that blaming individuals was a distraction and that the view of players as “disposable bodies” (as one player tweeted) was part of a system rather than the moral failings of individuals.

But systems don’t make for good stories. It’s so much easier to think in terms of individuals and morality, not organizations and outcomes. We want good guys and bad guys, crime and punishment. That’s true in the legal system. Convicting individuals who commit their crimes as individuals or in small groups is fairly easy. Convicting corporations or individuals acting as part of a corporation is very difficult.

That preference for stories is especially strong in movies. In that earlier post, I said that the U of Illinois case had some parallels with the NFL and its reaction to the problem of concussions. I didn’t realize that Sony Pictures had made a movie about that very topic (title – “Concussion”), scheduled for release in a few months.

Hacked e-mails show that Sony, fearful of lawsuits from the NFL, wanted to shift the emphasis from the organization to the individual.

Sony executives; the director, Peter Landesman; and representatives of Mr. Smith discussed how to avoid antagonizing the N.F.L. by altering the script and marketing the film more as a whistle-blower story, rather than a condemnation of football or the league…

Hannah Minghella, a top [Sony] executive, suggested that “rather than portray the N.F.L. as one corrupt organization can we identify the individuals within the N.F.L. who were guilty of denying/covering up the truth.” [source: New York Times]

I don’t know what the movie will be like, but the trailer clearly puts the focus on one man – Dr. Bennet Omalu, played by Will Smith. He’s the good guy.

Will the film show as clearly how the campaign to obscure and deny the truth about concussions was a necessary and almost inevitable part of the NFL? Or will it give us a few bad guys – greedy, ruthless, scheming NFL bigwigs – and the corollary that if only those positions had been staffed by good guys, none of this would have happened?

The NFL, when asked to comment on the movie, went to the same playbook of cliches that the Illinois coach and athletic director used.

We are encouraged by the ongoing focus on the critical issue of player health and safety. We have no higher priority.

Originally posted at Montclair SocioBlog. Cross-posted at Pacific Standard.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

“Asshole is a wonderful word,” said Mike Pesca in his podcast, The Gist. His former colleagues at NPR had wanted to call someone an asshole, and even though it was for a podcast, not broadcast, and even though the person in question was a certified asshole, the NPR censor said no. Pesca disagreed.

Pesca is from Long Island and, except for his college years in Atlanta, he has spent most of his time in the Northeast. Had he hailed from Atlanta – or Denver or Houston or even San Francisco – “asshole” might not have sprung so readily to his mind as le mot juste, even to denote Donald Trump. The choice of swear words is regional.

Linguist Jack Grieve has been analyzing tweets – billions of words – and recently he posted maps showing the relative popularity of different expletives. For example, every county in the Northeast tweets “asshole” at a rate at least two standard deviations above the national mean.

To my knowledge, Grieve has offered no explanation for this distribution, and I don’t have much to add. I assume that as with regional accents, historical factors are more important than the literal meanings of the words. It’s not that tweeters in the Northeast are generally more willing to use foul language, nor is this about anal imagery since the Northeast looks nearly prudish compared to other regions when it comes to “shit.”

Less surprising are the maps of toned-down expletives. People in the heartland are just so gosh darned polite in their speech. When Donald Trump spoke at the Family Leadership Summit in Iowa, what got all the attention was his dissing of John McCain (“He’s not a war hero. … He is a war hero because he was captured. I like people who weren’t captured.”) But there was also this paragraph in the New York Times’s coverage:

Mr. Trump raised eyebrows with language rarely heard before an evangelical audience — saying “damn” and “hell” when discussing education and the economy.

“Well, I was turned off at the very start because I didn’t like his language,” Becky Kruse, of Lovilia, Iowa, said, noting Mr. Trump’s comment about not seeking God’s forgiveness. “He sounds like he isn’t really a born-again Christian.”

Aside from the insight about Trump’s religious views, Ms. Kruse reflects the linguistic preferences of her region, where “damn” gets softened to “darn.”

Unfortunately, Grieve did not post a map for “heck.” (I remember when “damn” and “hell” were off limits on television, though a newspaper columnist, usually in the sports section, might dare to write something like “It was a helluva fight.”) You can find maps for all your favorite words at Grieve’s website, where you can also find out what words are trending (as we now say) on Twitter. (“Unbothered” is spreading from the South, and “fuckboy” is rising.) Other words are on the way down (untrending?). If you’re holding “YOLO” futures, sell them now before it’s too late.

Originally posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

The margin of error is getting more attention than usual in the news. That’s not saying much since it’s usually a tiny footnote, like those rapidly muttered disclaimers in TV ads (“Offer not good mumble mumble more than four hours mumble mumble and Canada”). Recent headlines proclaim, “Trump leads Bush…” A paragraph or two in, the story will report that in the recent poll Trump got 18% and Bush 15%.  That difference is well within the margin of error, but you have to listen closely to hear that. Most people usually don’t want to know about uncertainty and ambiguity.

What’s bringing uncertainty out of the closet now is the upcoming Republican presidential debate. The Fox-CNN-GOP axis has decided to split the field of presidential candidates in two based on their showing in the polls. The top ten will be in the main event. All other candidates – currently Jindal, Santorum, Fiorina, et al. – will be relegated to the children’s table, i.e., a second debate a month later and at the very unprime hour of 5 p.m.

But is Rick Perry’s 4% in a recent poll (419 likely GOP voters) really in a different class than Bobby Jindal’s 2%? The margin of error that CNN announced for that survey was +/- 5 points. Here’s the box score.

Jindal might argue that, with a margin of error of 5 points, his 2% might actually be as high as 7%, which would put him in the top tier. He might argue that, but he shouldn’t. Downplaying the margin of error makes a poll result seem more precise than it really is, but using that one-interval-fits-all number of five points understates the precision. That’s because the margin of error depends on the percent that a candidate gets. The confidence interval is larger for proportions near 50%, smaller for proportions at the extremes.

Just in case you haven’t taken the basic statistics course, here is the formula:

$$\text{margin of error} = 1.96\,\sqrt{\frac{\hat{p}\,(1-\hat{p})}{n}}$$

The $\hat{p}$ (pronounced “pee hat”) is the proportion of the sample who preferred each candidate, and $n$ is the sample size. For the candidate who polled 50%, the numerator of the fraction under the square root sign will be 0.5 (1 – 0.5) = .25. That’s much larger than the numerator for the 2% candidate: 0.02 (1 – 0.02) = .0196.* Multiplying by the 1.96, the 50% candidate’s margin of error with a sample of 419 is +/- 4.8. That’s the figure that CNN reported. But plug in Jindal’s 2%, and the result is much less: +/- 1.3. So there’s a less than one in twenty chance that Jindal’s true proportion of support is more than 3.3%.
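To make these numbers concrete, here is a minimal Python sketch (my illustration, not part of the original post) that reproduces both margins:

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """95% margin of error for a sample proportion p_hat from a sample of size n."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

n = 419  # likely GOP voters in the CNN poll

# Worst case: a candidate polling at 50% -- the margin polls usually report
print(f"{margin_of_error(0.50, n):.1%}")  # ~4.8%

# Jindal's 2%: the margin is much tighter
print(f"{margin_of_error(0.02, n):.1%}")  # ~1.3%
```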

Polls usually report their margin of error based on the 50% maximum. The media reporting the results then use the one-margin-fits-all assumption – even NPR. Here is their story from May 29 with the headline “The Math Problem Behind Ranking The Top 10 GOP Candidates”:

There’s a big problem with winnowing down the field this way: the lowest-rated people included in the debate might not deserve to be there.

The latest GOP presidential poll, from Quinnipiac, shows just how messy polling can be in a field this big. We’ve put together a chart showing how the candidates stack up against each other among Republican and Republican-leaning voters — and how much their margins of error overlap.



The NPR writer, Danielle Kurtzleben, does mention that “margins might be a little smaller at the low end of the spectrum,” but she creates a graph that ignores that reality. The misinterpretation of presidential polls is nothing new. But this time that ignorance will determine whether a candidate plays to a larger or smaller TV audience.
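Here is a small follow-up sketch (again mine, with hypothetical shares rather than the actual Quinnipiac numbers) showing what candidate-specific intervals look like for two candidates straddling the top-ten cutoff:

```python
import math

def conf_interval(p_hat, n, z=1.96):
    """Candidate-specific 95% confidence interval for a sample proportion."""
    moe = z * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - moe, p_hat + moe

n = 419
for label, p in [("10th place", 0.04), ("11th place", 0.02)]:
    lo, hi = conf_interval(p, n)
    print(f"{label}: {p:.0%} -> ({lo:.1%}, {hi:.1%})")
# 10th place: 4% -> (2.1%, 5.9%)
# 11th place: 2% -> (0.7%, 3.3%)
```

Even with the correct, narrower margins the two intervals overlap, so the poll cannot really rank these candidates; the blanket +/- 4.8-point figure just makes the overlap look far larger than it is.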


* There are slightly different formulas for calculating the margin of error for very low percentages.  The Agresti-Coull formula gives a confidence interval even if there are zero Yes responses. (HT: Andrew Gelman)
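For reference, here is one standard statement of the Agresti-Coull interval (my addition; the footnote above only names it). With $x$ Yes responses out of $n$, and $z = 1.96$ for 95% confidence:

$$\tilde{n} = n + z^2, \qquad \tilde{p} = \frac{x + z^2/2}{\tilde{n}}, \qquad \tilde{p} \pm z\sqrt{\frac{\tilde{p}\,(1-\tilde{p})}{\tilde{n}}}$$

Even when $x = 0$, $\tilde{p}$ is strictly positive, so the interval never collapses to zero width.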

Originally posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

I saw “Trainwreck” last night. The 7:00 p.m. showing at the 68th Street AMC was full. Maybe people had come just to get out of the apartment and yet avoid the beastly heat, but they enjoyed the movie.  Sometimes the laughter lasted long enough to cover up the next joke.

The “Trainwreck” story is standard rom-com: Amy Schumer plays a young woman who rejects the idea of commitment and love. Circumstances put her together with a man she seems to have nothing in common with. You can guess the rest.

But this is Amy Schumer’s movie, so there’s an important twist – the conventional sex roles are reversed. It’s the man who is sweet and naive and who wants a real relationship; the woman has a lot of sex with a lot of different guys, drinks a lot, smokes weed, and resists love until at the end, she decides to become the woman he wants her to be.

Here is the R-rated version of the trailer:

What interested me was not the movie itself, but the reaction in some conservative quarters. For Armond White at the National Review, the movie triggered something like what Jonathan Haidt calls “disgust” – a reaction to the violation of strong taboos that surround things like food, sex, blood and other bodily matters, and death. These taboos are often arbitrary, not rational. Pork is an “abomination,” for example, because… well, because it is, and because pigs are “unclean.”

“Trainwreck” has no pork, but it does have what some find unclean.

Schumer’s tampon jokes and gay jokes, female versions of locker-room humor, literally drag pop culture to the toilet. A girl-talk scene set in adjoining restroom stalls — one revealing dropped panties, the other panty-less (obviously Amy) — is just Apatow using women to show off his indecency.

As a comedian and now as a filmmaker, Schumer talks about women-things: body functions and body parts. These jokes seem to elicit two different kinds of laughter. Back when researchers studying small group interaction were trying to code and categorize behavior, laughter posed a problem (see this earlier post). It could be coded as “Shows Tension,” but it might also be “Shows Tension Release.”

With Amy Schumer’s jokes, the male laughter is mostly nervous, full of tension about a taboo subject. But the female laughter seems much less inhibited – tension release, maybe even relief, as if to say, “Someone is finally talking publicly and frankly about things we could only whisper about,” since most of the time women have had to pretend to share the male taboo.

Indecency indeed. But something is indecent only to members of groups that deem it indecent. Some groups are not at all disgusted by pork.  And for some audiences, tampon jokes and toilet-stall conversations about Johnny Depp movies are not indecent; they’re just funny. What audiences might those be? Women.

Take the tampon joke that the National Reviewer finds indecent. It would seem obvious that used tampons look different depending on where you are in your period – less bloody on the final day, more so a few days earlier. But at the mere mention of this fact in “Trainwreck,” hilarity ensues, especially among women in the audience.

The thing about taboos – ideas about what is indecent or disgusting – is that entire social structures get built around them. To violate the taboo is to threaten the entire edifice. Powerful taboos on women-things often go with male domination. So for the National Review, the “Trainwreck” reversal of rom-com gender roles makes the movie dangerous and subversive.

Here are some excerpts from the review just to give the flavor of this Purity-and-Danger-like conflating of taboo, female sexuality, and social/political threat to the established order (emphasis mine):

Schumer turns female sexual prerogative into shamelessness

the degradation of sex — and women

uses sex to promote feminist permissiveness.

She enjoys a sexual license

Amy brazenly practices the same sexual habits as men

. . . old-fashioned sense of shame,

It’s merely brazen, like Lena Dunham’s HBO series, Girls (also about a promiscuous female writer

Schumer’s film can be seen to distort human relations into smut.

This is not just disrespectful, it confirms Schumer’s project of cultural takeover,

she aims to acquire cultural power

Schumer disguises a noxious cultural agenda as personal fiat. She’s a comedy demagogue who okays modern misbehavior yet blatantly revels in PC notions about feminism, abortion, and other hot-button topics


I should add that not all conservative publications felt so threatened. Joe Morgenstern at the Wall Street Journal gave the movie a warm review. Breitbart saw the movie’s essential conservatism (“The anti-slut message is a healthy one”) and praised Schumer as a comic actor. Still, the National Review piece seems emblematic of something broader in the cultural conservative camp: a taboo-like reaction to female sexuality.

Originally posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

Recently there’s been heightened attention to calling out microaggressions and giving trigger warnings. I recently speculated that the loudest voices making these demands come from people in categories that have gained in power but are still not dominant, notably women at elite universities. What they’re saying in part is, “We don’t have to take this shit anymore.” Or as Bradley Campbell and Jason Manning put it in a recent piece in The Chronicle:

…offenses against historically disadvantaged social groups have become more taboo precisely because different groups are now more equal than in the past.

It’s nice to have one’s hunches seconded by scholars who have given the issue much more thought.

Campbell and Manning make the context even broader. The new “plague of hypersensitivity” (as sociologist Todd Gitlin called it) isn’t just about a shift in power, but a wider cultural transformation from a “culture of dignity” to a “culture of victimhood.” More specifically, the aspect of culture they are talking about is social control. How do you get other people to stop doing things you don’t want them to do – or not do them in the first place?

In a “culture of honor,” you take direct action against the offender.  Where you stand in society – the rights and privileges that others accord you – is all about personal reputation (at least for men). “One must respond aggressively to insults, aggressions, and challenges or lose honor.” The culture of honor arises where the state is weak or is concerned with justice only for some (the elite). So the person whose reputation and honor are at stake must rely on his own devices (devices like duelling pistols).  Or in his pursuit of personal justice, he may enlist the aid of kin or a personalized state-substitute like Don Corleone.

In more evolved societies with a more extensive state, honor gives way to “dignity.”

The prevailing culture in the modern West is one whose moral code is nearly the exact opposite of that of an honor culture. Rather than honor, a status based primarily on public opinion, people are said to have dignity, a kind of inherent worth that cannot be alienated by others. Dignity exists independently of what others think, so a culture of dignity is one in which public reputation is less important. Insults might provoke offense, but they no longer have the same importance as a way of establishing or destroying a reputation for bravery. It is even commendable to have “thick skin” that allows one to shrug off slights and even serious insults, and in a dignity-based society parents might teach children some version of “sticks and stones may break my bones, but words will never hurt me” – an idea that would be alien in a culture of honor.

The new “culture of victimhood” has a different goal – cultural change. Culture is, after all, a set of ideas that is shared, usually so widely shared as to be taken for granted. The microaggression debate is about insult, and one of the crucial cultural ideas at stake is how the insulted person should react. In the culture of honor, he must seek personal retribution. In doing so, of course, he is admitting that the insult did in fact sting. The culture of dignity also focuses on the character of offended people, but here they must pretend that the insult had no personal impact. They must maintain a Jackie-Robinson-like stoicism even in the face of gross insults and hope that others will rise to their defense. For smaller insults, say Campbell and Manning, the dignity culture “would likely counsel either confronting the offender directly to discuss the issue,” which still keeps things at a personal level, “or better yet, ignoring the remarks altogether.”

In the culture of victimhood, the victim’s goal is to make the personal political.  “It’s not just about me…”  Victims and their supporters are moral entrepreneurs. They want to change the norms so that insults and injustices once deemed minor are now seen as deviant. They want to define deviance up.  That, for example, is the primary point of efforts like the Microaggressions Project, which describes microaggressions in exactly these terms, saying that microaggression “reminds us of the ways in which we and people like us continue to be excluded and oppressed” (my emphasis).


So, what we are seeing may be a conflict between two cultures of social control: dignity and victimhood. It’s not clear how it will develop. I would expect that those who enjoy the benefits of the status quo and none of its drawbacks will be most likely to resist the change demanded by a culture of victimhood. It may depend on whether shifts in the distribution of social power continue to give previously more marginalized groups a louder and louder voice.

Cross-posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

I was on jury duty this week, and the greatest challenge for me was the “David Brooks temptation” to use the experience to expound on the differences in generations and the great changes in culture and character that technology and history have brought.

I did my first tour of duty in the 1970s. Back then you were called for two weeks. Even if you served on a jury, after that trial ended, you went back to the main jury room. If you were lucky, you might be released after a week and a half. Now it’s two days.

What struck me most this time was the atmosphere in the main room. Now, nobody talks. You’re in a large room with maybe two hundred people, and it’s quieter than a library. Some are reading newspapers or books, but most are on their laptops, tablets, and phones. In the 1970s, it wasn’t just that there was no wi-fi; there was no air conditioning. Remember “12 Angry Men”? We’re in the same building. Then, you tried to find others to talk to. Now you try to find a seat near an electric outlet to connect your charger.

I started to feel nostalgic for the old system. People nowadays – all in their own narrow, solipsistic worlds, nearly incapable of ordinary face-to-face sociability. And so on.

But the explanation was much simpler. It was the two-day hitch. In the old system, social ties didn’t grow from strangers seeking out others in the main jury room. It happened when you went to a courtroom for voir dire. You were called down in groups of forty. The judge sketched out the case, and the lawyers interviewed the prospective jurors. From their questions, you learned more about the case, and you learned about your fellow jurors – neighborhood, occupation, family, education, hobbies. You heard what crimes they’d been a victim of. When the judge called a break for bathroom or lunch or some legal matter, you could find the people you had something in common with. And you could talk with anyone about the case, trying to guess what the trial would bring. If you weren’t selected for the jury, you went back to the main jury room, and you continued the conversations there. You formed a social circle that others could join.

This time, on my first day, there were only two calls for voir dire, the clerk as bingo-master spinning the drum with the name cards and calling out the names one by one. My second day, there were no calls. And that was it. I went home having had no conversations at all with any of my fellow jurors. (A woman seated behind me did say, “Can you watch my laptop for a second?” when she went to the bathroom, but I don’t count that as a conversation.)

I would love to have written 800 words here on how New York character had changed since the 1970s. No more schmoozing. Instead we have iPads and iPhones and MacBooks destroying New York jury room culture – Apple taking over the Big Apple. People unable or afraid to talk to one another because of some subtle shift in our morals and manners. Maybe I’d even go for the full Brooks and add a few paragraphs telling you what’s really important in life.

But it was really a change in the structure. New York expanded the jury pool by eliminating most exemptions. Doctors, lawyers, politicians, judges – they all have to show up. As a result, jury service is two days instead of two weeks, and if you actually are called to a trial, once you are rejected for the jury or after the trial is over, you go home.

The old system was sort of like the pre-all-volunteer army. You get called up, and you’re thrown together with many kinds of people you’d never otherwise meet. It takes a chunk of time out of your life, but you wind up with some good stories to tell. Maybe we’ve lost something. But if we have lost valuable experiences, it’s because of a change in the rules, in the structure of how the institution is run, not because of a change in our culture and character.

Cross-posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

The governors of Virginia and South Carolina have now taken stands against the Confederate battle flag. So have honchos at Wal*Mart, Sears, Target, and NASCAR.

NASCAR! How could this cascade of reversals have happened so rapidly? Did these important people wake up one morning this week and say to themselves, “Gee, I never realized that there was anything racist about the Confederacy, and never realized that there was anything wrong with racism, till that kid killed nine Black people in a church”?

My guess is that what’s going on is not a sudden enlightenment or even much of a change in views about the flag. To me it looks more like the process of “pluralistic ignorance.” What these people changed was not their ideas about the Confederacy or racism but their ideas about other people’s ideas about these matters. With pluralistic ignorance (a term coined by Floyd Allport nearly a century ago), everyone wants X but thinks that nobody else does. Then some outside factor makes it possible for people to choose X, and everyone does. Everyone is surprised – “Gee, I thought all you guys wanted Y, not X.” It looks like a rapid change in opinion, but it’s not.

A few years ago in Ireland and elsewhere in Europe, people were surprised at the success of new laws banning smoking in pubs and restaurants. “Oh, the smokers will never stand for it.” But it turned out that the smokers, too, were quite happy to have rooms with breathable air. It’s just that before the laws were passed, nobody knew that’s how other people felt because those people kept smoking.

The same thing happened when New York City passed a pooper-scooper law. “The law is unenforceable,” people said. “Cops will never see the actual violation, only its aftermath. And do you really think that those selfish New Yorkers will sacrifice their own convenience for some vague public good?” But the law was remarkably effective. As I said in this post from 2009:

Even before the new law, dog owners had probably thought that cleaning up after their dogs was the right thing to do, but since everyone else was leaving the stuff on the sidewalk, nobody wanted to be the only schmuck in New York to be picking up dog shit. In the same way that the no-smoking laws worked because smokers wanted to quit, the dog law in New York worked because dog owners really did agree that they should be cleaning up after their dogs. But prior to the law, none of them would speak or act on that idea.

In South Carolina and Georgia and Bentonville, Arkansas, and elsewhere, the governors and the CEOs surely knew that the Confederacy was based on racist slavery; they just rarely thought about it. And if the matter did come up, as with the recent Supreme Court decision about license plates, they probably assumed that most of their constituents and customers were happy with the flag and that the anti-flaggers were a cranky minority.

Judging from the support for letting that flag fade into history, it looks as though for a while now many Southerners may have been uncomfortable with the blatant racism of the Confederacy and the post-Reconstruction era. But because nobody voiced that discomfort, everyone thought that other Southerners still clung to the old mentality. The murders in the Charleston church and the subsequent discussions about retiring the flag may have allowed Southerners to discover that their neighbors shared their misgivings about the old racism. And it allowed the retail giants to see that they weren’t going to lose a lot of money by not stocking the flag.

Cross-posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.