public opinion

The partial U.S. map below shows the proportion of the population that was identified as enslaved in the 1860 census.  County by county, it reveals where the economy was most dominated by slavery.

A new paper by Avidit Acharya, Matthew Blackwell, and Maya Sen finds that the proportion of enslaved residents in 1860 — 153 years ago — predicts race-related beliefs today.  The higher the share of a county’s population that was enslaved, the less likely its contemporary white residents are to identify as Democrats or to support affirmative action, and the more likely they are to express negative beliefs about black people.

Acharya and colleagues don’t stop there.  They try to figure out why.  They consider a range of possibilities, including contemporary demographics, the possibility of “racial threat” (the idea that high numbers of black people make whites uneasy), urban-rural differences, the destruction and disintegration caused by the Civil War, and more.  Controlling for all of these, the authors conclude that the results are still partly explained by a simple phenomenon: parents teaching their children.  The bias of Southern whites during slavery has been passed down intergenerationally.

Cross-posted at Pacific Standard.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

A single event can take on great symbolic importance and change people’s perceptions of reality, especially when the media devote nearly constant attention to that event.  The big media story of the killing of Trayvon Martin and the trial of George Zimmerman probably does not change the objective economic, social, and political circumstances of Blacks and Whites in the U.S.  But it changed people’s perceptions of race relations.

A recent NBC/WSJ poll shows that between November of 2011 and July 2013, both Whites and Blacks became more pessimistic about race relations.

Since 1994, Americans had become increasingly sanguine about race relations.  The Obama victory in 2008 gave an added boost to that trend.  In the month of Obama’s first inauguration, nearly two-thirds of Blacks and four-fifths of Whites saw race relations as Good or Very Good (here’s the original data). But now, at least for the moment, the percentages in the most recent poll are very close to what they were nearly 20 years ago.

The change was predictable, given the obsessive media coverage of the case and the dominant reactions to it.  On one side, the story was that White people were shooting innocent Black people and getting away with it.  The opposing story was that even harmless looking Blacks might unleash potentially fatal assaults on Whites who are merely trying to protect their communities.  In both versions, members of one race are out to kill members of the other — not a happy picture of relations between the races.

My guess is that the Zimmerman/Martin effect will have a short life, perhaps more so for Whites than for Blacks. In a few months, some will ascend from the depths of pessimism. Consider that after the verdict in Florida there were no major riots, no burning of neighborhoods to leave permanent scars — just rallies that were, for the most part, peaceful outcries of anger and anguish.  I also doubt, however, that we will see the optimism of 2009 for a long while, especially if employment remains at its current dismal levels.

Cross-posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

Here’s an interesting new wrinkle in the data on support for same sex marriage.  According to Gallup, 53% of Americans now favor such marriages, but we don’t necessarily think other people do.  Overall, Americans, on average, think that 63% of their fellow citizens oppose same sex marriage; in fact, 45% do.  That’s an over-estimate of 18 percentage points!

Interestingly, Americans of all stripes — Democrat and Republican, liberal and conservative, old and young — underestimate support for same sex marriage.  Liberals come the closest, thinking that 48% approve; conservatives are the farthest off, thinking that only 16% do.
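For concreteness, the percentage-point gaps described above can be tallied in a quick sketch (the figures are the Gallup numbers cited in this post; the variable names are mine):

```python
# Perceived vs. actual opinion on same-sex marriage, using the Gallup
# figures cited above (all values are percentages of Americans).
actual_support = 53        # actually favor same-sex marriage
actual_opposition = 45     # actually oppose it
perceived_opposition = 63  # Americans' average guess at opposition

# Americans overestimate opposition:
overestimate = perceived_opposition - actual_opposition
print(overestimate)  # 18 percentage points

# Every group underestimates support; liberals are closest,
# conservatives farthest off:
perceived_support = {"liberals": 48, "conservatives": 16}
for group, guess in perceived_support.items():
    print(group, "underestimate support by",
          actual_support - guess, "percentage points")
```

Liberals fall short of the true figure by 5 points, conservatives by 37 — which is why the post singles them out as the closest and farthest off.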

This data resonates with the recent finding that both Democratic and Republican politicians underestimate their constituents’ progressiveness.  I suspect that these misconceptions may make politicians wary about pressing for progressive policies; I wonder how similar misconceptions among the voting public might shape the pace and trajectory of social change.

h/t @tylerkingkade. Cross-posted at Pacific Standard.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Cross-posted at PolicyMic, Huffington Post, BlogHer, and Pacific Standard.

Peg Streep is writing a book about the Millennial generation, and she routinely sprinkles great data into her posts at Psychology Today.

Recently she linked to a study by Net Impact that surveyed currently-enrolled college students and college graduates across three generations: Millennials, Gen Xers, and Baby Boomers.  The questions focused on life goals and work priorities.  They found significant differences between students and college grads, as well as interesting generational differences.

First, students generally make higher demands on the world; they are as likely or more likely than workers to say that a wide range of accomplishments are “important or essential to [their] happiness”:

In particular, students are more likely than workers to say it is important or essential to have a prestigious career with which they can make an impact.  More than a third think that this will happen within the next five years:

Wealth is less important to students than prestige and impact.  Over a third say they would take a significant pay cut to work for a company committed to corporate social responsibility (CSR), almost half for a company that makes a positive social or environmental impact, and over half to align their values with their job:

Students stand out, then, in both the desire to be personally successful and to make a positive contribution to society.

At the same time, they’re cynical about other people’s priorities.  Students and Millennials are far more likely than Gen Xers or Boomers to think that “people are just looking out for themselves.”

This data rings true to this college professor.  Despite the recession, the students at my (rather elite, private, liberal arts) school surprise me with their high professional expectations (thinking that they should be wildly successful, even if they’re worried they won’t be) and their desire to change the world (many strongly identify as progressives who are concerned with social inequalities and political corruption).

Some call this entitlement, but I think it’s at least as true to say that today’s college youth (the self-esteem generation) have been promised these things.  They’ve always been told to dream big, and so they do.  Unfortunately, I’m afraid that we’ve sold our young people a bill of goods.  Their high expectations sound like a recipe for disappointment, even for my privileged population, especially if they expect it to happen before they exit their twenties!

Alternatively, what we’re seeing is the idealism of youth.  It will be interesting to see if they downshift their expectations once they get into the workforce.  Net Impact doesn’t address whether these are largely generational or age differences.  It’s probably a combination of both.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Cross-posted at Montclair SocioBlog.

Does “the abortion culture” cause infanticide?  That is, does legalizing the aborting of a fetus in the womb create a cultural, moral climate where people feel free to kill newborn babies?

It’s not a new argument.  I recall a 1998 Peggy Noonan op-ed in the Times, “Abortion’s Children,” arguing that kids who grew up in the abortion culture are “confused and morally dulled.”*  Earlier this week, USA Today ran an op-ed by Mark Rienzi repeating this argument in connection with the Gosnell murder conviction.

Rienzi argues that the problem is not one depraved doctor.  As the subhead says:

The killers are not who you think. They’re moms.

Worse, he warns, infanticide has skyrocketed.

While murder rates for almost every group in society have plummeted in recent decades, there’s one group where murder rates have doubled, according to CDC and National Center for Health Statistics data — babies less than a year old.

Really? The FBI’s Uniform Crime Reports paint a different picture.

Many of these victims were not newborns, and Rienzi is talking about day-of-birth homicides — the type of killing Dr. Gosnell was convicted of, a substitute for abortion.  Most of these, as Rienzi says, are committed not by doctors but by mothers.  I assume that the method in most of these cases is smothering.  These deaths show an even steeper decline since 1998.

Where did Rienzi get his data that rates had doubled?  By going back to 1950.

The data on infanticide fit with his idea that legalizing abortion increased rates of infanticide.  The rate rises after Roe v. Wade (1973) and continues upward till 2000.

But that hardly settles the issue. Yes, as Rienzi says, “The law can be a potent moral teacher.”  But many other factors could have been affecting the increase in infanticide, factors much closer to the actual event — the mother’s age, education, economic and family circumstances, blood lead levels, etc.

If Roe changed the culture, then that change should be reflected not just in the very small number of infanticides but in attitudes in the general population.  Unfortunately, the GSS did not ask about abortion till 1977, but since that year, attitudes on abortion have changed very little.   Nor does this measure of “abortion culture” have any relation to rates of infanticide.

Moreover, if there is a relation between infanticide and general attitudes about abortion, then we would expect to see higher rates of infanticide in areas where attitudes on abortion are more tolerant.

The South and Midwest are most strongly anti-abortion, the West Coast and Northeast the most liberal.  So, do these cultural difference affect rates of infanticide?

Well, yes, but it turns out the actual rates of infanticide are precisely the opposite of what the cultural explanation would predict.  The data instead support a different explanation of infanticide: Some state laws make it harder for a woman to terminate an unwanted pregnancy.  Under those conditions, more women will resort to infanticide.  By contrast, where abortion is safe, legal, and available, women will terminate unwanted pregnancies well before parturition.

The absolutist pro-lifers will dismiss the data by insisting that there is really no difference between abortion and infanticide and that infanticide is just a very late-term abortion. As Rienzi puts it:

As a society, we could agree that there really is little difference between killing a being inside and outside the womb.

In fact, very few Americans agree with this proposition. Instead, they do distinguish between a cluster of a few fertilized cells and a newborn baby. I know of no polls that ask about infanticide, but I would guess that a large majority would say that it is wrong under all circumstances.  But only perhaps 20% of the population thinks that abortion is wrong under all circumstances.

Whether the acceptance of abortion in a society makes people “confused and morally dulled” depends on how you define and measure those concepts.  But the data do strongly suggest that whatever “the abortion culture” might be, it lowers the rate of infanticide rather than increasing it.

* I had trouble finding Noonan’s op-ed at the Times Website.  Fortunately, then-Rep. Talent (R-MO) entered it into the Congressional Record.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

Scholars suggest that studying abroad in a previously-colonized country may increase people’s cultural sensitivity and awareness of global inequality.  I investigated this hypothesis by interviewing college students: one group had studied abroad for a semester or more, the other had only traveled out of the country for vacation.

I asked both groups to view and analyze fashion photography that contrasted models with more humble images of residents of less developed countries.  I hoped people would point out how contrasting glamorous, thin, conventionally-attractive White models with “average” people from less-privileged countries served to heighten the status of the West and its representatives.  I saw this as a form of Western “slumming”: the practice of spending time in places or with people who are “below” you, out of curiosity or for fun or personal development.

My findings revealed that study abroad students think they’re more culturally competent but were, in fact, no more likely than people who had never studied abroad to express concern about the exploitation of previously colonized people in ads like these.

The majority of students from both groups – those who’d studied abroad and those who hadn’t — demonstrated a distinct lack of concern.  They unreflexively “Othered” the people in these images; that is, they affirmed the locals’ marginalized group status and labeled them as being Other, belonging outside of our normative Western structure.

The majority also expressed approval of the aesthetics of the ads without irony. For example, one student said: “I think it works because it’s this edgy, culturally stimulating, and aesthetically pleasing ad.” When asked about the art director’s intentions, another student commented: “I don’t know. Just like ordinary people next to someone who’s on top of their fashion game.”

Only a select few students observed the use of Othering in the images. When asked about the art director’s intentions for one image, a student replied: “I think it’s to contrast the model with the everyday life of these people…  (it) feels more like an image of people of color being an accessory.” Interestingly, and in contrast to my hypothesis, noticing this theme did not correlate with having studied abroad.

My findings suggest, then, that living abroad for a semester or more in a previously colonized country does not necessarily contribute to the detection of global inequality in fashion photography.

Erica Ales is a senior Sociology major at Occidental College in Los Angeles, California.

Research has shown that college students largely think that asking for sexual consent — “Do you want to have sex?” — “ruins the mood.”  This is partly because it violates their sexual script, the norms and expectations that guide sexual encounters.

If explicit consent violates the sexual script, then students are left trying to discern consent from more subtle and implicit verbal and non-verbal cues.  I did a research project to determine how they do this, interviewing 19 college students about their perceptions of sexual consent in popular television programs. 

I discovered that students often interpreted the same scenes dramatically differently. For example, I showed them this scene from The Vampire Diaries:

Eleven of my 19 respondents brought up the issue of verbal consent.  Five said the verbal interchange in the scene indicated consent; six said it did not.  Their contrasting perceptions focused on the male character’s statement, “Let’s get out of here.”  The five students who saw the scene as consensual were inclined to classify the declaration, “Let’s get out of here” as the moment where verbal consent is given.  For example, Hannah said:

…like I mean he doesn’t outright say “do you wanna have sex” but he says “do you want to get out of here” and she’s like “yes.”  That’s like the only one where there’s like an actual yes! [giggling] I mean like a verbal yes.

Hannah said the scene indicated consent because she equated “getting out of here” with sex.

In contrast, Natalie and five others disagreed with Hannah and those who considered the verbal exchange between Tyler and Caroline to be a form of verbal consent:

No, I would say, there was like no talk of consent, really… In the Vampire Diaries one, by him saying like, “let’s get out of here,” there might be an assumption associated with that and then her saying, “Okay,” like could be consent, quote, unquote.  But, I don’t really think that qualifies, either.

Natalie believed there was a correct way to obtain verbal consent.  When I asked her what would make this scene consensual, Natalie replied, “Basically saying ‘Do you want to, do you want to go through with this?’—something like that.”  Clearly, Natalie viewed consent as requiring a different, more explicit kind of verbal question.

The differences in these responses to The Vampire Diaries scene are striking. While verbal consent is often held up as the gold standard, I found disagreement as to exactly which statements constitute consent.  This disagreement sets the stage for serious miscommunication about students’ sexual intentions.  Some students interpret a phrase such as “Do you want to leave?” as “Do you want to leave this party and have sex at my house?” while other students believe that only a phrase such as “Do you agree to have sex with me?” communicates sexual consent.

Nona Gronert will graduate from Occidental College this May with a degree in Sociology and Spanish Literary Studies.  She aspires to become a professor of Sociology.

A guiding principle driving the sociological understanding and analysis of deviance is the recognition that behaviors themselves are not inherently deviant; rather, it is the social perceptions of and reactions to a behavior that make it deviant.  This explains why opinions and attitudes towards different forms of supposedly deviant behaviors regularly change.  A notable change in one type of deviance, marijuana use, is revealed in a report compiled by the Pew Research Center.

According to David F. Musto, a century ago marijuana was an obscure drug used almost exclusively by Hispanics in the Southwest.  Its limited association with this ethnic group is largely why marijuana initially became illegal.  With the onset of the Great Depression, both federal and state governments sought ways to expel nonwhites from the country as their cheap labor was no longer necessary.  Making one of this group’s pastimes illegal was a way to stigmatize Hispanics and rally public support for a population transfer.  With a populace stirred into a moral panic by racism, nativism and propaganda movies like Reefer Madness, there was little resistance to the 1937 Marijuana Tax Act, which effectively made cannabis illegal.

In the 1960s marijuana experienced a cultural comeback when it became the drug of choice for baby-boomers who saw the drug as a safer alternative to the alcohol and methamphetamine that plagued their parents’ generation.  Marijuana was even legal for a brief period after the Supreme Court found the 1937 marijuana act unconstitutional.  However, because of widespread concern that drugs were corrupting the moral fabric of America’s youth, in 1970 marijuana was one of many drugs outlawed by President Nixon’s Comprehensive Drug Abuse Prevention and Control Act.  Interestingly, marijuana was the only drug targeted by this act that did not include a medical exception.  In the 1980s, President Reagan increased penalties for breaking drug laws, and subsequently the prison population in the United States swelled to a size seemingly unimaginable in a wealthy democracy.

The graph below, from Pew’s report, captures how federal action came during times of heightened public support for making marijuana illegal.

Yet the graph also captures how, in the early 1990s, support for the legalization of marijuana started to increase.  According to the Pew report, around this time California pioneered using the drug for medicinal purposes; seventeen other states (including D.C.) have since followed California’s lead, while six other states decriminalized possession of small amounts.  In 2012, citizens in Colorado and Washington voted to completely legalize marijuana despite federal law.  This relaxing, and even elimination, of marijuana laws mirrors increasingly favorable opinions of marijuana and growing support for its legalization.

It is difficult to tell if legalization, medical or otherwise, drives public opinion or vice versa.  Regardless, an especially noteworthy finding of the Pew report is that, right now, more than half of U.S. citizens think marijuana should be legal.  Sociologists take special interest when trend lines cross in public opinion polls because that threshold is especially important in a majority-rule democracy; and the Pew report finds that, for the first time in the history of the poll, a majority of U.S. citizens support marijuana legalization.

These historical data on opinions about marijuana reveal how definitions of deviance, and in many cases the ways those definitions are incorporated into the legal system, grow out of shared social perceptions.  Although there have been some notable genetic and cultivation advances, marijuana itself has changed relatively little in the last forty years; yet our perceptions of the drug (and therefore definitions of its use as deviant) continue to evolve, and we can expect opinions, and therefore our laws, to change further in the future.

Jason Eastman is an Assistant Professor of Sociology at Coastal Carolina University who researches how culture and identity influence social inequalities.