Research on poverty often focuses on economic differences between married and unmarried people, especially mothers. However, in a recent New York Times op-ed, David Brady, Ryan M. Finnigan, and Sabine Hübgen push back against common cultural arguments, like criticism of having children outside of marriage, that blame single mothers for poverty. Instead, the authors ask why the United States responds to economic struggle with stigmatization and punishment rather than assistance.

Photo Credit: Taymaz Valley, Flickr CC

The op-ed follows their recent American Journal of Sociology article, which used data from the Luxembourg Income Study to find that single motherhood is both rarer and less consequential for the poverty rate than the popular imagination would suggest. The op-ed reads,

“If single motherhood in the United States were in the middle of the pack among rich democracies instead of the third highest, poverty among working-age households would be less than 1 percentage point lower — 15.4 percent instead of 16.1 percent. If we returned to the 1970 share of single motherhood, poverty would decline a tiny amount — from 16.1 percent to 15.98. If, magically, there were no single mothers in the United States, the poverty rate would still be 14.8 percent.”

Rather than treating single motherhood as a factor that increases poverty, Brady, Finnigan, and Hübgen argue that political choices in the United States punish single motherhood — as well as the other poverty risk factors of unemployment, low levels of education, and forming households at young ages — more than other rich democracies do.

“The reality is we have unusually high poverty because we have unusually high penalties for all four of these risk factors. For example, if you lack a high school degree in the United States, it increases the probability of your being in poverty by 16.4 percent. In the 28 other rich democracies, a lack of education increases the probability of poverty by less than 5 percent on average. No other country penalizes the less educated nearly as much as we do.”

Photo by Ben Sutherland, Flickr CC

From the Super Bowl to March Madness, sporting celebrations raise questions about rioting every year. After their Super Bowl victory in February, Eagles fans took to the streets, looting and toppling light poles. A recent article in the Washington Post delves into the sociological and psychological explanations for why fans are often violent and destructive after a massive victory.

Sociologist Jerry M. Lewis has studied fan violence for decades, looking at statistics on sports fan riots since the 1960s. He notes that fan violence in the United States usually consists of people destroying inanimate objects, while in other countries, especially in Europe, violence is directed toward opposing fan bases. Lewis explains that sports rioting in the U.S. almost always happens after a major victory rather than a loss, as a form of identification with the victorious team.

In the U.S., sports rioters tend to be young, white males. These passionate fans react excitedly to their favorite teams, reveling in victory and adrenaline, which in this case resulted in the destruction of city property. Lewis expands,

“They can’t throw a football 60 yards like the quarterback can, but they can throw a rock through the window or pull down a light pole. To them, it becomes their feat of strength and skill.”

Lewis draws an interesting contrast between public perceptions of sports rioters and perceptions of those who protest or riot in response to social upheaval. Media and public understandings of riots seem to depend on who participates, and what gets described as a riot is often defined along racial lines. Because of this, sports fans can celebrate while they cause destruction, but protesters often bear the consequences.

Courtesy the Boston Public Library.

Primary season already feels interminable, and it looks like, among Republicans, Donald Trump is pulling ahead with wins in Nevada, South Carolina, and New Hampshire. The results are perplexing for our typical narratives about conservative politics for a number of reasons, but one of the most striking is that he appears to be doing pretty well with evangelical Christian voters, despite not being terribly religious himself (including a recent flub over “two Corinthians”).

Ted Cruz is a much more committed evangelical candidate. A recent piece in New Republic looks at “How Ted Cruz Lost the Evangelical Vote,” and draws on research from sociologist Lydia Bean on how a simple narrative about conservative religion and conservative politics doesn’t quite fit the reality of contemporary evangelicalism. According to the article:

Bean points out that evangelicals differ not only in their politics—with some identifying as more conservative and others as more moderate—but in their religiosity.

“Evangelicals who don’t go to church very much but identify as Christian, with Christian nationalistic rhetoric, but aren’t very well formed or advised by Christian community leaders—they’re going for Trump,” Bean says. “I think Ted Cruz is picking up the older, more observant people who are theologically and politically conservative, the people who actually go to church every week.” Rubio, meanwhile, “is picking up the younger, more cosmopolitan evangelicals…”

The relationship between religion and politics is complicated, just like any other ideological system. The most interesting sociological point in Bean’s research, though, is how different styles of practice within similar religious communities can teach people to look at politics and their choices in different ways.

Sociologists find that beliefs about global warming predict people’s temperature perceptions. Photo by Howard Ignatius via Flickr.

Winter is in full swing up here in Minneapolis, and with it comes the traditional chorus that “it isn’t that bad” just yet. However, new research shows complaining about the weather—the archetype of casual chatting—may be more than just small talk. The Washington Post reports on new research from Aaron McCright, Riley Dunlap, and Chenyang Xiao, which finds a significant relationship between political affiliation and perceptions of the weather. From the article:

The paper…examined people’s perceptions of the winter of 2012, which was anomalously warm. Comparing Gallup polling results from early March 2012 (just after the winter ended) with actual temperature data…the researchers found that “Democrats [were] more likely than Republicans to perceive local winter temperatures as warmer than usual”…beliefs about global warming also predicted temperature perceptions.

It may have been one of the warmest years on record, but this work shows that partisanship affected who actually felt warmer than usual. We’ve known about socialization for a long time—many researchers study how social groups teach people to act in certain ways—but this study is especially interesting because it shows how deeply political socialization can affect individuals. Later in the article Dunlap argues that “people have begun to filter their fundamental perceptions of what is going on…through a partisan frame.” Contrary to expectations, this also means firsthand experiences with extreme weather as the planet warms may not be enough to inspire widespread change for environmental protection. Looks like we’ll need more than small talk.

The Macy’s Thanksgiving Day parade has been held annually since 1924. Turns out some families’ holiday “rituals” are more common than you might think. Photo by Musicwala via Flickr.

Sociology loves making the familiar strange, and few events blend the familiar and the strange as artfully as holiday family gatherings. The Week recaps a classic sociological study of Thanksgiving celebrations by Melanie Wallendorf and Eric J. Arnould, which sheds light on just how common our “quirky” family rituals can be. A particularly juicy conclusion was that interview respondents didn’t realize their party quirks were actually ‘traditions’ happening year after year at gatherings across the U.S. According to the article:

…a society is not always the best judge of its own customs…The data analysis revealed some common events in the fieldnotes that people rarely remarked on in the interviews.

Common practices included “The Giving of The Job Advice,” “The Telling of Disaster Stories of Thanksgivings Past,” and the ever-popular “After-Dinner Stroll around the Neighborhood.” These customs remind us just how much we share at this time of year. Who knows? The next awkward family gathering just might be a new field site!
The “epidemic mindset” could be caused by the uncertainty of a global world. Photo by Dominique Faget/AFP for the Tico Times.

Though there is still much work to be done to curb cases of Ebola across Liberia, Guinea, and Sierra Leone, good news came this week as the World Health Organization declared Nigeria Ebola-free. Yet fear of the disease remains around the world as Americans and Europeans call for travel restrictions to limit further exposure. Why so much fear over a disease with so few cases beyond the hardest-hit countries?

The New York Times interviewed sociologist Claudine Burton-Jeangros, who points out that Ebola fears fit into larger narratives about our place in the world and modern life.

…the more we master the world through science and technology, the more frightened we are of those things we can’t control or understand. “We live in very secure societies and like to think we know what will happen tomorrow. There is no place in our rational and scientific world for the unknown. Objectively, the risks created by Ebola in Europe are very small,” said Ms. Burton-Jeangros, “but there is an uncertainty that creates fear.”

Since Ebola is only spread through the exchange of bodily fluids, the chances of an outbreak in the U.S. or Europe are very small. We’re not immune to fear, however, and the uncertainty of a global world creates new social supports for epidemics of anxiety. For more on the “epidemic mindset,” check out our roundup of research.


Young voters and people living in council areas with high unemployment were more likely to vote in favor of Scottish independence.  Photo by PressTV.

Despite preliminary polls showing the Scottish independence vote as too close to call, last week saw a decisive victory for keeping the nation part of the United Kingdom with a 10.6 percentage point lead. Now that the media has swung from predicting to explaining, The Guardian considers why the early polling was so far off the mark, pointing to early decisions for “no” among voters and anxiety over the economic impacts of independence.

Oxford sociologist Stephen Fisher weighed in on the post-vote analysis and pointed out two trends which help explain the outcome. First, economic concerns were closely related to decision patterns:

“…in all four councils won by Yes Scotland, unemployment rates are higher than the Scottish average… Better Together’s best results were in councils where unemployment rates were below the Scottish average.”

Second, despite widespread national conversation and high intentions to vote, actual turnout among “yes” voters wasn’t quite enough:

“Only in one of the four councils where yes came on top was turnout higher than the countrywide 84.6%. This indicates that the participation among groups that tend to historically vote less (or not at all), such as younger people, the unemployed and those living in more deprived areas, where yes was theoretically strongest, while far higher than normal, was not as high as expected.”

There is plenty more work to be done before we fully understand the outcome, but these preliminary findings remind us that the key challenge for any political movement is getting enough folks to move where and when it really counts.


But can He get you a job? Photo by David Woo via flickr CC.

It’s summer job hunt season. As a new batch of college grads looks for every edge on the market, sociologists have found a surprising barrier to getting hired: your religion. Vox and The Washington Post both picked up new research from Michael Wallace, Bradley R. E. Wright, and Allen Hyde, in which the authors distributed 3,200 resumes for job applications around two major southern U.S. cities (a follow-up to earlier work in New England). The resumes were designed to look like those of recent college graduates, and they were essentially identical except for the applicants’ membership in a particular campus religious group. The authors found that listing any kind of religious affiliation on a resume reduced the chances that an applicant would receive a callback. From Vox:

Wallace said he thinks the US has a “schizophrenic attitude” when it comes to religion. “On the one hand, we have a high tolerance of religious freedom and diversity, people are free to practice whatever religion they want,” he told me in an interview. “On the other hand, there are certain boundaries on where it can be practiced.”

While including a religious affiliation did reduce callbacks across the board, not every religious group faced the same barriers. Who faced the most hiring discrimination? According to the authors’ article:

In general, Muslims, pagans, and atheists suffered the highest levels of discriminatory treatment from employers, a fictitious religious group and Catholics experienced moderate levels, evangelical Christians encountered little, and Jews received no discernible discrimination.

These findings are consistent with other research and polling efforts to capture Islamophobia and anti-atheist attitudes in the United States, and they show that while employers may not enjoy religion in the workplace, we should also be concerned about which religious groups they will tolerate.

Courtesy the Boston Public Library.

It’s been a busy time for social facts on religion in American life. First, The Washington Post reported new data from the Pew Forum suggesting that more Americans would be willing to vote for an atheist president. While the original report noted that atheism is still a “top negative” for voters—with more respondents saying it would make them less likely to vote for a candidate than drug use, political inexperience, or an extramarital affair—there is still some optimism in the fact that this number has declined by 10% since 2007.

Second, a new report from the Public Religion Research Institute found that Americans are still over-reporting their church attendance, more so in phone than in online surveys. The Huffington Post hosted a roundtable on the issue, and a take in The Atlantic emphasized the political implications of this data—liberals are more likely to inflate their church attendance than conservatives, and this may be because of negative stereotypes that liberals are “anti-religion.”

In a journalistic trifecta, all three stories noted research from Minnesota sociologists Penny Edgell, Joseph Gerteis, and TSP’s own Doug Hartmann on the continued stigma faced by atheists in American culture. From The Atlantic:

When three University of Minnesota sociologists surveyed American religious attitudes in 2006, they found “not only that atheists are less accepted than other marginalized groups but also that attitudes toward them have not exhibited the marked increase in acceptance that has characterized views of other racial and religious minorities over the past forty years.” Americans are today more likely to say they would vote for a Muslim or a gay or lesbian for president than an atheist.

Edgell also discussed current trends in church attendance on The Huffington Post and updated her 2006 research in The Washington Post:

A 2006 study by University of Minnesota sociologist Penny Edgell found atheists were the most mistrusted minority in the U.S. Edgell said Tuesday that an updated study based on a 2014 online survey would be released soon. Preliminary results show the mistrust meter hasn’t budged.

Despite an inclusive trend in what Americans say they look for in a candidate, religious identities are still an important marker of who can lead the flock(s).

For more on the cultural factors that may be driving these trends, check out this classic TSP feature: The Social Functions of Religion in American Political Culture.

Photo by Kristine Lewis via flickr.com.

For many, the “American Dream” means owning a comfortable home in a nice neighborhood, and that idea brings a certain Mellencamp tune to mind.

The song nods to a deeper point: the history of American housing policy from the New Deal and the G.I. Bill onward was often defined by who couldn’t get a little pink house. In fact, racial biases among policymakers and bureaucrats made it difficult or impossible for minorities to get support for housing in white neighborhoods. (For a great account of this history, see Ira Katznelson’s book When Affirmative Action Was White, or his recent blog post over at the Scholars Strategy Network.)

Today’s housing policies may be flipping the script on this story, but not necessarily in a good way.

The Atlantic Cities reports new research from NYU sociologist Jacob Faber on the 2006 housing bubble that preceded the massive economic crash and the kickoff of the U.S. “Great Recession” in 2008. It turns out that during this bubble, in addition to denying home loans to racial minority groups, banks were also targeting minority groups for lower-quality loans. The article reports:

Black and Hispanic families making more than $200,000 a year were more likely on average to be given a subprime loan than a white family making less than $30,000 a year… blacks were 2.8 times more likely to be denied for a loan, and Latinos were two times more likely. When they were approved, blacks and Latinos were 2.4 times more likely to receive a subprime loan than white applicants.

Faber adds that the trend doesn’t just deny support to these minority groups; it actually ignores their financial successes.

…this data offers another illustration that middle-class blacks have often not been able to leverage their income status for the same benefits as middle-class whites.

Ain’t that America?