Rachel E. Dwyer, Randy Hodson, and Laura McCloud, “Gender, Debt, and Dropping Out of College,” Gender & Society, 2013

College attendance, access to loans, and higher education are all gendered experiences shaped by inequality—and so is the significant debt that often accompanies college. In a recent article, Rachel E. Dwyer, Randy Hodson, and Laura McCloud (Gender & Society, February 2013) explore how debt influences dropout rates and how men and women weigh debt and dropping out differently.

The authors find that men are less likely to take out student loans and that men drop out of college at lower levels of debt than women. The authors explain these findings by examining the effects of gendered occupational segregation and the gender pay gap. Because women and men face different labor market opportunities, their assessments of whether a college degree is worth the debt also differ.

When it comes to jobs that do not require a college degree, women and men are segregated into different types of work and men make significantly more money than women. For example, female dropouts tend to work in service and clerical jobs, while male dropouts work in higher-paid manufacturing, construction, and transportation positions. The consequences of dropping out of college, then, are greater for women, while it’s a more viable option for men to drop out before acquiring excessive debt.

With a college degree, men and women work more similar jobs and have more similar incomes. Still, even if they stay in college and graduate, women are less able to pay back student loans and get ahead because of the wage gap.

In nearly half of all U.S. states, it is a felony for HIV-positive people to have sex without disclosing their status to their partners. In some places, this law, meant to promote public health, has become a tool of social control. Those who have—or are suspected of having—HIV or AIDS are essentially kept under surveillance and can be criminally sanctioned for various violations.

Trevor Hoppe (Social Problems, February 2013) interviewed 25 health officials responsible for managing “health threat” cases in Michigan, where the laws are particularly stringent. When new HIV-positive individuals are identified, officials do extensive contact tracing. While surveillance technologies are officially about disease prevention, they are also used to aid law enforcement and to regulate the client’s sexual practices. If an individual is labeled a “health threat,” they may be forced to undergo testing, counseling, or treatment, or may be quarantined. HIV-positive individuals may not be allowed to have any unprotected sex, even if they have disclosed their status to their partner (and if they test positive for a secondary STI, that is taken as evidence of unprotected sex). The law also treats all types of sex as equally risky, criminalizing even those sexual acts that carry no risk of transmission.

The criminal punishment for non-disclosure also provides impetus for local rumor mills, often setting in motion a “witch hunt.” Community members can file confidential third-party reports accusing individuals they suspect of being HIV positive of not disclosing their status. These accusations often target already-stigmatized individuals and may be false, but they set investigations in motion.

The additional stigma and social costs attached to an HIV diagnosis in states with such legislation may now be reducing people’s willingness to be tested for STIs at all, thus rendering a public health effort bad for public health.

A lot of 2008 election analysis focused on prejudice and race—would white Americans vote for a black president? In his recent Public Opinion Quarterly piece, Seth Goldman turns this question around to ask how the massive reach of the Obama campaign affected racial prejudice. He shows that, in just six months, the “Obama Effect” reduced racial prejudice at a rate five times faster than the average drop in racism over the entire previous twenty years. Because the same people were polled at various times during the Obama campaign, Goldman was able to measure individual- rather than group-level changes.

Unexpectedly, this effect was strongest among McCain supporters, especially those who watched political television shows. The effect was even stronger in states where the Obama campaign aired an influx of television advertisements. Watching TV didn’t change Republicans’ political views or swing their vote. Instead, seeing Obama challenged their expectations of black Americans by offering a positive image and countering stereotypes. Television is where media acts as a point of “virtual” contact between racial groups—and as Goldman argues, it can reduce prejudice as effectively as a face-to-face encounter.

On July 7, 2005, four British Muslim young men from the Leeds area detonated bombs on the London transportation system, killing over fifty people. In the wake of these 7/7 bombings, politicians and academics worried that incidents of racism and Islamophobia against British South Asians perceived as Muslim would dramatically increase. Demonstrating a commonsense yet novel methodology, Yasmin Hussain and Paul Bagguley (Ethnic and Racial Studies, January 2013) interviewed forty British Pakistani Muslims to gauge post-7/7 racist or Islamophobic incidents, rather than replicate social science research that measures white, non-Muslim respondents’ changes in attitudes toward Muslims.

Among the findings, Hussain and Bagguley report that instead of outright violent incidents, most manifestations of racism and Islamophobia were much more subtle and patterned, experienced as “funny looks” from (mainly white) non-Muslim strangers. “Funny” here carries its British slang connotation of peculiar or slightly hostile. These looks were aimed particularly at young South Asians who were recognizably Muslim because they wore more traditional forms of dress. Women with headscarves disproportionately experienced funny looks, although respondents of all genders drew these looks if they were carrying a bag or backpack in public.

Responding with increased self-policing, many young Muslims said they’d become more intentional about when and where they travelled in order to insulate themselves from hostility and potential violence. Some even stopped wearing traditional dress. Other respondents refused to change their behavior, an act of resistance and an assertion of their identity as British Muslims. Hussain and Bagguley’s study reminds us that racism and prejudice are often not experienced directly through verbal or physical attacks but rather manifest in racial micro-aggressions that are difficult to quantify.

The hunt for “pink Viagra”—a medical solution to women’s so-called sexual dysfunction, identified as an official disorder in 1999—has so far proven fruitless. Sociologists Cristalle Pronier and Elizabeth Monk-Turner suggest in the Journal of Gender Studies that we stop looking. Instead, we need to consider the relational aspects of sex that many women require for satisfaction.

After surveying more than 300 female students, staff, and faculty in a university community, Pronier and Monk-Turner found that social factors such as feeling intimacy, sexual agency, emotional closeness, and low levels of stress were key to women’s self-reported sexual satisfaction. Contrary to the pharmaceutical mantra “a pill for every ill,” these researchers believe female friskiness (or at least arousal) has fairly little to do with rerouting blood flow.

The zoo: a chance to leave the over-regulated world of concrete and bureaucracy behind and reinvigorate the spirit through witnessing exotic animals in their natural splendor. However, as David Grazian reminds us in The Sociological Quarterly, it takes a lot of planning to produce the “natural”.

Drawing on three years and over 500 hours of volunteering at two metropolitan zoos, Grazian provides detailed insights into the tensions negotiated daily by the “nature makers” charged with designing displays that fit how visitors imagine jungles, grasslands, and other untamed settings. This involves creating landscapes that imitate the flora of faraway lands, while simultaneously enclosing the animals and separating them from human visitors… all without making the zoo look too much like a prison. Audiences must also be convinced that the animals’ activities are unaltered by captivity, even as the taboo—sex, killing, and defecating—is censored. Because, hey, even being a wild animal is no excuse for poor manners.

So next time you are strolling through your local big cat house, take some time to think about the constant planning and negotiation necessary to create an experience that is wild but not too wild, dangerous but not too dangerous, cute but not too cute, educational but not too educational, civilized but not too civilized, and most important of all, “natural”.

Even alternative media reporting on the housing crisis are using mainstream ways of talking about the problem. While you’d expect publications like BE, The Root, and Colorlines to be more radical (alternatives to, say, Forbes), they instead stick with “neoliberal” and “postracial” themes. That is, these publications treat housing problems as individual problems that have little to do with race, even when banks have admitted in court that race was part of their mortgage decision process. Catherine Squires’ new study of the subprime mortgage crisis’s disproportionate impact on African Americans shows how even alternative media rearticulate mainstream rhetoric.

In her content analysis, Squires reveals that both BE and The Root presented stories in which responsibility for the mortgage crisis was shifted from the banks and lenders to the individual borrower. Colorlines was the only publication to address the unequal access of whites and people of color to the American Dream and home ownership, demonstrating greater resistance to the specious appeal of neoliberal rhetoric by placing greater onus on the government and the beneficiaries of its bailout (that is, the banks).

The expectation that alternative media will present a different perspective endures. However, when even the stories they publish look like recycled versions of the mainstream, readers’ trust is sure to wane.

From the proposal to the honeymoon, American weddings have remained relatively unchanged for the better part of the last century. Even unconventional brides and grooms tend to follow a traditional script in planning their weddings; this is especially observable in the ubiquitous white dress/black suit combo. Recently, this gendered pattern has been complicated by the legalization of same-sex marriage in several states. Without the obligatory gender scripts, which traditions will gay men and lesbian women follow and which will they break?

In a recently published article, Katrina Kimport (Gender & Society, August 2012) takes a close look at the marital attire chosen by gay and lesbian couples by studying photographs of same-sex weddings in San Francisco in 2004. She finds that among the formally-dressed male couples, all conformed to gender norms (they were all dressed in suits or tuxes), while none conformed to the wedding norm of one bride and one groom. In other words, no men were dressed as brides. On the other hand, among the female couples, seven out of ten conformed to the wedding norm of one bride (in a wedding dress or other feminine wedding attire) and one groom (wearing some type of suit or tuxedo). Of the remaining female couples, half followed gender norms (two brides) and half did not (two grooms).

What might these trends mean for the future of wedding traditions? Might gay and lesbian marriages radically alter traditional heterosexual wedding norms? Or might some of their wedding day choices work to reinforce the gendered tradition of one bride and one groom? Such questions are not easily answered, but one thing is clear: same-sex marriage sweeps both gender norms and wedding norms off their feet.

A sticker photograph found via tumblr. Image uncredited.

For those who don’t enjoy dramatic irony, too many books and movies provoke that frustrated question: “Why didn’t they just talk to each other?” Entire plot lines that hinge on only a few words of missed dialogue have been the backbone of classic comedies and dramas for centuries, but now modern technology may be making this literary device just too… unbelievable.

Wellman and Rainie, writing in the first issue of the new journal Mobile Media & Communication, illustrate this shift with a creative new twist on an old classic. What if Romeo and Juliet, those unfortunate teens who just missed each other in the end, had cell phones? Instead of talking through their feuding families, they could have just texted, maybe avoiding (spoiler alert!) the whole suicide mess.

Using research from their book Networked: The New Social Operating System, the authors argue that mobile phones and other portable communication devices have ushered in an era of “networked individualism.” We connect as individuals and share everything, down to our geographical location. The star-crossed lovers couldn’t even dream of satellite technology, but they were still pioneering individual networking by meeting alone, in secret, instead of involving their families to court each other formally. Even a decade ago, you’d have to call your paramour’s “home phone,” and maybe even talk to their parents.

Today a quick text makes individual socializing that much easier and more efficient, but it may also radically shift communication within our closest social groups. In our social lives and our dramatic writing, how much longer will we be able to believe people just didn’t get the message?

Some call it “tough love,” others claim they’re just “keepin’ it real.” Either way, by preparing their children to face racism, parents hope their kids will be able to handle such realities in non-violent ways.

In their attempt to understand the impact of interpersonal racial discrimination on criminal offending, Callie Burt, Ronald Simons, and Frederick Gibbons offer new insights into how African American parents prepare their children for experiences with racial bias in order to foster a sense of resilience. Based on panel data from several hundred male African American youth in the Family and Community Health Study, their findings show that higher levels of racial discrimination increase the likelihood of crime. But they also find that families use what the authors call “ethnic-racial socialization” (ERS) as a means of reducing this effect. According to the authors, ERS is “a class of adaptive and protective practices utilized by racial/ethnic minority families to promote functioning in a society stratified by race and ethnicity.” ERS is not necessarily a strategic effort, but an adaptive means of coping with racial inequality. In addition to reducing the impact of racial discrimination among the sample of black youth, ERS also weakened the effects of emotional distress, hostile views, and disengagement from norms on increased offending. Teaching kids about racism may keep them from getting tangled up in criminal responses, but the need for such teaching is also clear evidence that our society hasn’t transcended race or racism.

In an era of concerted cultivation and enlightened parenting, the need to steer children away from crime by revealing harsh inequalities at a young age seems out of place. Ethnic-racial socialization strategies are not compatible with most middle-class cultural scripts. The irony, however, is that privileged parents are keeping it just as “real” as low-income parents of color. It is the stark contrast in how these parents practice concerted cultivation—whether in teaching piano scales or teaching kids to expect a racist world—that catches our attention.