There are those who contend that it does not benefit African Americans… to get them into the University of Texas where they do not do well, as opposed to having them go to a less-advanced school… a slower-track school where they do well.

During oral arguments for Fisher v. University of Texas at Austin (in which the Supreme Court just upheld UT Austin’s use of race in its admissions policies), Justice Antonin Scalia’s comments caused quite an uproar. Did a member of the Supreme Court actually say that African Americans aren’t capable of success at competitive colleges? He was drawing from the so-called “mismatch hypothesis,” which suggests that affirmative action places people into positions they can’t handle—that is, that affirmative action could hurt African Americans by placing them in schools where they may not succeed or from which they may not graduate.

A significant amount of academic work debunks “mismatch theory,” deeming it both wrong and “paternalistic.”

Fischer and Massey use the National Longitudinal Survey of Freshmen to analyze college outcomes and test the mismatch hypothesis; they find no evidence in its favor. Alon and Tienda use two different longitudinal datasets to run similar analyses, again finding no proof that ethnic minority students fare badly in advanced institutions. Replication results have been consistent over time; Kurlaender and Grodsky, for instance, find that students placed in programs considered “out of their league” performed just as well as those in less demanding programs.
In a twist, scholars find that affirmative action may place a different group of people in schools for which they are not equipped. In many schools, particularly prestigious ones, “legacy” students—whose family members graduated from the same school—benefit from affirmative action in admissions. Bowen and Bok show this has disproportionately benefited white students, and Massey and Mooney show that legacy students earn lower grades than their peers and have lower graduation rates. If affirmative action is doing a disservice to some students, it is not in the way Justice Scalia suggested.
Photo by Helen Cassidy, Flickr. https://flic.kr/p/6mghmy

In case you missed it, new fossil evidence suggests that a creature known as the “Siberian Unicorn” may have lived alongside humans some 29,000 years ago. Perhaps that eccentric fellow you’ve seen in the aluminum-foil hat wasn’t so eccentric. In fact, research suggests an openness to phenomena like UFOs, unicorns, and elves is downright normal.

Consider how Scott Draper and Joseph O. Baker describe a wide variety of people across different religious subgroups who all believe in angels. Folklore phenomena can provide people with emotional comfort and compelling stories.
Such narratives can be transposed across many belief systems and subcultures. Quite a few people believe chasing spirits is a spiritual experience, as discussed by Marc Eaton in his examination of ghost hunters and “paranormal investigators.” Other research looks at the popular pursuits of Bigfoot and alien crash sites.
Sociology has always shown how belief in the paranormal, the fantastical, or the spiritual is a social process (consider founding father Durkheim’s pivotal The Elementary Forms of Religious Life). Influential scholars, such as Percy Cohen, who tackled the sociology of myth from a functionalist view, and Richard C. Crepeau, who described how sport myths and “heroes” help sharpen a society’s moral and aesthetic values, show that the paranormal isn’t losing popularity.
"Drinking for Two" via Edmonton Fetal Alcohol Network
“Drinking for Two” via Edmonton Fetal Alcohol Network

Pregnant women are under attack—or so it seems. Actually, according to the Centers for Disease Control and Prevention (CDC), all women who might ever become pregnant are at risk. In February, the CDC released a report estimating that around 3 million women “are at risk of exposing their developing baby to alcohol because they are drinking, sexually active and not using birth control to prevent pregnancy.” Since then, many have bashed the CDC for advising women to live as though they are “pre-pregnant,” abstaining from drinking if they are not on birth control or if they are even considering getting pregnant. Coupled with the growing threat of the Zika virus and its links to birth defects, such suggestions have propelled discussions of women’s roles in preventing catastrophic disability. Sociologists suggest that perceptions of women’s behavior are closely tied to ideas about the morality of motherhood. In particular, women who appear to resist common conceptions of what it means to be a “good” mother are subject to greater social control.

In American culture, motherhood is inextricably tied to morality. Moral arguments against abortion often rely on particular conceptions of sexual behavior, family life, and care for children. The ideology of “intensive mothering” demands that women be self-sacrificing and devote extensive time and energy to their children’s wants and needs — time and energy that many working women cannot afford.
This emphasis on mothers’ devotion to their children places them under considerable scrutiny, not only while raising children, but also during pregnancy. For instance, the “discovery” of Fetal Alcohol Syndrome heightened concerns over drinking during pregnancy. This made pregnant women the individual bearers of responsibility for the well-being of future children, and made them susceptible to moral outrage for behaviors like drinking. (Bucking the trend, the New York City Human Rights Commission has just issued guidance stating that visibly pregnant women cannot be discriminated against if, for instance, they order a glass of wine in a bar.)
Poor women, especially poor women of color, face a greater burden under idealized conceptions of what it means to be a “good” or “fit” mother. Not only are they regularly depicted as immoral or unfit, they are also criminalized and sanctioned at higher rates. Historical analyses show that pregnant women have been arrested for stillbirths, miscarriages, and drug use during pregnancy, and even incarcerated to prevent abortion. Poor women labeled “high risk” are prosecuted for failing to comply with medical advice when their fetus or baby dies, which ironically discourages them from seeking care during pregnancy. Just as the “crack baby” became a symbol of the irresponsibility of poor, black women in the 1980s and ‘90s, Zika exposure and alcohol use are invoked today to place mothers and potential mothers under continued scrutiny.
Among many Minneapolis landmarks lit purple, the Lowry Bridge frames downtown on the night of Prince’s death. Tony Webster, Flickr CC.

When music icon Prince died on April 21st, it affected millions of fans around the world. Famous and non-famous alike flooded social media, expressing their shock at the tragic loss of a superstar, while thousands gathered at the gates of Prince’s home and recording studio, Paisley Park, in suburban Minneapolis and in front of First Avenue in downtown Minneapolis, where many memorable scenes in “Purple Rain” were filmed. Some left purple flowers, letters, and stuffed animals, while others danced and sang. Similar worldwide rituals followed the passings of Michael Jackson, Whitney Houston, and David Bowie, despite most celebrants never having known them personally.

Death and loss are difficult experiences for the loved ones of the deceased. These losses may be compounded by ambiguous losses—those without closure—thought to delay the grieving process and strain the everyday lives of loved ones. Mourning, however, is not restricted to those we know personally. Masses mourned England’s Princess Diana, because they felt they knew her on a personal level, writing condolences such as “I feel as though I’ve lost a dear sister.” People also mourn the death of celebrities who hold connections to emotional events; that is, people do not solely grieve the loss of that celebrity, but also the loss of the memories associated with that celebrity.
Death and grief are private events as well as social rituals. Mass media and technology have helped increase such public mourning: many first hear about the death of celebrities via television, the Internet, and social media, and they often respond with online tributes. Not all celebrity deaths are equal, however: the extent to which the public mourns is an indication of a celebrity’s status.
Actress Kerry Washington portrays Anita Hill in an ad for “Confirmation.”

In April, HBO premiered “Confirmation,” the story of Supreme Court Justice Clarence Thomas’s 1991 confirmation hearings. In those hearings, a former colleague, lawyer Anita Hill, testified about the ongoing sexual harassment she endured while working for Thomas. HBO’s film, some 25 years after the hearings that Thomas famously called a “high-tech lynching,” reminds us of the murky waters women must wade through when facing and reporting sexual harassment—as well as how complicated the intersections of race, gender, law, and work can be.

Hill testified that Thomas sexually harassed her as her supervisor at the Department of Education and the EEOC. Various studies find that at least 40% of all women report experiencing sexual harassment at work at some point in their lives. Women of color experience higher rates of both sexual and ethnic workplace harassment.
Hill testified that she continued working for Thomas despite the ongoing harassment because she had no other job options. This is unsurprising given that women in law professions encounter a glass ceiling that limits upward mobility, often pushing women toward a limited track of jobs when seeking promotions. Further, women in law professions report hearing sexist jokes, having their authority questioned, and being complimented on looks rather than achievements—all at higher rates than their male colleagues.
Even women in power are subject to sexual harassment. One study finds that sexual harassment can actually increase when some women occupy supervisory positions. Sexual harassment has much more to do with power than simple workplace hierarchies.
An officer wears a body camera in North Charleston, SC. Photo by Ryan Johnson, Flickr CC.

The issue of police brutality has long been a problem in U.S. criminal justice. Police-worn body cameras are one potential “remedy” to these violent encounters, but they have both benefits and drawbacks.

The cameras may increase transparency and improve police legitimacy, promote legally compliant behavior among both police officers and citizens, enhance evidence quality that can improve resulting legal proceedings, and deter officers’ use-of-force. Conversely, body-worn cameras could create privacy concerns for the officer and the citizenry and place a large logistical and financial burden on already cash-strapped law enforcement agencies.
This issue is so timely that research is only now starting to see publication, but we do have some early insights. The first observational studies examining the use of police-worn body cameras were carried out in England and Scotland. They found rates of citizen complaints dropped after body cameras were introduced. Preliminary results from an experimental study in Phoenix, Arizona also suggest that the use of body cameras reduces both self-reported and official records of citizen complaints.
The first experimental evidence concerning use-of-force comes from a large study in the Rialto, California Police Department, and the results should encourage advocates of body cameras. The study randomly assigned particular police shifts to wear body cameras (the “treatment”). Shifts in the treatment condition were associated with reduced use-of-force: control shifts saw roughly twice as many use-of-force incidents, and citizen complaints against the police were significantly lower during camera-wearing shifts.
Graphic via Washington Post. Click for original and animation.

The Washington Post highlights the growing morbidity and mortality rates of rural white women. The rates of sickness and death for white women have climbed steadily over the past couple of decades, but the most dramatic increase is in rural areas. Sociologists and demographers have long investigated these trends. Poverty, stress, and the timing of childbirth all matter for mortality, but the combination of these factors has stronger effects on rural white women—a surprising pattern, because it confounds our typical understandings of race, poverty, and inequality.

Mortality rates have decreased overall since the latter half of the 20th century, though several factors, many related to poverty and education, contribute to the increasing death rates of certain groups. Those with less education tend to have higher rates of mortality, heart disease, and lung cancer.
Less education tends to correlate with lower socioeconomic status and difficulty finding employment. Sociologists Link and Phelan point to poverty as a “fundamental cause” of mortality and morbidity. Low socioeconomic status means difficulty accessing resources: not only do poor people have trouble obtaining the means to maintain a healthy life, they also tend to lack the time, transportation, social networks, and money to help them recover from sickness.
Some of the health issues tied to poverty affect women more than men. Women with high stress levels are more likely than men to die from cancer-related illnesses. Other health patterns related to social class, such as the timing of childbirth, matter, too. Poorer women are more likely to have children before age 20, which correlates with increased risk of death, heart and lung disease, and cancer.
Vintage postcard via Blue Mountains Library, Flickr CC.

This is the time of year that many people throw open their windows and begin their yearly spring cleaning. Long ago, springtime cleaning had religious significance and coincided with holidays such as Passover and Easter. By the 19th century, spring cleaning had become more about practicality than piety. Particularly in places that suffered cold, wet winters, March and April were a perfect time for dusting because it was warm enough to open windows, but still too chilly for bugs to fly in the house. Ideally, the wind would help blow the dust out of the home instead of swirling it around the rooms.

The blame for a dusty shelf tends to fall on women’s shoulders because the home has traditionally been “her place” in society. Although the 1950s vision of June Cleaver has shifted and more women now participate in the labor force, women still tend to take on the bulk of the housework. Women employed outside the home have a “second shift” of cooking, cleaning, and childcare when they come home from work.
Women who work in more masculinized jobs tend to do more cooking and cleaning, and men with feminized professions engage in more “manly” tasks like yard work and auto repair to neutralize their gender-atypical occupations. Even in couples that are not composed of a cis-man and a cis-woman, the gendered division of household labor persists. In couples consisting of trans*-men and cis-women, the women end up taking on the “Cinderella roles,” which they often link to personal preference rather than socialization or gender roles.
And what of the sociological significance of dust? A dusty book can show a lack of interest in the material, and the old adage “cleanliness is next to godliness” speaks to the moral implications of a dust-free, spotless home. Dust and dirt are out of place in the well-tended home, and their presence highlights a lack of control over the environment. Additionally, a lack of cleanliness has long served as a social indicator of moral disorder in Western culture, acting as a rallying point of social solidarity over what is socially acceptable.
Urban Seed, an Australian organization, considers harm reduction programs part of their mission to help disadvantaged communities. Flickr CC.

The mayor of Ithaca, New York recently proposed a facility for people to use heroin and other injected drugs safely. It’s part of a larger plan to focus on prevention and treatment of drug use, and the facility’s trained medical staff would provide clean needles, referrals to treatment programs, and naloxone, an opioid overdose antidote. Today’s opioid epidemic—which kills an estimated 78 Americans every day—has shocked many, given that other forms of illicit drug use have generally declined in prevalence and mortality during recent decades. Ithaca’s plan falls under the umbrella of “harm reduction” approaches, which attempt to mitigate personal and societal harm from drug and alcohol use. Social science shows us how and why these programs work.

Supervised injection facilities are relatively recent, originating in the Dutch and Swiss harm reduction movements of the 1970s and ‘80s. The first site in North America opened in Vancouver in 2003 and is linked to drastic declines in public injection and overdose deaths. Today a number of supervised drug consumption rooms operate throughout northern Europe, Canada, and Australia. Ithaca’s would be the first in the U.S.
Substance use was once a popular element of social events, like election day, but by the 20th century, “drug scares” stigmatized drug use, associating it with racial stereotypes, immigration, and crime. Smoking opium was first outlawed in the U.S. in the 1870s, for instance, as a result of anti-Chinese sentiments in California. Non-smoking opioid use remained popular among the white middle class for supposed medical reasons, but by the turn of the century, users who preferred injection became the stigmatized face of opiate addiction.
Stigma remains a critical issue in drug treatment, preventing users from accessing clean injection tools, uncontaminated opiates, information about safe injection practices, and life-saving overdose antidotes. Harm reduction efforts, like needle exchanges, have the potential to restore self-respect and autonomy to populations generally believed to lack these characteristics. Programs that provide work to formerly incarcerated individuals who have undergone drug treatment have been shown to reduce certain crimes, like robberies. Harm reduction communities also offer a space for drug users to empathize with and support each other, creating networks that bolster success.
Zoe Saldana, left, and Nina Simone, right. Image via ABC News Entertainment.

Zoe Saldana’s portrayal of singer and activist Nina Simone in an upcoming biopic has proven controversial, even before the film’s premiere. In press photos, Saldana, a light-skinned woman of color, is clearly wearing dark makeup and a prosthetic nose to appear more like the late singer. Some argue that using “blackface” to cast Saldana is particularly troubling, considering Nina Simone’s own lifelong dedication to encouraging the acceptance and embrace of dark skin tones. It also ignores the realities of colorism, which reproduces social inequalities and hierarchies among people of color.

Several studies address the benefits that accrue to light-skinned women. Employers, for example, often evaluate women applicants on physical attractiveness, regardless of job skills. This includes privileging physical features that suggest lighter-skinned women are friendlier and more intelligent. Lighter skin tones also make their female bearers more likely to marry spouses with higher incomes, report less perceived job discrimination, and earn a higher income. In schools, studies find that teachers expect their lighter-skinned students to display better behavior and higher intelligence than their darker peers, and public health research shows lower rates of mental and physical health problems among lighter-skinned blacks.
Colorism may provide socioeconomic, educational, and health benefits to light-skinned women, but it also challenges their identity as black women. Other blacks may perceive them as not “black enough,” assuming that they are more assimilated into white culture and lack awareness of black struggles. Those with lighter skin may feel isolated as members of their ethnic group openly question their authenticity and belonging.