
Sociologists spend a lot of time thinking about lives in social context: how the relationships and communities we live in shape the way we understand ourselves and move through the world. It can be tricky to start thinking about this, but one easy way to do it is to start collecting social facts. Start by asking, what’s weird about where you’re from?

I grew up on the western side of the Lower Peninsula of Michigan, so my eye naturally drifts to the Great Lakes every time I look at a map of the US. Lately I’ve been picking up on some interesting things I never knew about my old home state. First off, I didn’t realize that, relative to the rest of the country, this region is a hotspot for air pollution from Chicago and surrounding industrial areas.

Second, I was looking at ProPublica’s reporting on a new database of Catholic clergy credibly accused of abuse, and noticed that the two dioceses covering western Michigan haven’t yet disclosed information about possible accusations. I didn’t grow up Catholic, but as a sociologist who studies religion, I find it strange to think about the institutional factors that might be keeping this information under wraps.

Third, there’s the general impact of this region on the political and cultural history of the moment. West Michigan happens to be the place that brought you some heavy hitters like Amway (which plays a role in one of my favorite sociological podcasts of last year), the founder of Academi (formerly known as Blackwater), and our current Secretary of Education. In terms of elite political and economic networks, few regions have been as influential in current Republican party politics.

I think about these facts and wonder how much they shaped my own story. Would I have learned to like exercise more if I could have actually caught my breath during the mile run in gym class? Did I get into studying politics and religion because it was baked into all the institutions around me, even the business ventures? It’s hard to say for sure.

What’s weird about where YOU’RE from? Doing this exercise is great for two reasons. First, it helps to get students thinking in terms of the sociological imagination — connecting bigger social and historical factors to their individual experiences. Second, it also helps to highlight an important social research methods point about the ecological fallacy by getting us to think about all the ways that history and social context don’t necessarily force us to turn out a certain way. As more data become public and maps get easier to make, it is important to remember that population correlates with everything!
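That last point can be made concrete with a short simulation. This is a hedged sketch with made-up regions and rates, not real data: when places differ wildly in population, raw counts of two completely unrelated things correlate strongly across those places, even though the underlying per-capita rates do not.

```python
import numpy as np

rng = np.random.default_rng(0)

# 50 hypothetical regions whose populations span several orders of magnitude
population = rng.lognormal(mean=12, sigma=1.5, size=50)

# Two unrelated per-capita rates (think: libraries and laundromats per person)
rate_a = rng.uniform(0.001, 0.002, size=50)
rate_b = rng.uniform(0.001, 0.002, size=50)

# Raw counts of both things scale with population...
count_a = rate_a * population
count_b = rate_b * population

# ...so the counts correlate strongly across regions even though the
# per-capita rates are statistically independent.
r_counts = np.corrcoef(count_a, count_b)[0, 1]
r_rates = np.corrcoef(rate_a, rate_b)[0, 1]
print(f"raw counts: r = {r_counts:.2f}; per-capita rates: r = {r_rates:.2f}")
```

The raw counts come out highly correlated while the rates hover near zero, which is exactly why inferring individual behavior from aggregated, population-driven maps invites the ecological fallacy.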

Evan Stewart is an assistant professor of sociology at University of Massachusetts Boston. You can follow his work at his website, or on BlueSky.

My gut reaction was that nobody is actually eating the freaking Tide Pods.

Despite the explosion of jokes—yes, mostly just jokes—about eating detergent packets, sociologists have long known about media-fueled cultural panics about problems that aren’t actually problems. Joel Best’s groundbreaking research on these cases is a classic example. Check out these short video interviews with Best on kids not really being poisoned by Halloween candy and the panic over “sex bracelets.”

[youtube]https://www.youtube.com/watch?v=Bav01pAZrNw[/youtube]

In the tainted Halloween candy study, Best and Horiuchi followed up on media accounts to confirm cases of actual poisoning or serious injury, and found that many cases couldn’t be confirmed or were greatly exaggerated. So, I followed the data on detergent digestion.

It turns out, there is a small trend. According to a report from the American Association of Poison Control Centers,

…in 2016 and 2017, poison control centers handled thirty-nine and fifty-three cases of intentional exposures, respectively, among thirteen to nineteen year olds. In the first fifteen days of 2018 alone, centers have already handled thirty-nine such intentional cases among the same age demographic.
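A quick back-of-envelope calculation, using only the figures quoted above, shows why those first-fifteen-days numbers drew attention: naively annualized, the early-2018 pace dwarfs the prior full-year totals. This is a rough pace, not a forecast.

```python
# Figures quoted from the AAPCC report above
cases_2016 = 39
cases_2017 = 53
cases_first_15_days_2018 = 39

# Naive extrapolation of the first 15 days of 2018 to a full year
# (illustrative arithmetic only; incident rates are not constant)
annualized_2018_pace = cases_first_15_days_2018 * 365 / 15

print(f"2016: {cases_2016} cases; 2017: {cases_2017} cases; "
      f"2018 naive pace: {annualized_2018_pace:.0f} cases/year")
```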

That said, this trend is only relative to previous years and cannot predict future incidents. The life cycle of internet jokes is fairly short, rotating quickly with an eye toward the attention economy. It wouldn’t be too surprising if people moved on from the pods long before the panic died down.

The CDC recently issued a press release announcing that rates of reported cases of sexually transmitted diseases are setting record highs. The new report includes a table of rates going back to 1941, so I made a quick chart to see the pattern in context and compare the more common conditions over time (HIV wasn’t included in this particular report).

It is important to note that a big part of changes in disease rates is usually detection. Once you start looking for a condition, you’ll probably find more of it until enough diagnoses happen for treatment to bring the rates down. Up until 2000, the U.S. did pretty well in terms of declining rates for cases of gonorrhea and syphilis. Zoom in on the shaded area from 2000 to 2016, however, and you can see a pretty different story. These rates are up over the last 16 years, and chlamydia rates have been steadily increasing since the start of reporting in 1984.
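A chart along those lines can be sketched with matplotlib. The numbers below are placeholder values standing in for the CDC report’s historical table, which isn’t reproduced here, so the line shapes are illustrative only:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

# Placeholder rates per 100,000 people -- NOT the actual CDC figures
years = list(range(2000, 2017))
chlamydia = [250 + 15 * i for i in range(len(years))]  # steady rise
gonorrhea = [130 - 3 * i if i < 9 else 103 + 5 * (i - 9)
             for i in range(len(years))]               # decline, then rebound

fig, ax = plt.subplots()
ax.plot(years, chlamydia, label="Chlamydia")
ax.plot(years, gonorrhea, label="Gonorrhea")
ax.axvspan(2000, 2016, alpha=0.1)  # shade the 2000-2016 window discussed above
ax.set_xlabel("Year")
ax.set_ylabel("Reported cases per 100,000")
ax.set_title("Reported STD rates (placeholder data)")
ax.legend()
fig.savefig("std_rates.png")
```

Swapping in the report’s actual yearly rates for the placeholder lists would reproduce the chart described in the post.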

STDs are fundamentally a social phenomenon, especially because they can spread through social networks. However, we have to be very careful not to jump to conclusions about the causes of these trends. It’s tempting to blame dating apps or hookup culture, for example, but early work at the state level finds only a mixed relationship between dating app use and STD rates, and young people also have higher rates of sexual inactivity. Rate increases could even be due in part to better detection, now that more people have access to health coverage and care through the Affordable Care Act. Just don’t wait for peer review to finish before going to get tested!

Inspired by demographic facts you should know cold, “What’s Trending?” is a post series at Sociological Images featuring quick looks at what’s up, what’s down, and what sociologists have to say about it.

In the 1950s and ’60s, a set of social psychological experiments seemed to show that human beings were easily manipulated by low and moderate amounts of peer pressure, even to the point of violence. It was a stunning research program designed in response to the horrors of the Holocaust, which required the active participation of so many people, and the findings seemed to suggest that what happened there was part of human nature.

What we know now, though, is that this research was undertaken at an unusually conformist time. Mothers were teaching their children to be obedient, loyal, and to have good manners. Conformity was a virtue and people generally sought to blend in with their peers. It wouldn’t last.

At the same time as the conformity experiments were happening, something that would contribute to changing how Americans thought about conformity was being cooked up: the psychedelic drug, LSD.

Lysergic acid diethylamide was first synthesized in 1938 in the routine process of discovering new drugs for medical conditions. The first person to discover its psychedelic properties — its tendency to alter how we see and think — was the scientist who first synthesized it, Albert Hofmann. He ingested it accidentally, only to discover that it induced a “dreamlike state” in which he “perceived an uninterrupted stream of fantastic pictures, extraordinary shapes with intense, kaleidoscopic play of colors.”

By the 1950s, LSD was being administered to unwitting Americans in a secret, experimental mind-control program conducted by the United States Central Intelligence Agency, one that would last 14 years and span over 80 locations. Eventually the fact of the secret program would leak out to the public, and so would LSD.

It was the 1960s and America was going through a countercultural revolution. The Civil Rights movement was challenging persistent racial inequality, the women’s and gay liberation movements were staking claims on equality for women and sexual minorities, the sexual revolution said no to social rules surrounding sexuality, and, in the second decade of an intractable war in Vietnam, Americans were losing patience with the government. Obedience had gone out of style.

LSD was the perfect drug for the era. For its proponents, there was something about the experience of being on the drug that made the whole concept of conformity seem absurd. A new breed of thinker, the “psychedelic philosopher,” argued that LSD opened one’s mind and immediately revealed the world as it was, not the world as human beings invented it. It revealed, in other words, the social constructedness of culture.

In this sense, wrote the science studies scholar Ido Hartogsohn, LSD was truly “countercultural,” not only “in the sense of being peripheral or opposed to mainstream culture [but in] rejecting the whole concept of culture.” Culture, the philosophers claimed, shut down our imagination and psychedelics were the cure. “Our normal word-conditioned consciousness,” wrote one proponent, “creates a universe of sharp distinctions, black and white, this and that, me and you and it.” But on acid, he explained, all of these rules fell away. We didn’t have to be trapped in a conformist bubble. We could be free.

The cultural influence of the psychedelic experience, in the context of radical social movements, is hard to overstate. It shaped the era’s music, art, and fashion. It gave us tie-dye, The Grateful Dead, and stuff like this:


[psychedelic animation via GIPHY]

The idea that we shouldn’t be held down by cultural constrictions — that we should be able to live life as an individual as we choose — changed America.

By the 1980s, mothers were no longer teaching their children to be obedient, loyal, and to have good manners. Instead, they taught them independence and the importance of finding one’s own way. For decades now, children have been raised with slogans of individuality: “do what makes you happy,” “it doesn’t matter what other people think,” “believe in yourself,” “follow your dreams,” or the more up-to-date “you do you.”

Today, companies choose slogans that celebrate the individual, encouraging us to stand out from the crowd. In 2014, for example, Burger King abandoned its 40-year-old slogan, “Have it your way,” for a plainly individualistic one: “Be your way.” Across the consumer landscape, company slogans promise that buying their products will mark the consumer as special or unique. “Stay extraordinary,” says Coke; “Think different,” says Apple. Brands encourage people to buy their products in order to be themselves: Ray-Ban says “Never hide”; Express says “Express yourself,” and Reebok says “Let U.B.U.”

In surveys, Americans increasingly defend individuality. Millennials are twice as likely as Baby Boomers to agree with statements like “there is no right way to live.” They are half as likely to think that it’s important to teach children to obey, instead arguing that the most important thing a child can do is “think for him or herself.” Millennials are also more likely than any other living generation to consider themselves political independents and be unaffiliated with an organized religion, even if they believe in God. We say we value uniqueness and are critical of those who demand obedience to others’ visions or social norms.

Paradoxically, it’s now conformist to be an individualist and deviant to be a conformist. So much so that a subculture, “normcore,” has emerged to make opting into conformity a virtue. As one commentator described it, “Normcore finds liberation in being nothing special…”

Obviously LSD didn’t do this all by itself, but it was certainly in the right place at the right time. And as a symbol of the radical transition that began in the 1960s, there’s hardly one better.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Flashback Friday. 

Responding to critics who argue that poor people do not choose to eat healthy food because they’re ignorant or prefer unhealthy food, dietitian Ellyn Satter proposed a hierarchy of food needs. Modeled on Maslow’s hierarchy of needs, it illustrates Satter’s ideas about which elements of food matter first, second, and so on… starting at the bottom.

The graphic suggests that getting enough food to eat is the most important thing to people. Having food be acceptable (e.g., not rotten, something you are not allergic to) comes second. Once those two things are in place, people hope for reliable access to food and only then do they begin to worry about taste. If people have enough, acceptable, reliable, good-tasting food, then they seek out novel food experiences and begin to make choices as to what to eat for instrumental purposes (e.g., number of calories, nutritional balance).

As Michelle at The Fat Nutritionist writes, sometimes when a person chooses to eat nutritionally deficient or fattening foods, it is not because they are “stupid, ignorant, lazy, or just a bad, bad person who loves bad, bad food.”  Sometimes, it’s “because other needs come first.”

Originally posted in 2010; hat tip to Racialicious; cross-posted at Jezebel.

Flashback Friday.

Monica C. sent along images of a pamphlet, from 1920, warning soldiers of the dangers of sexually transmitted infections (STIs). In the lower right hand corner (close up below), the text warns that “most” “prostitutes (whores) and easy women” “are diseased.” In contrast, in the upper left corner, we see imagery of the pure woman that a man’s good behavior is designed to protect (also below).  “For the sake of your family,” it reads, “learn the truth about venereal diseases.”

The contrast between the women who give men STIs (prostitutes and easy women) and those who receive them from men (wives) reproduces the virgin/whore dichotomy (women come in only two kinds: good, pure, and worthy of respect, or bad, dirty, and deserving of abuse). It also does a great job of making invisible the fact that a woman with an STI likely got it from a man, and that a woman who has an STI, regardless of how she got it, can pass it on. The men’s role in all this, that is, is erased in favor of demonizing “bad” girls.

See also these great examples of the demonization of the “good time Charlotte” during World War II (skull faces and all) and follow this post to a 1917 film urging Canadian soldiers to refrain from sex with prostitutes (no antibiotics back then, you know).

This post was originally shared in August 2010.

Botox has forever transformed the primordial battleground against aging. Since the FDA approved it for cosmetic use in 2002, eleven million Americans have used it. Over 90 percent of them are women.

In my forthcoming book, Botox Nation, I argue that one of the reasons Botox is so appealing to women is that the wrinkles Botox is designed to “fix,” those disconcerting creases between our brows, are precisely the lines we use to express negative emotions: anger, bitchiness, irritation. Botox is injected into the corrugator supercilii muscles, the facial muscles that allow us to pull our eyebrows together and push them down. By paralyzing these muscles, Botox prevents this brow-lowering action and, in so doing, inhibits our ability to scowl, an expression we use to project to the world that we are aggravated or pissed off.


Sociologists have long speculated about the meaning of human faces for social interaction. In the 1950s, Erving Goffman developed the concept of facework to refer to the ways that human faces act as a template to invoke, process, and manage emotions. A core feature of our physical identity, our faces provide expressive information about our selves and how we want our identities to be perceived by others.

Given that our faces are mediums for processing and negotiating social interaction, it makes sense that Botox’s effect on facial expression would be particularly enticing to women, who from early childhood are taught to project cheerfulness and to disguise unhappiness. Male politicians and CEOs, for example, are expected to look pissed off, stern, and annoyed. However, when Hillary Clinton displays these same expressions, she is chastised for being unladylike, dismissed as undeserving of the male gaze, and criticized for disrupting the normative gender order. Women, more so than men, are penalized for looking speculative, judgmental, angry, or cross.

Nothing demonstrates this more than the recently viral pop-cultural idiom “resting bitch face.” For those unfamiliar with the not-so-subtly sexist phrase, “resting bitch face,” according to the popular site Urban Dictionary, is “a person, usually a girl, who naturally looks mean when her face is expressionless, without meaning to.” This same site defines its etymological predecessor, “bitchy resting face,” as “a bitchy alternative to the usual blank look most people have. This is a condition affecting the facial muscles, suffered by millions of women worldwide. People suffering from bitchy resting face (BRF) have the tendency [to] look hostile and/or judgmental at rest.”

Resting bitch face and its linguistic cousin are nowhere near gender neutral. There is no name for men’s serious, pensive, and reserved expressions because we allow men these feelings. When a man looks severe, serious, or grumpy, we assume it is for good reason. But women are always expected to be smiling, aesthetically pleasing, and compliant. To do otherwise would be to fail to subordinate our own emotions to those of others, and this would upset the gendered status quo.

This is what the sociologist Arlie Russell Hochschild calls “emotional labor,” a type of impression management that involves manipulating one’s feelings to transmit a certain impression. In her now-classic study of flight attendants, Hochschild documented how part of the occupational script was for flight attendants to create and maintain a façade of positive appearance, revealing the highly gendered ways we police social performance. The facework involved in projecting cheerfulness and always smiling requires energy and, as any woman is well aware, can become exhausting. Hochschild recognized this and saw emotion work as a form of exploitation that could lead to psychological distress. She also predicted that displaying emotions at odds with those genuinely felt would lead to alienation from one’s feelings.

Enter Botox—a product that can seemingly liberate the face from its resting bitch state, producing a flattening of affect where the act of appearing introspective, inquisitive, perplexed, contemplative, or pissed off can be effaced and prevented from leaving a lasting impression. One reason Botox may be especially appealing to women is that it can potentially relieve them from having to work so hard to police their expressions.

Even more insidiously, Botox may actually change how women feel. Scientists have long suggested that facial expressions, like frowning or smiling, can influence emotion by contributing to a range of bodily changes that in turn produce subjective feelings. This theory, known in psychology as the “facial feedback hypothesis,” proposes that expression intensifies emotion, whereas suppression softens it. It follows that blocking negative expressions with Botox injections should offer some protection against negative feelings. A study confirmed the hypothesis.

Taken together, this work points to some of the principal attractions of Botox for women. Functioning as an emotional lobotomy of sorts, Botox can emancipate women from having to vigilantly police their facial expressions and actually reduce the negative feelings that produce them, all while offsetting the psychological distress of alienation.

Dana Berkowitz is a professor of sociology at Louisiana State University in Baton Rouge, where she teaches about gender, sexuality, families, and qualitative methods. Her book, Botox Nation: Changing the Face of America, will be out in January and can be pre-ordered now.

In 1985, Zeneca Pharmaceuticals (now AstraZeneca) declared October “National Breast Cancer Awareness Month.” Their original campaign promoted mammography screenings and breast self-exams, and aided fundraising efforts for breast cancer research. The month continues with the same goals and is still supported by AstraZeneca, along with many other organizations, most notably the American Cancer Society.

The now ubiquitous pink ribbons were pinned onto the cause when the Susan G. Komen Breast Cancer Foundation distributed them at a New York City fundraising event in 1991. The following year, 1.5 million Estée Lauder cosmetics customers received the promotional reminder, along with an informational card about breast self-exams. Although now a well-known symbol, the ribbons elide a less well-known history of Breast Cancer Awareness co-opting grassroots organizing and activism targeting women’s health and breast cancer prevention.

The “awareness” campaign also opened the floodgates for other companies to capitalize on the disease. For example, Avon, New Balance, and Yoplait have sold jewelry, athletic shoes, and yogurt, respectively, using the pink ribbon as a logo, while KitchenAid still markets a product line called “Cook for the Cure” that includes pink stand mixers, food processors, and cooking accessories, items the company first started selling in 2001. Not to be left out, Smith & Wesson, Taurus, Federal, and Bersa, among other companies, have sold firearms with pink grips and/or finishes, pink gun cases, and even pink ammunition with the pink ribbon emblazoned on the packaging. Because breast cancer can be promoted in corporate-friendly ways and lacks the stigma associated with other diseases, like HIV/AIDS, these companies and others have been willing to endorse Breast Cancer Awareness Month and, in some cases, donate proceeds from their merchandise to support research on the disease.


Yet companies’ willingness to profit from the cause has also served to commodify breast cancer, and to support what sociologist Gayle Sulik calls “pink ribbon culture.” As Sulik notes, marking breast cancer with the color pink not only feminizes the disease, but also reinforces gendered expectations about how women are “supposed” to react to and cope with the illness, claims also corroborated by my own research on breast cancer support groups.

Based on participant observation of four support groups and in-depth interviews with participants, I have documented how breast cancer patients are expected to present a feminine self and to be positive and upbeat, despite the pain and suffering they endure as a result of being ill. The women in the study, for example, spent considerable time and attention on their physical appearance, working to present a traditionally feminine self even while recovering from surgical procedures and debilitating therapies, such as chemotherapy and radiation. Similarly, members of the groups frequently joked about their bodies, especially in sexualized ways, making light of the physical disfigurement resulting from their disease. Like the compensatory femininity in which they engaged, laughing about their plight seemed to assuage some of the emotional pain they experienced. However, these coping strategies reinforced traditional standards of beauty and prevented members of the groups from expressing anger or bitterness, feelings that would have been justifiable but were seen as (largely) culturally inappropriate because they were women.

Even when they recovered physically from the disease, the women were not immune to the effects of the “pink ribbon culture,” as other work from the study demonstrates. Many group participants, for instance, reported that friends and family were often less than sympathetic when they expressed uncertainty about the future and/or discontent about what they had been through.  As “survivors,” they were expected to be strong, positive, and upbeat, not fearful or anxious, or too willing to complain about the aftermath of their disease. The women thus learned to cover their uncomfortable emotions with a veneer of strength and courage. This too helps to illustrate how the “pink ribbon culture,” which celebrates survivors and survivorhood, limits the range of emotions that women who have had breast cancer are able to express. It also demonstrates how the myopic focus on survivors detracts attention from the over 40,000 women who die from breast cancer each year in the United States, as well as from the environmental causes of the disease.

Such findings should give pause. If October is truly a time to bring awareness to breast cancer and the women affected by it, we need to acknowledge the pain and suffering associated with the disease and resist the “pink ribbon culture” that contributes to it.

Jacqueline Clark, PhD is an Associate Professor of Sociology and Chair of the Sociology and Anthropology Department at Ripon College. Her research focuses on inequalities, the sociology of health and illness, and the sociology of jobs, work, and organizations.