health/medicine: drugs

In the 1950s and ’60s, a set of social psychological experiments seemed to show that human beings were easily manipulated by low and moderate amounts of peer pressure, even to the point of violence. It was a stunning research program designed in response to the horrors of the Holocaust, which required the active participation of so many people, and the findings seemed to suggest that what happened there was part of human nature.

What we know now, though, is that this research was undertaken at an unusually conformist time. Mothers were teaching their children to be obedient, loyal, and to have good manners. Conformity was a virtue and people generally sought to blend in with their peers. It wouldn’t last.

At the same time as the conformity experiments were happening, something that would help change how Americans thought about conformity was being cooked up: the psychedelic drug LSD.

Lysergic acid diethylamide was first synthesized in 1938 in the routine process of discovering new drugs for medical conditions. The first person to discover its psychedelic properties — its tendency to alter how we see and think — was the scientist who created it, Albert Hofmann. He ingested it accidentally, only to discover that it induced a “dreamlike state” in which he “perceived an uninterrupted stream of fantastic pictures, extraordinary shapes with intense, kaleidoscopic play of colors.”

By the 1950s, LSD was being administered to unwitting Americans in a secret, experimental mind-control program conducted by the United States Central Intelligence Agency, one that would last 14 years and span over 80 locations. Eventually the fact of the secret program would leak out to the public, and so would LSD.

It was the 1960s and America was going through a countercultural revolution. The Civil Rights movement was challenging persistent racial inequality, the women's and gay liberation movements were staking claims to equality for women and sexual minorities, the sexual revolution said no to social rules surrounding sexuality and, in the second decade of an intractable war in Vietnam, Americans were losing patience with the government. Obedience had gone out of style.

LSD was the perfect drug for the era. For its proponents, there was something about the experience of being on the drug that made the whole concept of conformity seem absurd. A new breed of thinker, the “psychedelic philosopher,” argued that LSD opened one’s mind and immediately revealed the world as it was, not the world as human beings invented it. It revealed, in other words, the social constructedness of culture.

In this sense, wrote the science studies scholar Ido Hartogsohn, LSD was truly “countercultural,” not only “in the sense of being peripheral or opposed to mainstream culture [but in] rejecting the whole concept of culture.” Culture, the philosophers claimed, shut down our imagination and psychedelics were the cure. “Our normal word-conditioned consciousness,” wrote one proponent, “creates a universe of sharp distinctions, black and white, this and that, me and you and it.” But on acid, he explained, all of these rules fell away. We didn’t have to be trapped in a conformist bubble. We could be free.

The cultural influence of the psychedelic experience, in the context of radical social movements, is hard to overstate. It shaped the era’s music, art, and fashion. It gave us tie-dye, The Grateful Dead, and stuff like this:


[Psychedelic animation via GIPHY]

The idea that we shouldn’t be held down by cultural constrictions — that we should be able to live life as an individual as we choose — changed America.

By the 1980s, mothers were no longer teaching their children to be obedient, loyal, and to have good manners. Instead, they taught them independence and the importance of finding one’s own way. For decades now, children have been raised with slogans of individuality: “do what makes you happy,” “it doesn’t matter what other people think,” “believe in yourself,” “follow your dreams,” or the more up-to-date “you do you.”

Today, companies choose slogans that celebrate the individual, encouraging us to stand out from the crowd. In 2014, for example, Burger King abandoned its 40-year-old slogan, “Have it your way,” for a plainly individualistic one: “Be your way.” Across the consumer landscape, company slogans promise that buying their products will mark the consumer as special or unique. “Stay extraordinary,” says Coke; “Think different,” says Apple. Brands encourage people to buy their products in order to be themselves: Ray-Ban says “Never hide”; Express says “Express yourself,” and Reebok says “Let U.B.U.”

In surveys, Americans increasingly defend individuality. Millennials are twice as likely as Baby Boomers to agree with statements like “there is no right way to live.” They are half as likely to think that it’s important to teach children to obey, instead arguing that the most important thing a child can do is “think for him or herself.” Millennials are also more likely than any other living generation to consider themselves political independents and be unaffiliated with an organized religion, even if they believe in God. We say we value uniqueness and are critical of those who demand obedience to others’ visions or social norms.

Paradoxically, it's now conformist to be an individualist and deviant to be a conformist. So much so that a subculture, "normcore," has emerged to promote blending in, making opting into conformity a virtue. As one commentator described it, "Normcore finds liberation in being nothing special…"

Obviously LSD didn’t do this all by itself, but it was certainly in the right place at the right time. And as a symbol of the radical transition that began in the 1960s, there’s hardly one better.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Photo by Ted Eytan; flickr creative commons.

President Trump recently declared that Obamacare is “essentially dead” after the House of Representatives passed legislation to replace existing health care policy. While members of the Senate are uncertain about the future of the proposed American Health Care Act (AHCA) — which could ultimately result in as many as 24 million people losing their health insurance and those with pre-existing conditions facing increasing health coverage costs — a growing number of Americans, especially women, are sure that the legislation will be bad for their health, if enacted.

On the same day that the House passed the Republican-backed plan, for example, a friend of mine revealed on social media that she had gotten her yearly mammogram and physical examination. She posted that the preventative care did not cost anything under her current employer benefit plan, but would have been prohibitively expensive without insurance coverage, a problem faced by many women across the United States. The American Cancer Society reports, for instance, that in 2013, 38% of uninsured women had gotten a mammogram in the previous two years, compared with 70% of insured women. These disparities are certainly alarming, but the problem is likely to worsen under the proposed AHCA.

Breast cancer screenings are currently protected under the Affordable Care Act's Essential Health Benefits, which also cover birth control, as well as pregnancy, maternity, and newborn care. The proposed legislation supported by House Republicans and Donald Trump would allow individual states to eliminate or significantly reduce these essential benefits for individuals seeking to purchase health insurance on the open market.

Furthermore, the current version of the AHCA would enable individual states to seek waivers, permitting insurance companies to charge higher premiums to people with pre-existing conditions, when they purchase policies on the open market. Making health insurance exorbitantly expensive could have devastating results for women, like those with a past breast cancer diagnosis, who are at risk of facing recurrence. Over 40,000 women already die each year from breast cancer in our country, with African-American women being disproportionately represented among these deaths.

Such disparities draw attention to the connection between inequality and health, patterns long documented by sociologists. Recent work by David R. Williams and his colleagues, for instance, examines how racism and class inequality help to explain why the breast cancer mortality rate in 2012 was 42% higher for Black women than for white women. Limiting affordable access to health care — which the AHCA would most surely do — would exacerbate these inequalities, and further jeopardize the health and lives of the most socially and economically vulnerable among us.

Certainly, everyone who must purchase insurance in the private market, particularly those with pre-existing conditions, stands to lose under the AHCA. But women are especially at risk. Their voices have been largely excluded from discussions of health care reform, as demonstrated by the January photograph of Donald Trump, surrounded by eight male staff members, signing the "global gag order," which restricted women's reproductive rights worldwide. Or as illustrated by the photo tweeted by Vice-President Pence in March, showing him and the President, along with over twenty male politicians, discussing possible changes to Essential Health Benefits, changes that could restrict coverage of birth control, as well as pregnancy, maternity, and newborn care. And now by the fact that all 13 Senators slated to work on revisions to the AHCA are men.

Women cannot afford to be silent about this legislation. None of us can. The AHCA is bad for our health and lives.

Jacqueline Clark, PhD is an Associate Professor of Sociology and Chair of the Sociology and Anthropology Department at Ripon College. Her research interests include inequalities, the sociology of health and illness, and the sociology of jobs, work, and organizations.

Botox has forever transformed the primordial battleground against aging. Since the FDA approved it for cosmetic use in 2002, eleven million Americans have used it. Over 90 percent of them are women.

In my forthcoming book, Botox Nation, I argue that one of the reasons Botox is so appealing to women is that the wrinkles Botox is designed to "fix," those disconcerting creases between our brows, are precisely the lines we use to express negative emotions: anger, bitchiness, irritation. Botox is injected into the corrugator supercilii muscles, the facial muscles that allow us to pull our eyebrows together and push them down. By paralyzing these muscles, Botox prevents this brow-lowering action and, in so doing, inhibits our ability to scowl, an expression we use to project to the world that we are aggravated or pissed off.


Sociologists have long speculated about the meaning of human faces for social interaction. In the 1950s, Erving Goffman developed the concept of facework to refer to the ways that human faces act as a template to invoke, process, and manage emotions. A core feature of our physical identity, our faces provide expressive information about our selves and how we want our identities to be perceived by others.

Given that our faces are mediums for processing and negotiating social interaction, it makes sense that Botox's effect on facial expression would be particularly enticing to women, who from early childhood are taught to project cheerfulness and to disguise unhappiness. Male politicians and CEOs, for example, are expected to look pissed off, stern, and annoyed. However, when Hillary Clinton displays these same expressions, she is chastised for being unladylike, deemed undeserving of the male gaze, and criticized for disrupting the normative gender order. Women, more so than men, are penalized for looking speculative, judgmental, angry, or cross.

Nothing demonstrates this more than the recent viral pop-cultural idiom "resting bitch face." For those unfamiliar with the not-so-subtly sexist phrase, "resting bitch face," according to the popular site Urban Dictionary, is "a person, usually a girl, who naturally looks mean when her face is expressionless, without meaning to." This same site defines its etymological predecessor, "bitchy resting face," as "a bitchy alternative to the usual blank look most people have. This is a condition affecting the facial muscles, suffered by millions of women worldwide. People suffering from bitchy resting face (BRF) have the tendency look hostile and/or judgmental at rest."

Resting bitch face and its linguistic cousin are nowhere near gender neutral. There is no name for men's serious, pensive, and reserved expressions because we allow men these feelings. When a man looks severe, serious, or grumpy, we assume it is for good reason. But women are always expected to be smiling, aesthetically pleasing, and compliant. To do otherwise would be to fail to subordinate our own emotions to those of others, and this would upset the gendered status quo.

This is what the sociologist Arlie Russell Hochschild calls "emotional labor," a type of impression management that involves manipulating one's feelings to transmit a certain impression. In her now-classic study of flight attendants, Hochschild documented how part of the occupational script was for flight attendants to create and maintain a façade of positive appearance, revealing the highly gendered ways we police social performance. The facework involved in projecting cheerfulness and always smiling requires energy and, as any woman is well aware, can become exhausting. Hochschild recognized this and saw emotion work as a form of exploitation that could lead to psychological distress. She also predicted that displaying emotions different from those genuinely felt would lead to alienation from one's feelings.

Enter Botox—a product that can seemingly liberate the face from its resting bitch state, producing a flattening of affect where the act of appearing introspective, inquisitive, perplexed, contemplative, or pissed off can be effaced and prevented from leaving a lasting impression. One reason Botox may be especially appealing to women is that it can potentially relieve them from having to work so hard to police their expressions.

Even more insidiously, Botox may actually change how women feel. Scientists have long suggested that facial expressions, like frowning or smiling, can influence emotion by contributing to a range of bodily changes that in turn produce subjective feelings. This theory, known in psychology as the “facial feedback hypothesis,” proposes that expression intensifies emotion, whereas suppression softens it. It follows that blocking negative expressions with Botox injections should offer some protection against negative feelings. A study confirmed the hypothesis.

Taken together, this work points to some of the principal attractions of Botox for women. Functioning as an emotional lobotomy of sorts, Botox can emancipate women from having to vigilantly police their facial expressions and actually reduce the negative feelings that produce them, all while offsetting the psychological distress of alienation.

Dana Berkowitz is a professor of sociology at Louisiana State University in Baton Rouge, where she teaches about gender, sexuality, families, and qualitative methods. Her book, Botox Nation: Changing the Face of America, will be out in January and can be pre-ordered now.

Last week PBS hosted a powerful essay by law professor Ekow Yankah. He points to how the new opioid addiction crisis is being talked about very differently than addiction crises of the past. Today, he points out, addiction is being described and increasingly treated as a health crisis with a human toll. “Our nation has linked arms,” he says, “to save souls.”

Even just a decade ago, though, addicts weren’t victims, they were criminals.

What’s changed? Well, race. “Back then, when addiction was a black problem,” Yankah says about 30 years ago, “there was no wave of national compassion.” Instead, we were introduced to suffering “crack babies” and their inhuman, incorrigible mothers. We were told that crack and crime went hand-in-hand because the people involved were simply bad. We were told to fear addicts, not care for them. It was a “war on drugs” that was fought against the people who had succumbed to them.

Yankah is clear that this is a welcome change. But, he says, for African Americans, who would have welcomed such compassion for the drugs that devastated their neighborhoods and families, it is bittersweet.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Flashback Friday.

My great-grandma would put a few drops of turpentine on a sugar cube as a cure-all for any type of cough or respiratory ailment. Nobody in the family ever had any obvious negative effects from it as far as I know. And once when I had a sinus infection my grandma suggested that I try gargling kerosene. I decided to go to the doctor for antibiotics instead, but most of my relatives thought that was a perfectly legitimate suggestion.

In the not-so-distant past, lots of substances we consider unhealthy today were marketed and sold for their supposed health benefits. Joe A. of Human Rights Watch sent in these images of vintage products that openly advertised that they contained cocaine or heroin. Perhaps you would like some Bayer Heroin?

Flickr Creative Commons, dog 97209

The Vapor-ol alcohol and opium concoction was for treating asthma:

Cocaine drops for the kids:

A reader named Louise sent in a recipe from her great-grandma’s cookbook. Her great-grandmother was a cook at a country house in England. The recipe is dated 1891 and calls for “tincture of opium”. The recipe (with original spellings):

Hethys recipe for cough mixture

1 pennyworth of each
Antimonial Wine
Acetic Acid
Tincture of opium
Oil of aniseed
Essence of peppermint
1/2lb best treacle

Well mix and make up to Pint with water.

As Joe says, it's no secret that products with cocaine, marijuana, opium, and other now-banned substances were at one time sold openly, often as medicines. The change in attitudes toward these products, from entirely acceptable and even beneficial to inherently harmful and addictive, is a great example of social construction. While opium and cocaine certainly have negative effects on some people, so do other substances that remained legal (or were re-legalized, in the case of alcohol).

Racist and anti-immigrant sentiment often played a role in changing views of what are now illegal controlled substances. For instance, the association of opium with Chinese immigrants contributed to increasingly negative attitudes toward it, as anything associated with Chinese immigrants was stigmatized, particularly in the western U.S. This combined with a push by social reformers to prohibit a variety of substances, leading to the Harrison Narcotic Act. The act, passed in 1914, regulated the production and distribution of opium but, in its application, eventually criminalized it in practice.

Reformers pushing for cocaine to be banned suggested that its effects led Black men to rape White women, and that it gave them nearly super-human strength that allowed them to kill Whites more effectively. A similar argument was made about Mexicans and marijuana:

A Texas police captain summed up the problem: under marijuana, Mexicans became “very violent, especially when they become angry and will attack an officer even if a gun is drawn on him. They seem to have no fear, I have also noted that under the influence of this weed they have enormous strength and that it will take several men to handle one man while under ordinary circumstances one man could handle him with ease.”

So the story of the criminalization of some substances in the U.S. is inextricably tied to various waves of anti-immigrant and racist sentiment. Some of the same discourse–the “super criminal” who is impervious to pain and therefore especially violent and dangerous, the addicted mother who harms and even abandons her child to prostitute herself as a way to get drugs–resurfaced as crack cocaine emerged in the 1980s and was perceived as the drug of choice of African Americans.

Originally posted in 2010.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

All politicians lie, said I.F. Stone. But they don’t all lie as blatantly as Chris Christie did last week in repeating his vow not to legalize marijuana in New Jersey.

Every bit of objective data we have tells us that it’s a gateway drug to other drugs.

That statement simply is not true. The evidence on marijuana as a gateway drug is at best mixed, as the governor or any journalist interested in fact checking his speech could have discovered by looking up “gateway” on Wikipedia.

If the governor meant that smoking marijuana in and of itself created a craving for stronger drugs, he’s just plain wrong. Mark Kleiman, a policy analyst who knows a lot about drugs, says bluntly:

The strong gateway model, which is that somehow marijuana causes fundamental changes in the brain and therefore people inevitably go on from marijuana to cocaine or heroin, is false, as shown by the fact that most people who smoke marijuana don’t. That’s easy. But of course nobody really believes the strong version.

Nobody? Prof. Kleiman, meet Gov. Christie.

Or maybe Christie meant a softer version – that the kid who starts smoking weed gets used to doing illegal things, and he makes connections with the kinds of people who use stronger drugs. He gets drawn into their world. It’s not the weed itself that leads to cocaine or heroin, it’s the social world.

That social gateway version, though, offers support for legalization.  Legalization takes weed out of the drug underworld. If you want some weed, you no longer have to consort with criminals and serious druggies.

There are several other reasons to doubt the gateway idea. Much of the evidence comes from studies of individuals. But now, thanks to medical legalization, we also have state-level data, and the results are the same. Legalizing medical marijuana did not lead to an increase in the use of harder drugs, especially among kids. Just the opposite.


[Chart: cocaine use before and after medical marijuana legalization, by age group]

First, note the small percentages. Perhaps 1.6% of adults used cocaine in the pre-medical-pot years. That percentage fell slightly post-legalization. Of course, those older people had long since passed through the gateway, so we wouldn't expect legalization to make much difference for them. But for younger people, cocaine use was cut in half. Instead of an open gateway with traffic flowing rapidly from marijuana through to the world of hard drugs, it was more like, oh, I don't know, maybe a bridge with several of its lanes closed, clogging traffic.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

The United States imprisons more people than any other country. This is true whether you measure by percentage of the population or by sheer, raw numbers. If the phrase mass incarceration applies anywhere, it applies in the good ol’ U. S. of A.

It wasn't always this way. Rates of incarceration began rising as a result of President Reagan's "war on drugs" in the '80s (marijuana, for example), as the number of people imprisoned for non-violent crimes began climbing at an alarming rate. Today, about one in 31 adults is in prison or jail or on probation or parole. This is a human rights crisis for the people who are incarcerated, but its impact also echoes through the job sector, communities, families, and the hearts of children. One in 28 school-age children — 2.7 million — have a parent in prison.


In a new book, Children of the Prison Boom, sociologists Christopher Wildeman and Sara Wakefield describe the impact of parental imprisonment on children: an increase in poverty, homelessness, depression, anxiety, learning disorders, behavioral problems, and interpersonal aggression. Some argue that taking parents who have committed a crime out of the family might be good for children, but the data is in. It’s not.

Parental incarceration is now included in research on Adverse Childhood Experiences, and its particular contours include shame and stigma alongside the trauma. It has become such a large problem that Sesame Street is incorporating it into its Little Children, Big Challenges series and has a webpage devoted to the issue. Try not to cry as a cast member sings "you're not alone" and children talk about what it feels like to have a parent in prison:

Wildeman and Wakefield, alongside another sociologist who researches the issue, Kristin Turney, are interviewed for a story about the problem at The Nation. They argue that even if we start to remedy mass incarceration — something we’re not doing — we will still have to deal with the consequences. They are, Wildeman and Wakefield say, “a lost generation now coming of age.”

The subtitle of their book, Mass Incarceration and the Future of Inequality, points to how that lost generation might exacerbate the already deep race and class differences in America. At The Nation, Katy Reckdahl writes:

One in four black children born in 1990 saw their father head off to prison before they turned 14… For white children of the same age, the risk is one in thirty. For black children whose fathers didn’t finish high school, the odds are even greater: more than 50 percent have dads who were locked up by the time they turned 14…

Even well-educated black families are disproportionately affected by the incarceration boom. Wakefield and Wildeman found that black children with college-educated fathers are twice as likely to see them incarcerated as the children of white high-school dropouts.

After the Emancipation Proclamation, Jim Crow hung like a weight around the shoulders of the parents of black and brown children. After Jim Crow, the GI Bill and residential redlining strangled their chances to build wealth that they could pass down. The mass incarceration boom is just another in a long history of state policies that target black and brown people — and their children — severely inhibiting their life chances.

Hat tip Citings and Sightings. Cross-posted at Pacific Standard.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Pharmaceutical companies say that they need long patents that keep the price of their drugs high so that they can invest in research. But that’s not actually what they’re spending most of their money on. Instead, they’re spending more — sometimes twice as much — on advertising directly to doctors and consumers.

Data from the BBC, visualized by León Markovitz:

[Chart: pharmaceutical company spending on research vs. marketing]

"When do you cross the line from essential profits to profiteering?" asked Dr Brian Druker, one of a group of physicians asking for price reductions.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.