In the 1950s and ’60s, a set of social psychological experiments seemed to show that human beings were easily manipulated by low and moderate amounts of peer pressure, even to the point of violence. It was a stunning research program, designed in response to the horrors of the Holocaust, an atrocity that had required the active participation of so many ordinary people, and the findings seemed to suggest that what happened there was part of human nature.

What we know now, though, is that this research was undertaken at an unusually conformist time. Mothers were teaching their children to be obedient and loyal and to have good manners. Conformity was a virtue, and people generally sought to blend in with their peers. It wouldn’t last.

At the same time as the conformity experiments were happening, something that would help change how Americans thought about conformity was being cooked up: the psychedelic drug LSD.

Lysergic acid diethylamide was first synthesized in 1938 in the routine process of discovering new drugs for medical conditions. The first person to discover its psychedelic properties — its tendency to alter how we see and think — was the chemist who first synthesized it, Albert Hofmann. He ingested it accidentally, only to discover that it induced a “dreamlike state” in which he “perceived an uninterrupted stream of fantastic pictures, extraordinary shapes with intense, kaleidoscopic play of colors.”

By the 1950s, LSD was being administered to unwitting Americans in a secret, experimental mind-control program conducted by the United States Central Intelligence Agency, one that would last 14 years and operate in over 80 locations. Eventually the existence of the secret program would leak out to the public, and so would LSD.

It was the 1960s, and America was going through a countercultural revolution. The Civil Rights movement was challenging persistent racial inequality, the women’s and gay liberation movements were staking claims on equality for women and sexual minorities, the sexual revolution said no to social rules surrounding sexuality and, in the second decade of an intractable war in Vietnam, Americans were losing patience with the government. Obedience had gone out of style.

LSD was the perfect drug for the era. For its proponents, there was something about the experience of being on the drug that made the whole concept of conformity seem absurd. A new breed of thinker, the “psychedelic philosopher,” argued that LSD opened one’s mind and immediately revealed the world as it was, not the world as human beings invented it. It revealed, in other words, the social constructedness of culture.

In this sense, wrote the science studies scholar Ido Hartogsohn, LSD was truly “countercultural,” not only “in the sense of being peripheral or opposed to mainstream culture [but in] rejecting the whole concept of culture.” Culture, the philosophers claimed, shut down our imagination and psychedelics were the cure. “Our normal word-conditioned consciousness,” wrote one proponent, “creates a universe of sharp distinctions, black and white, this and that, me and you and it.” But on acid, he explained, all of these rules fell away. We didn’t have to be trapped in a conformist bubble. We could be free.

The cultural influence of the psychedelic experience, in the context of radical social movements, is hard to overstate. It shaped the era’s music, art, and fashion, giving us tie-dye, the Grateful Dead, and a flood of swirling, kaleidoscopic imagery.

The idea that we shouldn’t be held down by cultural constrictions — that we should be able to live life as an individual as we choose — changed America.

By the 1980s, mothers were no longer teaching their children to be obedient and loyal and to have good manners. Instead, they taught them independence and the importance of finding one’s own way. For decades now, children have been raised with slogans of individuality: “do what makes you happy,” “it doesn’t matter what other people think,” “believe in yourself,” “follow your dreams,” or the more up-to-date “you do you.”

Today, companies choose slogans that celebrate the individual, encouraging us to stand out from the crowd. In 2014, for example, Burger King abandoned its 40-year-old slogan, “Have it your way,” for a plainly individualistic one: “Be your way.” Across the consumer landscape, company slogans promise that buying their products will mark the consumer as special or unique. “Stay extraordinary,” says Coke; “Think different,” says Apple. Brands encourage people to buy their products in order to be themselves: Ray-Ban says “Never hide”; Express says “Express yourself,” and Reebok says “Let U.B.U.”

In surveys, Americans increasingly defend individuality. Millennials are twice as likely as Baby Boomers to agree with statements like “there is no right way to live.” They are half as likely to think that it’s important to teach children to obey, instead arguing that the most important thing a child can do is “think for him or herself.” Millennials are also more likely than any other living generation to consider themselves political independents and be unaffiliated with an organized religion, even if they believe in God. We say we value uniqueness and are critical of those who demand obedience to others’ visions or social norms.

Paradoxically, it’s now conformist to be an individualist and deviant to be a conformist. So much so that a subculture emerged to promote blending in: “normcore,” which makes opting into conformity a virtue. As one commentator described it, “Normcore finds liberation in being nothing special…”

Obviously LSD didn’t do all of this by itself, but it was certainly in the right place at the right time. And as a symbol of the radical transition that began in the 1960s, there’s hardly a better one.

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.

Flashback Friday.

In Race, Ethnicity, and Sexuality, Joane Nagel looks at how race, ethnicity, and sexuality are used to create new national identities and frame colonial expansion. In particular, White female sexuality, presented as modest and appropriate, was often contrasted with the sexuality of colonized women, who were depicted as promiscuous or immodest.

This 1860s advertisement for Peter Lorillard Snuff & Tobacco illustrates these differences. According to Toby and Will Musgrave, writing in An Empire of Plants, the ad drew on a purported Huron legend of a beautiful white spirit bringing them tobacco.

There are a few interesting things going on here. We have the association of femininity with a benign nature: the women are surrounded by various animals (monkeys, a fox, and a rabbit, among others) who appear to pose no threat to the women or to one another. The background is lush and productive.

Racialized hierarchies are embedded in the personification of the “white spirit” as a White woman, descending from above to provide a precious gift to Native Americans, similar to imagery drawing on the idea of the “white man’s burden.”

And as often occurred (particularly as we entered the Victorian era), there was a willingness to put non-White women’s bodies more obviously on display than the bodies of White women. The White woman above is actually less clothed than the Native American woman, yet her arm and the white cloth are strategically placed to hide her breasts and crotch. The Native American woman’s breasts, on the other hand, are fully displayed.

So, the ad provides a nice illustration of the personification of nations with women’s bodies, essentialized as close to nature, but arranged hierarchically according to race and perceived purity.

Originally posted in 2010.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

Flashback Friday.

The percentage of carless households in any given city correlates very well with the percentage of homes built before 1940. So what happened in the ’40s?

According to Left for LeDroit, it was suburbs:

The suburban housing model was — and, for the most part, still is — based on several main principles, most significantly, the uniformity of housing sizes (usually large) and the separation of residential and commercial uses. Both larger lots and the separation of uses create longer distances between any two points, requiring a greater effort to go between home, work, and the grocery store.

These longer distances between daily destinations made walking impractical and the lower population densities made public transit financially unsustainable. The only solution was the private automobile, which, coincidentally, benefited from massive government subsidies in the form of highway building and a subsidized oil infrastructure and industry.

Neighborhoods designed after World War II are designed for cars, not pedestrians; the opposite is true of neighborhoods designed before 1940. Whether one owns a car, and how far one drives if one does, depends on the type of city, not on personal characteristics like environmental friendliness. Ezra Klein puts it nicely:

In practice, this doesn’t feel like a decision imposed by the cold realities of infrastructure. We get attached to our cars. We get attached to our bikes. We name our subway systems. We brag about our short walks to work. People attach stories to their lives. But at the end of the day, they orient their lives around pretty practical judgments about how best to live. If you need a car to get where you’re going, you’re likely to own one. If you rarely use your car, have to move it a couple of times a week to avoid street cleaning, can barely find parking and have trouble avoiding tickets, you’re going to think hard about giving it up. It’s not about good or bad or red or blue. It’s about infrastructure.

Word.

Neither Ezra nor Left for LeDroit, however, points out that every city, whether it was built for pedestrians or for cars, is full of people without cars. In car-dependent cities, these are mostly people who can’t afford to buy or keep a car. And these people, in these cities, are royally screwed. Los Angeles, for example, is the most expensive place in the U.S. to own a car, and its residents are highly car-dependent; lower-income people who can’t afford a car must spend extraordinary amounts of time using our mediocre public transportation system, such that carlessness contributes significantly to unemployment.

Originally posted in 2010.

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.

Originally posted at Family Inequality.

It looks like the phrase “start a family” started to mean “have children” (after marriage) sometime in the 1930s and didn’t catch on until the 1940s or 1950s, a period that happens to be the most pro-natal in U.S. history. Here’s the Google Ngrams trend for the phrase as a percentage of all three-word phrases in American English:

[Google Ngrams chart: frequency of “start a family” among all three-word phrases in American English, rising from the 1930s onward]
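
For anyone who wants to reproduce the trend line, here is a minimal sketch that pulls the same series from the Ngram Viewer’s unofficial JSON endpoint (the request the graph page itself makes). The endpoint is undocumented, so the URL, the parameter names, and the en-US-2019 corpus label are assumptions that may change; treat this as a sketch rather than a stable API.

    # Sketch: fetch the yearly frequency of "start a family" among all
    # 3-grams in American English via the Ngram Viewer's unofficial JSON
    # endpoint (undocumented; URL, params, and corpus label are assumptions).
    import requests

    resp = requests.get(
        "https://books.google.com/ngrams/json",
        params={
            "content": "start a family",
            "year_start": 1900,
            "year_end": 2019,
            "corpus": "en-US-2019",  # assumed label for American English
            "smoothing": 3,
        },
        timeout=30,
    )
    resp.raise_for_status()
    series = resp.json()[0]["timeseries"]  # one value per year: share of all 3-grams

    # Print one reading per decade to see the post-1930s rise.
    for year, freq in zip(range(1900, 2020), series):
        if year % 10 == 0:
            print(year, f"{freq:.3e}")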

Searching the New York Times, I found the earliest uses applied to fish (1931) and plants (1936).

Twitter reader Daniel Parmer relayed a use from the Boston Globe on 8/9/1937, in which actress Merle Oberon said, “I hope to be married within the next two years and start a family. If not, I shall adopt a baby.”

The next appearance in the NYT was on 11/22/1942, in a book review in which a man marries a woman and “brings her home to start a family.” After that came a 5/6/1948 article describing those who would become baby boom families; it covered a speech by Ewan Clague, the Commissioner of Labor Statistics, who is remembered for introducing statistics on women and families into Bureau of Labor Statistics reports. From the NYT:

[Clipping: New York Times, 5/6/1948, on Ewan Clague’s speech]

That NYT reference is interesting because it came shortly after the first use of “start a family” in the JSTOR database that unambiguously refers to having children, in a report published by Clague’s BLS:

Trends of Employment and Labor Turn-Over: Monthly Labor Review, Vol. 63, No. 2 (AUGUST 1946): …Of the 584,000 decline in the number of full-time Federal employees between June 1, 1945 and June 1, 1946, almost 75 percent has been in the women’s group. On June 1, 1946, there were only 60 percent as many women employed full time as on June 1, 1945. Men now constitute 70 percent of the total number of full-time workers, as compared with 61 percent a year previously. Although voluntary quits among women for personal reasons, such as to join a veteran husband or to start a family, have been numerous, information on the relative importance of these reasons as compared with involuntary lay-offs is not available…

It’s interesting that, although this appears to be a pro-natal shift (insisting on children before the definition of “family” is met), it also may have had a work-and-family implication of leaving the labor force. Maybe it reinforced the naturalness of women dropping out of paid work when they had children, something that was soon to emerge as a key battleground in the gender revolution.

Philip N. Cohen, PhD is a professor of sociology at the University of Maryland, College Park. He writes the blog Family Inequality and is the author of The Family: Diversity, Inequality, and Social Change. You can follow him on Twitter or Facebook.

Note: Rose Malinowski Weingartner, a student in Cohen’s graduate seminar last year, wrote a paper about this concept, which helped him think about this.

Flashback Friday, in honor of Kathrine Switzer running the Boston Marathon 50 years after she was physically removed from the race because it was men-only.

The first Olympic marathon was held in 1896. It was open to men only and was won by a Greek named Spyridon Louis. A woman named Melpomene snuck onto the marathon route. She finished an hour and a half behind Louis, but beat plenty of men who ran slower or dropped out.

Women snuck onto marathon courses from that point forward. Resistance to their participation was strong and, I believe, reflects men’s often unconscious fear that women might in fact be their equals. Why else would they so vociferously object to women’s participation? If women are, indeed, so weak and inferior, what’s to fear from their running alongside men?

Illustrating what seems to be a degree of panic above and beyond an imperative to follow the rules, the two photos below show the response to Syracuse University student Kathrine Switzer’s running the men-only Boston Marathon in 1967 (Switzer registered for the marathon using her initials). After two miles, race officials realized one of their runners was a girl. Their response? To physically remove her from the race. Luckily, some of her male Syracuse teammates body-blocked their grab:

Why not let her run? The race was men-only, so her stats, whatever they might have been, were invalid. Why take her out of the race by force? For the same reason that women were excluded to begin with: their actual potential is not obviously inferior to men’s. If it were, there’d be no risk in letting her run. The only sex threatened by co-ed sports is the sex whose superiority is assumed.

Women were allowed to begin competing in marathons in 1972 — not so very long ago — and, just like Melpomene, while they’ve been slower on average, individual women have been beating individual men ever since. In fact, women have been getting faster and faster, shrinking the gender gap in completion times, because achievement and opportunity go hand in hand.

Thanks Kathrine Switzer, and congratulations.

Originally posted in 2012.

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.

Sometimes you have to take the long view.

This week Bill O’Reilly — arguably the most powerful political commentator in America — was let go from his position at Fox News. The dismissal came grudgingly. News broke that he and Fox had paid out $13 million to women claiming O’Reilly sexually harassed them; Fox didn’t budge. They renewed his contract. There was an outcry and there were protests. The company yawned. But when advertisers started dropping The O’Reilly Factor, they caved. O’Reilly is gone.

Fox clearly didn’t care about women — not “women” in the abstract, nor the women who worked at their company — but they did care about their bottom line. And so did the companies buying advertising space, who decided that it was bad PR to prop up a known sexual harasser. Perhaps the decision-makers at those companies also thought it was the right thing to do. Who knows.

Is this progress?

Donald Trump is on record gleefully explaining that being a celebrity gives him the ability to get away with sexual battery. That’s a crime, defined as unwanted contact with an “intimate part of the body” that is done to sexually arouse, gratify, or abuse. He’s president anyway.

And O’Reilly? He walked away with $25 million in severance, twice what all of his victims together have received in hush money. Fox gave Roger Ailes even more to go away: $40 million. Ailes, too, was ousted after multiple allegations of sexual harassment, and his going-away present was likewise twice what the women he had harassed received.

Man, sexism really does pay.

But they’re gone. Ailes and O’Reilly are gone. Trump is president, but Billy Bush, the Today host who cackled when Trump said “grab ’em by the pussy,” was fired, too. Bill Cosby finally had some comeuppance after decades of sexual abuse and rape. At the very least, his reputation is destroyed. Maybe these “victories” — for women, for feminists, for equality, for human decency — were driven purely by greed. And arguably, for all intents and purposes, the men are getting away with it. Trump, Ailes, O’Reilly, Bush, and Cosby are all doing fine. Nobody’s in jail; everybody’s rich beyond belief.

But we know what they did.

Until at least the 1960s, sexual harassment — along with domestic violence, stalking, sexual assault, and rape — went largely unregulated, unnoticed, and unnamed. There was no language to even talk about what women experienced in the workplace. Certainly no outrage, no ruined reputations, no dismissals, and no severance packages. The phrase “sexual harassment” didn’t exist.

In 1964, with the passage of the Civil Rights Act, it became illegal to discriminate against women at work, but only because the politicians who opposed the bill thought adding sex to race, ethnicity, national origin, and religion would certainly tank it. That’s how ridiculous the idea of women’s rights was at the time. But that was then. Today almost no one thinks women shouldn’t have equal rights at work.

What has happened at Fox News, in Bill Cosby’s hotel rooms, on the Access Hollywood bus, and on election day is proof that sexism is alive and well. But it’s not as healthy as it once was. Thanks to hard work by activists, politicians, and citizens, things are getting better. Progress is usually incremental. It requires endurance. Change is slow. Excruciatingly so. And this is what it looks like.

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.

Flashback Friday.

Bewildered by Nazi soldiers’ willingness to perpetrate the horrors of World War II, Stanley Milgram set out to test the extent to which average people would do harm if instructed by an authority figure. In what would end up being one of the most famous studies in the history of social psychology, the experimenter instructed study subjects to deliver a series of increasingly strong electric shocks to a stranger they could hear but not see (and who was reputed to have a heart condition). The unseen stranger (actually a tape recording) would yelp and cry and scream and beg… and eventually fall silent. If the study subject expressed a desire to stop administering the shocks, the experimenter would prod up to four times:

1. Please continue.
2. The experiment requires that you continue.
3. It is absolutely essential that you continue.
4. You have no other choice, you must go on.

If, after four prods, the subject still refused to administer the shock, the experiment was over.

In his initial study, though all participants at some point required prodding, 65 percent of them (26 out of 40) continued to shock the stranger all the way up to a (fake) 450 volts, a level identified as fatal and administered after the screaming had turned to silence. You can watch a BBC replication of the studies.

Originally posted in August 2010.

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.