
Black history in Appalachia is largely hidden. Many people think that slavery was largely absent in central and southern Appalachia due to the poverty of the Scots-Irish who frequently settled in the area, and who were purportedly more “ruggedly independent” and pro-abolitionist in their sentiments. Others argue that the mountainous land was not appropriate for plantations, unlike other parts of the South, and so slavery in the area was improbable.

A Sample Slave Schedule
(Wikimedia Commons)

As historian John Inscoe and sociologist Wilma Dunaway show us, this is not the case. According to Inscoe, slavery existed in “every county in Appalachia in 1860.” Dunaway—who collected data from county tax lists, census manuscripts, records from slaveholders, and slave narratives from the area—estimates that 18% of Appalachian households owned slaves, compared with approximately 29% of Southern families overall.

While enslaved people in the Appalachian region were less likely to work on large plantations, their experiences were no less harsh. They often tended small farms and livestock, worked in manufacturing and commerce, served tourists, and labored in mining industries. Slave narratives, legal documents, and other records all show that slaves in Appalachia were treated harshly and punitively, despite claims that slavery was more “genteel” in the area than the deep South.

My own research, which focuses on the life experiences of Leslie [“Les”] Whittington, whose grandfather was enslaved, helps to document the presence of slavery in Appalachia and the consequences that exploitative system had for African Americans in the region. Les’s grandfather, John Myra, was owned by Joseph Stepp, who lived in Western North Carolina. Census records show that Joseph Stepp owned seven slaves in 1850, five women and two men, who together ranged from one to 32 years of age. Ten years later, in 1860, schedules show Stepp owned 21 slaves, making him one of the wealthiest property owners in Buncombe County, the county in which he and John Myra lived.

Joseph Stepp was not unique. According to Dunaway, slave owners in Appalachia “monopolized a much higher proportion of their communities’ land and wealth” compared to those outside the area, driving wealth inequality in the region. Part of the legacy of slavery, these inequities remained in place after the Civil War, reinforced by Jim Crow legislation that subjugated African Americans socially, culturally, and politically. Sociologist Karida Brown explains how Jim Crow laws led approximately six million African Americans to migrate from the South to the North between 1910 and 1970.

Poverty Rates in Appalachia by Race (U.S. Census Bureau, 2000).
Click to view report
Graphic by Evan Stewart

Those who stayed in Appalachia, such as John Myra and his descendants, faced continued restrictions, like living in racially segregated neighborhoods, having limited employment opportunities, and not being able to attend racially integrated schools. Such systematic forms of discrimination explain why racial disparities continue to exist today, even within a region where poverty among whites remains above the national average. To understand these existing inequities, we must document the past accurately.

Jacqueline Clark, PhD is a professor of sociology at Ripon College. Her teaching and research interests include social inequalities, the sociology of health and illness, and the sociology of jobs and work. 

In February of 1926, Carter G. Woodson helped establish “Negro History Week” to educate teachers, students, and community members about the accomplishments and experiences of Blacks in the United States. A native of Virginia, and the son of formerly enslaved parents, Woodson earned a PhD in history from Harvard University, and dedicated much of his life to writing and teaching about information largely omitted from textbooks and other historical accounts. Although Woodson died in 1950, his legacy continues, as “Negro History Week” eventually became “Black History Month” in 1976.

Nearly a century later, Black history is still at risk of erasure, especially in (once) geographically isolated areas like Appalachia. The standard narrative that the Scots-Irish “settled” Appalachia starting in the 18th century hides the fact that there were often violent interactions between European immigrants and indigenous people in the region. Even in the 1960s, when authors like Michael Harrington and Harry Caudill reported on Appalachian mountain folk, the people were depicted as Scots-Irish descendants, known for being poor, lazy, and backward; these representations are reinforced in contemporary accounts of the region, such as J. D. Vance’s wildly popular memoir Hillbilly Elegy.

Source: Wikimedia Commons
Source: Wikimedia Commons

Accounts like these offer stereotypical understandings of poor Appalachian whites, and at the same time, they ignore the presence and experiences of Blacks in the region. Work by social scientists William Turner and Edward Cabell, as well as “Affrilachia” poet Frank X. Walker, and historian Elizabeth Catte attempts to remedy this problem, but the dominant narrative of the region still centers on poor whites and their lives.

Work I have been doing documenting the life experiences of Leslie [“Les”] Whittington, a native of Western North Carolina and a descendant of formerly enslaved people, has opened my eyes to a historical narrative I never fully knew. African Americans, for instance, accounted for approximately 10% of the Appalachian region’s population by 1860, and many were enslaved, including Les’s grandfather, John Myra Stepp. Yet, their stories are glaringly missing from the dominant narrative of the region.

Source: Appalachian Regional Commission Census Data Overview

So too are the stories of Blacks living in Appalachia today. Even though the number of African American residents has increased in some parts of Appalachia, while the white population has decreased, little is formally documented about their lives. That absence has led scholar William Turner to refer to Blacks in Appalachia as a “racial minority within a cultural minority.” Not only does erasing African Americans from the past and present of Appalachia provide an inaccurate view of the region, but it also minimizes the suffering of poor Blacks, who, relative to their white counterparts, are and have been the poorest of an impoverished population.

Woodson established “Negro History Week” to document and share the history of Blacks in the United States, recognizing that, “If a race has no history, it has no worthwhile tradition, it becomes a negligible factor in the thought of the world, and it stands in danger of being exterminated.” The history of African Americans in the Appalachian region is largely absent from the area’s official record, and without making it part of the dominant narrative, we risk losing that history.

Jacqueline Clark, PhD is a professor of sociology at Ripon College. Her teaching and research interests include social inequalities, the sociology of health and illness, and the sociology of jobs and work. 

The first nice weekend after a long, cold winter in the Twin Cities is serious business. A few years ago some local diners joined the celebration with a serious indulgence: the boozy milkshake.

When I mentioned these milkshakes to a friend from the Deep South, she replied, “Oh, a bushwhacker! We had those all the time in college.” This wasn’t the first time she had dropped southern slang that was new to me, so off to Google I went.

According to Merriam-Webster, “to bushwhack” means to attack suddenly and unexpectedly, much as the alcohol in a milkshake can sneak up on you. The cocktail is a Nashville staple, but its origins trace back to the Virgin Islands in the 1970s.

Photo Credit: Beebe Bourque, Flickr CC
Photo Credit: Like_the_Grand_Canyon, Flickr CC

Here’s the part where the history takes a sordid turn: “Bushwhacker” was apparently also the nickname for guerrilla fighters in the Confederacy during the Civil War who would carry out attacks in rural areas (see, for example, the Lawrence Massacre). To be clear, I don’t know and don’t mean to suggest this had a direct influence in the naming of the cocktail. Still, the coincidence reminded me of the famous, and famously offensive, drinking reference to conflict in Northern Ireland.

Battle of Lawrence, Wikimedia Commons

When sociologists talk about concepts like “cultural appropriation,” we often jump to clear examples with a direct connection to inequality and oppression like racist Halloween costumes or ripoff products—cases where it is pretty easy to look at the object in question and ask, “didn’t they think about this for more than thirty seconds?”

Cases like the bushwhacker raise different, more complicated questions about how societies remember history. Even if the cocktail today had nothing to do with the Confederacy, the weight of that history starts to haunt the name once you know it. I think many people would be put off by such playful references to modern insurgent groups like ISIS. Then again, as Joseph Gusfield shows, drinking is a morally charged activity in American society. It is interesting to see how the deviance of drinking dovetails with bawdy, irreverent, or offensive references to other historical and social events. Can you think of other drinks with similar sordid references? It’s not all sex on the beach!

Evan Stewart is a Ph.D. candidate in sociology at the University of Minnesota. You can follow him on Twitter.

In the 1950s and ’60s, a set of social psychological experiments seemed to show that human beings were easily manipulated by low and moderate amounts of peer pressure, even to the point of violence. It was a stunning research program designed in response to the horrors of the Holocaust, which required the active participation of so many people, and the findings seemed to suggest that what happened there was part of human nature.

What we know now, though, is that this research was undertaken at an unusually conformist time. Mothers were teaching their children to be obedient and loyal and to have good manners. Conformity was a virtue and people generally sought to blend in with their peers. It wouldn’t last.

At the same time as the conformity experiments were happening, something that would contribute to changing how Americans thought about conformity was being cooked up: the psychedelic drug LSD.

Lysergic acid diethylamide was first synthesized in 1938 in the routine process of discovering new drugs for medical conditions. The first person to discover its psychedelic properties — its tendency to alter how we see and think — was the scientist who invented it, Albert Hofmann. He ingested it accidentally, only to discover that it induces a “dreamlike state” in which he “perceived an uninterrupted stream of fantastic pictures, extraordinary shapes with intense, kaleidoscopic play of colors.”

By the 1950s, LSD was being administered to unwitting Americans in a secret, experimental mind control program conducted by the United States Central Intelligence Agency, one that would last 14 years and occur in over 80 locations. Eventually the fact of the secret program would leak out to the public, and so would LSD.

It was the 1960s and America was going through a countercultural revolution. The Civil Rights movement was challenging persistent racial inequality, the women’s and gay liberation movements were staking claims on equality for women and sexual minorities, the sexual revolution said no to social rules surrounding sexuality and, in the second decade of an intractable war in Vietnam, Americans were losing patience with the government. Obedience had gone out of style.

LSD was the perfect drug for the era. For its proponents, there was something about the experience of being on the drug that made the whole concept of conformity seem absurd. A new breed of thinker, the “psychedelic philosopher,” argued that LSD opened one’s mind and immediately revealed the world as it was, not the world as human beings invented it. It revealed, in other words, the social constructedness of culture.

In this sense, wrote the science studies scholar Ido Hartogsohn, LSD was truly “countercultural,” not only “in the sense of being peripheral or opposed to mainstream culture [but in] rejecting the whole concept of culture.” Culture, the philosophers claimed, shut down our imagination and psychedelics were the cure. “Our normal word-conditioned consciousness,” wrote one proponent, “creates a universe of sharp distinctions, black and white, this and that, me and you and it.” But on acid, he explained, all of these rules fell away. We didn’t have to be trapped in a conformist bubble. We could be free.

The cultural influence of the psychedelic experience, in the context of radical social movements, is hard to overstate. It shaped the era’s music, art, and fashion. It gave us tie-dye, The Grateful Dead, and stuff like this:


via GIPHY

The idea that we shouldn’t be held down by cultural constrictions — that we should be able to live life as an individual as we choose — changed America.

By the 1980s, mothers were no longer teaching their children to be obedient and loyal and to have good manners. Instead, they taught them independence and the importance of finding one’s own way. For decades now, children have been raised with slogans of individuality: “do what makes you happy,” “it doesn’t matter what other people think,” “believe in yourself,” “follow your dreams,” or the more up-to-date “you do you.”

Today, companies choose slogans that celebrate the individual, encouraging us to stand out from the crowd. In 2014, for example, Burger King abandoned its 40-year-old slogan, “Have it your way,” for a plainly individualistic one: “Be your way.” Across the consumer landscape, company slogans promise that buying their products will mark the consumer as special or unique. “Stay extraordinary,” says Coke; “Think different,” says Apple. Brands encourage people to buy their products in order to be themselves: Ray-Ban says “Never hide”; Express says “Express yourself,” and Reebok says “Let U.B.U.”

In surveys, Americans increasingly defend individuality. Millennials are twice as likely as Baby Boomers to agree with statements like “there is no right way to live.” They are half as likely to think that it’s important to teach children to obey, instead arguing that the most important thing a child can do is “think for him or herself.” Millennials are also more likely than any other living generation to consider themselves political independents and be unaffiliated with an organized religion, even if they believe in God. We say we value uniqueness and are critical of those who demand obedience to others’ visions or social norms.

Paradoxically, it’s now conformist to be an individualist and deviant to be a conformist. So much so that a subculture, “normcore,” emerged to make opting into conformity a virtue. As one commentator described it, “Normcore finds liberation in being nothing special…”

Obviously LSD didn’t do this all by itself, but it was certainly in the right place at the right time. And as a symbol of the radical transition that began in the 1960s, there’s hardly one better.

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.

Flashback Friday.

In Race, Ethnicity, and Sexuality, Joane Nagel looks at how race, ethnicity, and sexuality have been used to create new national identities and frame colonial expansion. In particular, White female sexuality, presented as modest and appropriate, was often contrasted with the sexuality of colonized women, who were often depicted as promiscuous or immodest.

This 1860s advertisement for Peter Lorillard Snuff & Tobacco illustrates these differences. According to Toby and Will Musgrave, writing in An Empire of Plants, the ad drew on a purported Huron legend of a beautiful white spirit bringing them tobacco.

There are a few interesting things going on here. We have the association of femininity with a benign nature: the women are surrounded by various animals (monkeys, a fox and a rabbit, among others) who appear to pose no threat to the women or to one another. The background is lush and productive.

Racialized hierarchies are embedded in the personification of the “white spirit” as a White woman, descending from above to provide a precious gift to Native Americans, similar to imagery drawing on the idea of the “white man’s burden.”

And as often occurred (particularly as we entered the Victorian Era), there was a willingness to put non-White women’s bodies more obviously on display than the bodies of White women. The White woman above is actually less clothed than the American Indian woman, yet her arm and the white cloth are strategically placed to hide her breasts and crotch. On the other hand, the Native American woman’s breasts are fully displayed.

So, the ad provides a nice illustration of the personification of nations with women’s bodies, essentialized as close to nature, but arranged hierarchically according to race and perceived purity.

Originally posted in 2010.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

Flashback Friday.

The percent of carless households in any given city correlates very well with the percent of homes built before 1940. So what happened in the 40s?

According to Left for LeDroit, it was suburbs:

The suburban housing model was — and, for the most part, still is — based on several main principles, most significantly, the uniformity of housing sizes (usually large) and the separation of residential and commercial uses. Both larger lots and the separation of uses create longer distances between any two points, requiring a greater effort to go between home, work, and the grocery store.

These longer distances between daily destinations made walking impractical and the lower population densities made public transit financially unsustainable. The only solution was the private automobile, which, coincidentally, benefited from massive government subsidies in the form of highway building and a subsidized oil infrastructure and industry.

Neighborhoods designed after World War II are designed for cars, not pedestrians; the opposite is true for neighborhoods designed before 1940. Whether one owns a car, and how far one drives, depends on the type of city, not on personal characteristics like environmental friendliness. Ezra Klein puts it nicely:

In practice, this doesn’t feel like a decision imposed by the cold realities of infrastructure. We get attached to our cars. We get attached to our bikes. We name our subway systems. We brag about our short walks to work. People attach stories to their lives. But at the end of the day, they orient their lives around pretty practical judgments about how best to live. If you need a car to get where you’re going, you’re likely to own one. If you rarely use your car, have to move it a couple of times a week to avoid street cleaning, can barely find parking and have trouble avoiding tickets, you’re going to think hard about giving it up. It’s not about good or bad or red or blue. It’s about infrastructure.

Word.

Neither Ezra nor Left for LeDroit, however, point out that every city, whether it was built for pedestrians or cars, is full of people without cars. In the case of car-dependent cities, this is mostly people who can’t afford to buy or own a car. And these people, in these cities, are royally screwed. Los Angeles, for example, is the most expensive place in the U.S. to own a car and residents are highly car-dependent; lower income people who can’t afford a car must spend extraordinary amounts of time using our mediocre public transportation system, such that carlessness contributes significantly to unemployment.

Originally posted in 2010.

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.

Originally posted at Family Inequality.

It looks like the phrase “start a family” started to mean “have children” (after marriage) sometime in the 1930s and didn’t catch on till the 1940s or 1950s, which happens to be the most pro-natal period in U.S. history. Here’s the Google ngrams trend for the phrase as percentage of all three-word phrases in American English:

Google ngrams trend for “start a family” in American English

Searching the New York Times, I found the earliest uses applied to fish (1931) and plants (1936).

Twitter reader Daniel Parmer relayed a use from the Boston Globe on 8/9/1937, in which actress Merle Oberon said, “I hope to be married within the next two years and start a family. If not, I shall adopt a baby.”

Next appearance in the NYT was 11/22/1942, in a book review in which a man marries a woman and “brings her home to start a family.” After that it was 1948, in this 5/6/1948 description of those who would become baby boom families, describing a speech by Ewan Clague, the Commissioner of Labor Statistics, who is remembered for introducing statistics on women and families into Bureau of Labor Statistics reports. From NYT:

New York Times clipping, May 6, 1948

That NYT reference is interesting because it came shortly after the first use of “start a family” in the JSTOR database that unambiguously refers to having children, in a report published by Clague’s BLS:

Trends of Employment and Labor Turn-Over: Monthly Labor Review, Vol. 63, No. 2 (AUGUST 1946): …Of the 584,000 decline in the number of full-time Federal employees between June 1, 1945 and June 1, 1946, almost 75 percent has been in the women’s group. On June 1, 1946, there were only 60 percent as many women employed full time as on June 1, 1945. Men now constitute 70 percent of the total number of full-time workers, as compared with 61 percent a year previously. Although voluntary quits among women for personal reasons, such as to join a veteran husband or to start a family, have been numerous, information on the relative importance of these reasons as compared with involuntary lay-offs is not available…

It’s interesting that, although this appears to be a pro-natal shift, insisting that children arrive before the definition of family is met, it also may have had a work-and-family implication of leaving the labor force. Maybe it reinforced the naturalness of women dropping out of paid work when they had children, something that was soon to emerge as a key battleground in the gender revolution.

Philip N. Cohen, PhD is a professor of sociology at the University of Maryland, College Park. He writes the blog Family Inequality and is the author of The Family: Diversity, Inequality, and Social Change. You can follow him on Twitter or Facebook.

Note: Rose Malinowski Weingartner, a student in Cohen’s graduate seminar last year, wrote a paper about this concept, which helped him think about this.

Flashback Friday, in honor of Kathrine Switzer running the Boston marathon 50 years after race officials tried to physically remove her from the then men-only race.

The first Olympic marathon was held in 1896. It was open to men only and was won by a Greek named Spyridon Louis. A woman named Melpomene snuck onto the marathon route. She finished an hour and a half behind Louis, but beat plenty of men who ran slower or dropped out.

Women snuck onto marathon courses from that point forward. Resistance to their participation was strong and, I believe, reflects men’s often unconscious fear that women might in fact be their equals. Why else would they so vociferously object to women’s participation? If women are, indeed, so weak and inferior, what’s to fear from their running alongside men?

Illustrating what seems to be a degree of panic above and beyond an imperative to follow the rules, the two photos below show the response to Syracuse University student Kathrine Switzer running the men-only Boston marathon in 1967 (Switzer registered for the marathon using her initials). After two miles, race officials realized one of their runners was a girl. Their response? To physically remove her from the race. Luckily, some of her male Syracuse teammates body-blocked the grab:

Why not let her run? The race was men-only, so her stats, whatever they may be, were invalid. Why take her out of the race by force? For the same reason that women were excluded to begin with: their actual potential is not obviously inferior to men’s. If it were, there’d be no risk in letting her run. The only sex that is threatened by co-ed sports is the sex whose superiority is assumed.

Women were allowed to begin competing in marathons starting in 1972 — not so very long ago — and, just like Melpomene, while they’ve been slower on average, individual women have been beating individual men ever since. In fact, women have been getting faster and faster, shrinking the gender gap in completion times, because achievement and opportunity go hand in hand.

Thanks Kathrine Switzer, and congratulations.

Originally posted in 2012.

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.