Culture

Men and women in Western societies often look more different than they naturally are because of the incredible amount of work we put into trying to look different.  Often this difference is framed as “natural” but, in fact, it takes a lot of time, energy, and money.  The dozens of half-drag portraits from photographer Leland Bobbé illustrate just how powerful the illusion can be.  Drag, of course, makes a burlesque of the feminine; it is hyperfeminine.  But almost all of us are doing drag, at least a little bit, much of the time.

Here’s an example of one we have permission to use for the cover of our Gender textbook:

Many more at Leland Bobbé’s website.

Lisa Wade, PhD, is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Earlier this week I wrote a post asking Is the Sky Blue?, discussing the way that culture influences our perception of color.  In the comments thread Will Robertson linked to a fascinating 8-minute BBC Horizon clip.  The video features an expert explaining how language changes how children process color in the brain.

The video also travels to Namibia to visit the Himba tribe.  They have different color categories than we do in the West, which makes them “color blind” to certain distinctions we parse easily, but also reveals ways in which we, too, can be color blind.

When we categorize people into “races,” we do so using a number of physical characteristics, but especially skin color. Our racial system is based on the idea that skin color is a clearly distinguishing trait, especially when we use terms like “black” and “white,” which we generally conceive of as opposite colors.

Of course, because race is socially constructed, there’s actually enormous diversity within the categories we’ve created, and great overlap between them, as we’ve forced all humans on earth into just a few groupings.  And terms like “black” and “white” don’t really describe the shades of actual human skin.

Artist and photographer Angelica Dass has an art project, Humanae, that illustrates the tremendous diversity in skin color (via co.CREATE, sent in by Dolores R., Mike R., and YetAnotherGirl). She uses an 11×11 pixel sample of each individual’s face to match them to a specific color in the Pantone color system, which catalogs thousands of hues and is used in many types of manufacturing to standardize and match colors. She then photographs them in front of a background of their Pantone color.
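
The matching step is easy to picture in code. Here is a minimal sketch in Python of the general idea: average the pixels in an 11×11 patch of a portrait, then pick the closest entry from a palette. The file name, patch coordinates, and the tiny stand-in palette are all hypothetical, and Dass’s actual Pantone-matching workflow may differ.

```python
# A rough sketch (not Dass's actual workflow): average an 11x11 pixel patch
# from a portrait, then find the nearest swatch in a small stand-in palette.
from PIL import Image
import numpy as np

# Hypothetical swatches (name -> RGB); a real Pantone library holds thousands.
PALETTE = {
    "swatch-light":  (239, 223, 187),
    "swatch-medium": (203, 160, 126),
    "swatch-deep":   (121, 85, 61),
    "swatch-darker": (79, 44, 29),
}

def average_patch(path, cx, cy, size=11):
    """Return the mean RGB of a size x size patch centered at (cx, cy)."""
    img = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    half = size // 2
    patch = img[cy - half:cy + half + 1, cx - half:cx + half + 1]
    return tuple(patch.reshape(-1, 3).mean(axis=0))

def nearest_swatch(rgb):
    """Pick the palette entry closest to rgb (squared Euclidean distance)."""
    return min(PALETTE, key=lambda name: sum((a - b) ** 2
                                             for a, b in zip(rgb, PALETTE[name])))

# Hypothetical usage: sample a patch near the center of the face.
# print(nearest_swatch(average_patch("portrait.jpg", cx=640, cy=480)))
```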

Currently the project is very heavily focused on people we’d generally categorize as White — there are a few individuals from other groups, but not many, and in no way does it represent “every skin tone,” as I’ve seen it described in some places. So that’s a major caveat.

That said, I do think the project shows how reductive our system of classifying people by skin tone is, when you look at the range of colors even just among Whites — why does it make sense to throw most of these people into one category and say they’re all physically the same in a meaningful way that separates them from everyone else (and then connect those supposedly shared physical traits to non-physical ones)? And which part of the body do we use to do so, since many of us have various shades on our bodies? Or which time of year, since many of us change quite a bit between summer and winter?

Maru sent in a similar example; French artist Pierre David created “The Human Pantone,” using 40 models. We think racial categories make sense because we generally think of the extremes, but by showing individuals arranged according to hue, the project highlights the arbitrariness of racial boundaries. Where, exactly, should the dividing lines be?

Via TAXI.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

Food shortages during World War II required citizens and governments to get creative, changing the gastronomical landscape in surprising ways.   Many ingredients that the British were accustomed to were unavailable.  Enter the carrot.

According to my new favorite museum, the Carrot Museum, carrots were plentiful, but the English weren’t very familiar with the root.  Wrote the New York Times in 1942: “England has a goodly store of carrots. But carrots are not the staple items of the average English diet. The problem…is to sell the carrots to the English public.”

So the British government embarked on a propaganda campaign designed to increase reliance on carrots.  It linked carrot consumption to patriotism, disseminated recipes, and made bold claims about the carrot’s ability to improve eyesight (useful, considering the frequent blackout conditions).

Here’s a recipe for Carrot Fudge:

You will need:

  • 4 tablespoons of finely grated carrot
  • 1 gelatine leaf
  • orange essence or orange squash
  • a saucepan and a flat dish

Put the carrots in a pan and cook them gently in just enough water to keep them covered, for ten minutes. Add a little orange essence, or orange squash to flavour the carrot. Melt a leaf of gelatine and add it to the mixture. Cook the mixture again for a few minutes, stirring all the time. Spoon it into a flat dish and leave it to set in a cool place for several hours. When the “fudge” feels firm, cut it into chunks and get eating!

Disney created characters in an effort to help:

The government even used carrots as part of an effort to misinform its enemies:

…Britain’s Air Ministry spread the word that a diet of carrots helped pilots see Nazi bombers attacking at night. That was a lie intended to cover the real matter of what was underpinning the Royal Air Force’s successes: the latest, highly efficient onboard Airborne Interception Radar, also known as AI.

When the Luftwaffe’s bombing assault switched to night raids after the unsuccessful daylight campaign, British Intelligence didn’t want the Germans to find out about the superior new technology helping protect the nation, so they created a rumour to afford a somewhat plausible-sounding explanation for the sudden increase in bombers being shot down… The Royal Air Force bragged that the great accuracy of British fighter pilots at night was a result of them being fed enormous quantities of carrots and the Germans bought it because their folk wisdom included the same myth.

But here’s the most fascinating part.

It turns out that, precisely because of rationing, British people of all classes ate more healthily.

…many poor people had been too poor to feed themselves properly, but with virtually no unemployment and the introduction of rationing, with its fixed prices, they ate better than in the past.

Meanwhile, among the better off, rationing reduced the intake of unhealthy foods.  There were very few sweets available and people ate more vegetables and fewer fatty foods.  As a result “…infant mortality declined and life expectancy increased.”

I love carrots. I’m eating them right now.

To close, here are some kids eating carrots on a stick:

Via Retronaut.  For more on life during World War II, see our posts on staying off the phones and carpool propaganda (“When You Ride ALONE, You Ride With Hitler!”) and our coverage of life in Japanese Internment Camps, women in high-tech jobs, the demonization of prostitutes, and the German love/hate relationship with jazz.

A while back I was summoned for jury duty and found myself being considered for a case against a young Latina with a court translator.  She was accused of selling counterfeit Gucci and Chanel purses on the street in L.A.  After introducing the case, the judge asked: “Is there any reason why you could not objectively apply the law?” My hand shot up.

I said:

I have to admit, I’m kind of disgusted that state resources are being used to protect the corporate interests of Chanel and Gucci.

Then I gave a spiel about corruption in the criminal justice system and finished up with:

I think that society should be protecting its weakest members, not penalizing them for trivial infractions. There is no way in good conscience I could give that girl a criminal record, I don’t care if she’s guilty. Some things are more important than the rules.

I was summarily dismissed.

Criminal prosecutions are one way to decrease counterfeiting and, yes, protect corporate interests.  Shaynah H. sent in another: shame.  This National Crime Prevention Council/Bureau of Justice Assistance ad, spotted in a mall in Portland, tells you that if you buy knock-offs, you are “a phony.”

Yikes.  I would have preferred “savvy” or “cost-conscious.”  But, no, the message is clear.  You are a fake person, a liar, a hypocrite.  You are insincere and pretentious.  You are an impostor.  (All language borrowed from the word’s definition.)  And those are not things that anyone wants to be.

But, honestly, why does anyone care?

I suspect that counterfeits don’t really cut into Chanel’s profits directly.  The people who buy bags that cost thousands of dollars are not going to try to save some pennies by buying a knock-off.  Or, to put it the other way around, the people who are buying the counterfeits wouldn’t suddenly start buying the originals if their supply ran out.

Instead, policing the counterfeiters is a response to a much more intangible concern, something Pierre Bourdieu called “cultural capital.”  You see, a main reason why people spend that kind of money on handbags is to be seen as the kind of person who does.  The handbags are a signal to others that they are “that kind” of person, the kind that can afford a real Gucci.  The products, then, are ways that people put boundaries between themselves and lesser others.

But when lesser others can buy knock-offs on the street in L.A. and parade around as if they can buy Gucci too, the whole point of buying Gucci is lost!  If the phony masses can do it, it no longer serves to distinguish the elites from the rest of us.

In this sense, Chanel and Gucci are very interested in reducing counterfeiting; the rich people who buy their products will only do so if buying them proves that they’re special.

The term sexual dimorphism refers to differences between males and females of the same species.  Some animals are highly sexually dimorphic. Male elephant seals outweigh females by more than 2,500 pounds; peacocks put on a color show that peahens couldn’t mimic in their wildest dreams; and a male anglerfish’s whole life involves finding a female, latching on, and dissolving until there’s nothing left but his testicles (yes, really).

On the spectrum of very high to very low dimorphism, humans are on the low end.  We’re just not that kind of species.  Remove the gendered clothing styles, makeup, and hair differences and we’d look more alike than we think we do.

Because we’re invested in men and women being different, however, we tend to be pleased by exaggerated portrayals of human sexual dimorphism (for example, in Tangled). Game designer-in-training Andrea Rubenstein has shown us that we extend this ideal to non-human fantasy as well.  She points to a striking dimorphism (mimicking Western ideals) in World of Warcraft creatures:

Annalee Newitz at Wired writes:

[Rubenstein] points out that these female bodies embody the “feminine ideal” of the supermodel, which seems a rather out-of-place aesthetic in a world of monsters. Supermodelly Taurens wouldn’t be so odd if gamers had the choice to make their girl creatures big and muscley, but they don’t. Even if you wanted to have a female troll with tusks, you couldn’t. Which seems especially bizarre given that this game is supposed to be all about fantasy, and turning yourself into whatever you want to be.

It appears that the supermodel-like females weren’t part of the original design of the game.  Instead, the Alpha version included far less dimorphism among the Taurens and the Trolls, for example:

Newitz says that the female figures were changed in response to player feedback:

Apparently there were many complaints about the women of both races being “ugly” and so the developers changed them into their current incarnations.

The dimorphism in WoW is a great example of how gender difference is, in part, an ideology.  It’s a desire that we impose onto the world, not reality in itself.  We make even our fantasy selves conform to it.  Interestingly, when people stray from affirming the ideology, they can face pressure to align themselves with its defenders.  It appears that this is exactly what happened in WoW.

Cross-posted at Jezebel.

I’ve been watching the response to Anne-Marie Slaughter’s Why Women Still Can’t Have It All roll out across the web.  Commentators are making excellent points, but E.J. Graff at The American Prospect sums it up nicely:

Being both a good parent and an all-out professional cannot be done the way we currently run our educational and work systems… Being a working parent in our society is structurally impossible. It can’t be done right… You’ll always be failing at something — as a spouse, as a parent, as a worker. Just get used to that feeling.

In other words, the cards are stacked against you and it’s gonna suck.

And it’s true; trust me.  As someone who’s currently knee-deep in the literature on parenting and gender, I’m pleased to see the structural contradictions between work and parenting being discussed.

But I’m frustrated about an invisibility, an erasure, a taboo that goes unnamed.  It seems like it should at least get a nod in this discussion.  I’m talking about the one really excellent solution to the clusterf@ck that is parenting in America.

Don’t. Have. Kids.

No really — just don’t have them.

Think about it.  The idea that women will feel unfulfilled without children and die from regret is one of the most widely-endorsed beliefs in America.  It’s downright offensive to some that a woman would choose not to have children.  Accusations of “selfishness” abound.  It’s a given that women will have children, and many women will accept it as a given.

But we don’t have to.  The U.S. government fails to support our childrearing efforts with sufficient programs (framing it as a “choice” or “hobby”), the market is expensive (child care costs more than college in most states), and we’re crammed into nuclear family households (making it difficult to rely on extended kin, real or chosen).  And the results are clear: raising children changes the quality of your life.  In good ways, sure, but in bad ways too.

Here are findings from the epic data collection engine that is the World Values Survey, published in Population and Development Review. If you live in the U.S., look at the blue line representing “liberal” democracies (that’s what we are).  The top graph shows that, among 20- to 39-year-olds, having one child is correlated with a decrease in happiness, having two with a larger decrease, and so on up to four or more.  If you’re 40 or older, having one child is correlated with a decrease in happiness and having more children with a smaller one.  But even the happiest people, with four or more children, are slightly less happy than those with none at all.

Don’t shoot the messenger.

Long before Slaughter wrote her article for The Atlantic, when she floated the idea of writing it to a female colleague, she was told that it would be a “terrible signal to younger generations of women.”  Presumably, this is because having children is compulsory, so it’s best not to demoralize them.  Well, I’ll take on that Black Badge of Dishonor.  I’m here to tell still-childless women (and men, too) that they can say NO if they want to.  They can reject a lifetime of feeling like they’re “always… failing at something.”

I wish it were different. I wish that men and women could choose children and know that the conditions under which they parent will be conducive to happiness.  But they’re not.  As individuals, there’s little we can do to change this, especially in the short term.  We can, however, try to wrest some autonomy from the relentless warnings that we’ll be pathetically-sad-forever-and-ever if we don’t have babies.  And, once we do that, we can make a more informed measurement of the costs and benefits.

Some of us will choose to spend our lives doing something else instead.  We’ll learn to play the guitar, dance flamenco (why not?), get more education, travel to faraway places, write a book, or start a welcome tumblr.  We can help raise our nieces and nephews, easing the burden on our loved ones, or focus on nurturing our relationships with other adults.  We can live in the cool neighborhoods with bad school districts and pay less in rent because two bedrooms are plenty.  We can eat out, sleep in, and go running.  We can have extraordinary careers, beautiful relationships, healthy lives, and lovely homes.  My point is: there are lots of great things to do in life… having children is only one of them.

Just… think about it.  Maybe you can spend your extra time working to change the system for the better.  Goodness knows parents will be too tired to do it.

The Washington Post has provided an image from the New England Journal of Medicine that illustrates changing causes of death. Comparing the top 10 causes of death in 1900 and 2010 (using data from the Centers for Disease Control and Prevention), we see first that mortality rates have dropped significantly, with deaths from the top 10 causes combined dropping from about 1100/100,000 to about 600/100,000:

And not surprisingly, what we die from has changed, with infectious diseases decreasing and being replaced by so-called lifestyle diseases. Tuberculosis, a scourge in 1900, is no longer a major concern for most people in the U.S. Pneumonia and the flu are still around, but much less deadly than they used to be. On the other hand, heart disease has increased quite a bit, though not nearly as much as cancer.

The NEJM has an interactive graph that lets you look at overall death rates for every decade since 1900, as well as isolate one or more causes. For instance, here’s a graph of mortality rates for pneumonia and influenza, showing the general decline over time but also the major spike in deaths caused by the 1918 influenza epidemic:

The graphs accompany an article looking at the causes of death described in the pages of NEJM since its founding in 1812; the overview highlights the social context of the medical profession. In 1812, doctors had to consider the implications of a near-miss by a cannonball, teething could apparently kill you, and doctors were concerned with a range of fevers, from bilious to putrid. By 1912, the medical community was explaining disease in terms of microbes, the population had gotten healthier, and an editorial looked forward to a glorious future:

Perhaps in 1993, when all the preventable diseases have been eradicated, when the nature and cure of cancer have been discovered, and when eugenics has superseded evolution in the elimination of the unfit, our successors will look back at these pages with an even greater measure of superiority.

As the article explains, the field of medicine is inextricably connected to larger social processes, which both influence medical practice and can be reinforced by definitions of health and disease:

Disease definitions structure the practice of health care, its reimbursement systems, and our debates about health policies and priorities. These political and economic stakes explain the fierce debates that erupt over the definition of such conditions as chronic fatigue syndrome and Gulf War syndrome. Disease is a deeply social process. Its distribution lays bare society’s structures of wealth and power, and the responses it elicits illuminate strongly held values.