
Flashback Friday.

In The Hearts of Men, Barbara Ehrenreich discusses the launch of Playboy in 1953 and how it forever changed how we think about single men.

At that time, a man who stayed single was suspected of homosexuality.  The idea of being an unmarried heterosexual adult of sound mind and body was totally foreign.  Hugh Hefner changed all of that by inventing a whole new kind of man, the playboy.  The playboy stayed single (so as to have lots of ladies), kept his money for himself and his indulgences (booze and ladies), and re-purposed the domestic sphere (enter the snazzy bachelor pad full of booze and ladies).

With this in mind, check out an attempt to attract advertising dollars from a 1969 issue (found at Vintage Ads). It nicely demonstrates Playboy's marketing of a new kind of man, one who lives a free and adventurous life, unburdened by the boring, dead-end job needed to support a wife and kids.

Text:

What sort of man reads Playboy? He’s an entertaining young guy happily living the good life. And loving every adventurous minute of it. One recipe for his upbeat life style? Fun friends and fine potables. Facts. PLAYBOY is read by one out of every three men under 50 who drink alcoholic beverages. Small wonder beverage advertisers invest more dollars in PLAYBOY issue per issue than they do in any other magazine. Need your spirit lifted? This must be the place.

Today, we commonly come across the idea that men are naturally averse to being tied down, but Hefner’s project reveals that this was an idea that was invented quite recently and promulgated for profit.

This post originally appeared in 2008.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Shock, frustration, and rage. That’s our reaction to the hate-filled video record that Elliot Rodger left behind. The 22-year-old, believed to have killed 6 people in Santa Barbara this week, left behind a terrible internet trail.

I cannot and will not speculate about the “mind of the killer” in such cases, but I can offer a little perspective on the nature and social context of these acts. This sometimes entails showing how mass shootings (or school shootings) remain quite rare, or that crime rates have plummeted in the past 20 years. I won’t repeat those reassurances here, but will instead address the bald-faced misogyny and malice of the videos. It outrages us to see a person look into a camera and clearly state his hatred of women — and then, apparently, to make good on his dark promises. It also raises other awful questions. Are these sentiments generally held? If you scratch the surface, are there legions of others who would and could pursue “retribution” as Mr. Rodger did? Is serious violence against women on the rise?

Probably not. Rates of sexual violence in the United States, whether measured by arrest or victimization, have declined by over 50 percent over the last twenty years. As the figure shows, the rape and sexual assault victimization rate dropped from over 4 per 1,000 (age 12 and older) in 1993 to about 1.3 per 1,000 in 2012. And, if you add up all intimate partner violence (including all rape, sexual assault, robbery, and aggravated assault committed by spouses, boyfriends, or girlfriends), the rate has dropped from almost 10 per 1,000 in 1994 to 3.2 per 1,000 in 2012. The numbers below include male victims, but the story remains quite consistent when the analysis is limited to female victims.

[Figure: rape/sexual assault and intimate partner violence victimization rates, 1993–2012]
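As a rough arithmetic check on those declines (a sketch using only the endpoint rates quoted above), both drops comfortably exceed 50 percent:

```python
# Percent decline between the endpoint rates quoted above
# (rates are per 1,000 persons age 12 and older).
def pct_decline(start, end):
    """Percentage drop from start to end."""
    return (start - end) / start * 100

# Rape and sexual assault: over 4 (1993) down to about 1.3 (2012)
print(round(pct_decline(4.0, 1.3), 1))   # 67.5

# Intimate partner violence: almost 10 (1994) down to 3.2 (2012)
print(round(pct_decline(10.0, 3.2), 1))  # 68.0
```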

Of course, misogyny and violence against women remain enormous social problems — on our college campuses and in the larger society. Moreover, the data at our disposal are often problematic and the recent trend is far less impressive than the big drop from 1993 to 2000. All that said, “retribution” videos and PUA threads shouldn’t obscure a basic social fact:  22-year-olds today are significantly less violent than 22-year-olds a generation ago.

Chris Uggen is a professor of sociology at the University of Minnesota and the author of Locked Out: Felon Disenfranchisement and American Democracy, with Jeff Manza. You can follow him on Twitter and at his blog, where this post originally appeared. Cross-posted at Pacific Standard.

Flashback Friday.

I found this 1917 advertisement for swastika jewelry while browsing through the NY Public Library Digital Gallery. The text reads in part:

To the wearer of swastika will come from the four winds of heaven good luck, long life and prosperity. The swastika is the oldest cross, and the oldest symbol in the world. Of unknown origin, in frequent use in prehistoric times, it historically first appeared on coins as early as the year 315 B.C.

As this suggests, while the swastika is most frequently associated with Hitler and the Nazis during World War II, and is still used by neo-Nazi groups, the symbol itself has a much longer history. From Wikipedia:

Archaeological evidence of swastika-shaped ornaments dates from the Neolithic period. An ancient symbol, it occurs mainly in the cultures that are in modern day India and the surrounding area, sometimes as a geometrical motif and sometimes as a religious symbol. It was long widely used in major world religions such as Hinduism, Buddhism and Jainism.

Before it was co-opted by the Nazis, the swastika decorated all kinds of things.  Uni Watch has tons of examples. Here it is on a Finnish military plane:

A Boy Scout badge:

A women’s hockey team from Edmonton called the Swastikas (1916):

Another hockey team:

In the comments, Felicity pointed to this example:

She writes:

My mom is a quilter and collects antique quilts (when she can afford them). She says that while in general, antique quilts and quilt-tops have gone up a great deal in price over the decades, there’s still one sort you can pick up for a song — swastika quilts.

It’s kind of sad to think of somebody in 1900 putting all that time and hand-stitching into a ‘good luck’ quilt that is now reviled.

All of these examples predate the Nazis’ adoption of the swastika as their symbol (which they changed slightly by tilting it at a 45-degree angle). Of course, the original meaning or usage of the swastika is beside the point now. Because it is so strongly associated with the Nazis, it’s impossible to use it today without people reading it as a Nazi symbol. In fact, it’s unimaginable that a group in the U.S. or Europe would use the swastika today without intending to draw on the Nazi association and the ideas espoused by Hitler and his party.

Wendy Christensen is an Assistant Professor at William Paterson University whose specialty includes the intersection of gender, war, and the media.  You can follow her on Twitter.

These are not fancy glasses:

[Image: celery vases]

They’re celery vases, and they’re exactly what they sound like: vases for celery. In the late 1800s, people used these vases to ostentatiously present celery to their guests. Celery, you see, was a status food: a rare delicacy that only wealthy families could afford and, therefore, a way to demonstrate your importance.

As celery began to decline in status — cheaper varieties became available and its cachet among the elite faded — celery vases were replaced by celery dishes. “Less conspicuous on the dining table,” writes decorative arts consultant Walter Richie, “the celery dish reflected the diminishing importance of celery.”

[Image: a celery dish]

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Last week the internet chuckled at the visual below.  It shows that, since Godzilla made his first movie appearance in 1954, he has tripled in size.

[Image: Godzilla's size in each film since 1954]

Kris Holt, at PolicyMic, suggests that his enlargement is in response to growing skylines. She writes:

As time has passed, buildings have grown ever taller too. If Godzilla had stayed the same height throughout its entire existence, it would be much less imposing on a modern cityscape.

This seems plausible.  Buildings have gotten taller and so, to preserve the original feel, Godzilla would have to grow too.

But rising buildings can’t be the only explanation. According to this graphic, the tallest building at the time of Godzilla’s debut was the Empire State Building, rising to 381 meters. The tallest building in the world today is (still) the Burj Khalifa. At 828 meters, it’s more than twice as tall as the Empire State Building, but far from three times as tall, which would be 1,143 meters.

[Graphic: the world's tallest buildings over time]
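The building arithmetic above can be checked in a couple of lines (a sketch using only the heights quoted in the text):

```python
# Heights quoted in the text, in meters.
EMPIRE_STATE = 381   # tallest building at Godzilla's 1954 debut
BURJ_KHALIFA = 828   # tallest building today

print(round(BURJ_KHALIFA / EMPIRE_STATE, 2))  # 2.17: over twice, not three times
print(3 * EMPIRE_STATE)                       # 1143: what tripling would require
```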

Is there an alternate explanation? Here’s one hypothesis.

In 1971, the average American was exposed to about 500 advertisements per day. Today, because of the internet, they are exposed to over 5,000.  Every. Day.

Media critic Sut Jhally argues that the flood of advertising has forced marketers to shift strategies. Specifically, he says:

So overwhelming has the commercial takeover of culture become, that it has now become a problem for advertisers who now worry about clutter and noise.  That is, how do you make your ads stand out from the commercial impressions that people are exposed to.

One strategy has been to ratchet up shock value.  “You need to get eyeballs. You need to be loud,” said Kevin Kay, Spike’s programming chief.

So, to increase shock value, everything is being made more extreme. Compared to the early ’90s, before the internet was a fixture in most homes and businesses, advertising — and I’m guessing media in general — has gotten more extreme in lots of ways. Things are sexier, more violent, more gorgeous, more satirical, and weirder.

So, Godzilla: because eyeballs.

Cross-posted at Pacific Standard.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

I guess I’m a little bit of a masochist, so I watched the trailer for Disney’s Cinderella remake, due out in 2015. All they do is show a shoe, but what a shoe it is! Notice anything different?

[Image: the shoe from the trailer]

Point to Gail Dines; Pamela Paul; Carmine Sarracino and Kevin Scott; and Kaarina Nikunen, Susanna Paasonen, and Laura Saarenmaa, all of whom argue that we’re seeing a “pornification” of everyday life.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

How has the distribution of college majors changed? This graph, borrowed from A Backstage Sociologist, shows bachelor’s degrees conferred in the 1970-71 academic year and those conferred 41 years later.

[Graph: bachelor's degrees conferred by major, 1970–71 vs. 2011–12]

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

On any given workday, over 31 million lunches are served to children in school cafeterias. Part of the U.S. Department of Agriculture’s (USDA) nutritional assistance efforts, the National School Lunch Program (NSLP) aims to deliver affordable and nutritious meals to the nation’s schoolchildren. After all, food plays a key part in helping them learn, grow, and thrive.

To reach those who need it most, the federal and local governments work together to offer free lunch to children whose parents cannot afford to pay for it. But money is only one way to pay for a meal: the “free” school lunch comes at other costs.

First, there are the health costs. At its inception, the NSLP was not designed as a social program. Instead, it was a response to agricultural overproduction and a surplus of farm produce, writes historian Susan Levine. The policymakers’ goal was to get rid of excess foods while supporting domestic production.

As a result, nutrition was a secondary concern: one year, eggs would be on the menu daily; another, they would hardly make an appearance. It wasn’t until World War II, when politicians grew concerned about the fitness of the nation’s men to fight, and when it became apparent that hungry children don’t do well in the classrooms they were newly required to sit in, that anyone took a serious look at what kids at school were actually eating.

By that time, it was too late. The program was already run like a business, and not even the introduction of nutritional standards helped. Today, those standards are outdated – children snack rather than eat three square meals, and are less physically active, requiring fewer calories – and they are almost impossible to follow given the budget restrictions school lunch planners face.

Private industry was quick to offer solutions, but it is more interested in profits than in schoolchildren’s waistlines. Enriched and fortified chips and candies of otherwise dubious nutritional value appear in school cafeterias and vending machines, often a more popular choice with kids than apples. Frozen and convenience foods are replacing fresh meals cooked on the premises. And labyrinthine regulations on meal calorie content, coupled with cafeterias’ financial realities, often mean that adding more sugar to students’ plates is the only way to bring a meal’s fat content down.

The food itself is not the only factor contributing to children’s undesirable health outcomes. Economist Rachana Bhatt finds that the amount of time students have to enjoy lunch also matters. Students tight on time – they must squeeze getting to the cafeteria, standing in line, eating, and cleaning up into their lunch break – might choose to skip the meal, leading them to overeat later, or to eat quickly, consuming more because of the delay in feeling full. Even if all school lunches offered healthy options, time would complicate their relationship with health outcomes: Bhatt found that students who had less time for lunch were more likely to be overweight.

The lunch may be free at the moment children choose their meal and sit down to eat it, then. But it may come at a substantial cost years down the line, when a young adult is paying for diabetes medication and doctor’s visits to monitor their blood pressure.

Read Part II of “No Such Thing as a Free School Lunch.”

Teja Pristavec is a graduate student in the sociology department at Rutgers University and an IHHCPAR Excellence Fellow. She blogs at A Serving of Sociology, where this post originally appeared. Cross-posted at Pacific Standard.