In 1999 Jackson Katz headlined a documentary that powerfully revealed the mask of masculinity, a pretense of stoicism and readiness for violence that many men feel compelled to put on, at least part of the time. The film, Tough Guise: Violence, Manhood, and American Culture, became a staple in classes on gender across the country.
Today marks the release of Tough Guise 2, and SocImages was given the honor of debuting an exclusive clip from the new film. In the segment below, Katz explains that men aren’t naturally violent but, instead, often learn how to be. Focusing on socialization, however, threatens to render the agents of socialization invisible. In other words, Katz argues, men don’t just learn to be more violent than they otherwise would be; they are actively taught.
He begins with the fact that the video game and film industries both take money from companies that make firearms to feature their products. The U.S. military then uses the video game Call of Duty for recruitment and training. It’s no use arguing whether the media, the military, or the gun industry is responsible for rates of violence, he observes, since they’re in cahoots. These extreme examples intersect with the everyday, mundane lessons about the importance of being “real men” that boys and men receive from the media and their peers, parents, coaches, and more.
This update of the original will tell the compelling story of manhood and violence to a new generation and remind older ones of the ongoing crisis of masculinity in America.
The partial U.S. map below shows the proportion of the population that was identified as enslaved in the 1860 census. County by county, it reveals where the economy was most dominated by slavery.
A new paper by Avidit Acharya, Matthew Blackwell, and Maya Sen finds that the proportion of enslaved residents in 1860 — 153 years ago — predicts race-related beliefs today. The higher the share of a county’s population that was enslaved, the less likely its contemporary white residents are to identify as Democrats or to support affirmative action, and the more likely they are to express negative beliefs about black people.
Acharya and colleagues don’t stop there; they try to figure out why. They consider a range of possibilities, including contemporary demographics, “racial threat” (the idea that large black populations make whites uneasy), urban-rural differences, the destruction and disintegration caused by the Civil War, and more. Controlling for all of these, the authors conclude that the results are still partly explained by a simple phenomenon: parents teaching their children. The bias of Southern whites during slavery has been passed down intergenerationally.
This four-minute BBC video documents a population of ethnic German-Americans, the descendants of Germans who immigrated to Texas 150 years ago. Over the generations, their language evolved into a unique dialect, which linguist Hans Boas is now trying to document before it dies out. While the dialect persisted for a very long time, World War II and the ensuing stigma against anything German brought an end to its transmission. Today’s speakers are all 60 or older and will soon be gone.
Lisa Wade is a professor of sociology at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. You can follow her on Twitter and Facebook.
While some austerity advocates genuinely (if incorrectly) fear the consequences of deficit spending, the strongest proponents are concerned only with slashing government programs or the use of public employees to provide them. In other words, their aim is to weaken public programs and/or convert them into opportunities for private profit. One measure of their success has been the steady decline in public employment. Floyd Norris, writing in the New York Times, notes:
For jobs, the past four years have been a wash.
The December jobs figures out today indicate that there were 725,000 more jobs in the private sector than at the end of 2008 — and 697,000 fewer government jobs. That works out to a private-sector gain of 0.6 percent, and a government-sector decline of 3.1 percent.
In total, the number of people with jobs is up by 28,000, or 0.02 percent.
How does that compare? It is by far the largest four-year decline in government employment since the 1944-48 term. That decline was caused by the end of World War II; this one was caused largely by budget limitations.
The chart below, taken from the same post, also reveals just how weak private-sector job creation has been over the past 12 years (compare the top three rows, the presidencies of Obama and Bush, with the rows below).
The next graphic, from the New York Times, highlights just how significant the decline in public employment has been in this business cycle compared with past ones. Each line shows the percentage change in public-sector employment for specified months after the start of a recession. Our most recent recession began December 2007 and ended June 2009. As you can see, what is happening now is far from usual.
It is also worth noting that despite claims that most Americans want to see cuts in major federal government programs, the survey data show the opposite. For example, see the following graphic from Catherine Rampell’s blog post. As Rampell explains:
In every category except for “aid to world’s needy,” more than half of the respondents wanted either to keep spending levels the same or to increase them. In the “aid to world’s needy” category, less than half wanted to cut spending.
Not surprisingly, this assault on government spending and employment will have real consequences for the economy and job creation. All of this takes us back to the starting point — we are talking policy here. Whose interests are served by these trends?
Several factors were in play in the 1920s in the emergence of what came to be known as flappers: teenagers and young women who flouted convention and spent their time pursuing fun instead of settling down to raise children in the prime of their lives. Many entered college or the workforce and felt entitled to make their own decisions about how to live.
A lot of young men did not return home from World War I, which left an entire cohort of women without enough husbands to go around. The horror of the war (and the Spanish flu pandemic of 1918) also impressed young people with the knowledge that life is short and could end at any moment. Instead of staying home preparing to marry a man who might never come, young women wanted to spend what time they had enjoying all that life had to offer.
Movies popularized the image of the fun-loving and free-thinking woman throughout the US and Europe. The 1920 movie The Flapper introduced the term in the United States. The title character, Ginger, was a wayward girl who flouted the rules of society. Played by Olive Thomas, a former Ziegfeld Girl (left), Ginger had so much fun that a generation of lonely young women wanted to be like her. Another role model was stage and screen actress Louise Brooks (right), who also modeled for artists and fashion designers. She was the inspiration for the flapper comic strip Dixie Dugan.
Clara Bow wasn’t the first flapper on screen, but she was certainly a role model for young women of the era. She didn’t play by the rules, and was tabloid fodder for years thanks to her sexual escapades with the biggest movie stars of the time. Bow made her first film in 1922, and her career peaked in 1927 with the film It. “It” was defined as the sexual allure some girls have and others don’t. Bow’s fans wanted “it,” so they copied her look and behavior.
The automobile was another factor in the rise of flapper culture. Cars meant a woman could come and go as she pleased, travel to speakeasies and other entertainment venues, and use the large vehicles of the day for heavy petting or even sex.
These young women had plenty of opportunities for fun. Although Prohibition drove alcohol underground, that only added to its allure. Postwar prosperity allowed for leisure time and the means to spend that time drinking, dancing, and hanging out with free thinkers.
Being a flapper wasn’t all about fashion; it was about rebellion. In this article from 1922, a would-be flapper (but still a “nice girl”) explains her lifestyle choices to her parents. Flappers did what society did not expect from young women: they danced to Jazz Age music, they smoked, they wore makeup, they spoke their own language, and they lived for the moment. Flapper fashion followed the lifestyle. Skirts became shorter to make dancing easier. Corsets were discarded in favor of brassieres that bound the breasts, again to make dancing easier. The straight, shapeless dresses were easy to make and blurred the line between the rich and everyone else. The look became fashionable because of the lifestyle. The short hair? That was pure rebellion against the older generation’s veneration of long feminine locks.
The party stopped when the economy crashed and the Great Depression curtailed the night life. Although the flapper lifestyle died along with the Roaring Twenties, the freedoms women tasted in that era weren’t easily given up. They may have gone back to marriage and long hours of toil for little pay, but hemlines stayed above the ankle, and the corset never went back to everyday status. And we’ve been driving cars ever since.
Miss Cellania is a newlywed mother of four, full-time blogger, former radio announcer, and worst of all, a Baby Boomer. In addition to mental_floss, she posts at Neatorama, YesButNoButYes, Geeks Are Sexy, and Miss Cellania. Miss C considers herself an expert on no particular subject at all.
How many Americans think about the war in Afghanistan regularly? The daily realities of war are inescapable for military members and their families, but the rest of the country is largely able to stay disconnected from it. Indeed, foreign policy and the war dropped off the radar entirely for most Americans before the 2012 election.
Mother Jones magazine’s We’re Still at War: Photo of the Day feature is meant to remind Americans that the war is ongoing. Here’s the photo for February 15, taken by Sgt. Jon Heinrich:
About 68,000 U.S. troops are still deployed in Afghanistan, down from the peak of 101,000 (not to mention those who were in Iraq). Without a draft, WWII-style war bond campaigns, or a highly visible war industry, most Americans need to be reminded that we’re at war. Unlike veterans and military families, civilians not directly connected to the military have a kind of privilege to forget the conflict in their daily lives. The result is a growing chasm between U.S. civilians and the Armed Forces.
In 2011, the Pew Research Center surveyed Americans about their connections to the military and found a considerable gap: “Never has the U.S. public been so separate, so removed, so isolated from the people it pays to protect it.” The vast majority of those over 50 had an immediate family member who had served (mostly due to WWII and Vietnam). Of those 30 to 49 years old, 57% had someone in their immediate family serve. Those between 18 and 29 are the most disconnected from war; only 33% have a close family member with military experience:
This leads to differences in views about the military. Those from military families are more likely to believe that civilians do not understand what they go through, that the U.S. “is the greatest country in the world,” and that they are “more patriotic than most people in the country.” They’re also more likely to recommend the armed forces to a young person — though only about half would do so:
My own research on the experiences of military families during deployment supports the Pew findings. And patterns in who joins the armed forces may widen the gap. Those with veterans in their family are more likely to join the military themselves; 79% of young veterans, compared to 61% of the public, have family members who served. As fewer Americans have relatives who were in the military, and are therefore less likely to join themselves, insulation from the military grows.
This bumper sticker reflects this gap between military families and everyone else. It draws a distinction between “my” service member and “your” freedom, while seeming to assume a lack of support from non-military Americans:
Military families believe that others don’t understand what they go through during deployment. As one mother told me, “We understand why we try to be strong but automatically cry when we see the foot powder display at Wal-Mart.” Or as an Iraq war veteran explained to Time,
The gap between the military and everybody else is getting worse because people don’t know – and don’t want to know – what you’ve been through… There are no bond drives. There are no tax hikes. There are no food drives or rubber drives… It’s hard not to think of my war as a bizarre camping trip that no one else went on.
Veterans return to a country where very few understand what they have been through, which makes reentry into civilian life more difficult — just one of the consequences of having a small segment of the country assume the burdens of war.
Wendy Christensen is an Assistant Professor at William Paterson University whose specialties include the intersection of gender, war, and the media.
You might be surprised to learn that, at its inception in the mid-1800s, cheerleading was an all-male sport. Characterized by gymnastics, stunts, and crowd leadership, cheerleading was considered equivalent in prestige to that American flagship of masculinity, football. As the editors of The Nation saw it in 1911:
…the reputation of having been a valiant “cheer-leader” is one of the most valuable things a boy can take away from college. As a title to promotion in professional or public life, it ranks hardly second to that of having been a quarterback.*
Indeed, cheerleading helped launch the political careers of three U.S. Presidents. Dwight D. Eisenhower, Franklin Roosevelt, and Ronald Reagan were cheerleaders. Actor Jimmy Stewart was head cheerleader at Princeton. Republican leader Tom DeLay was a noted cheerleader at the University of Mississippi.
Women were mostly excluded from cheerleading until the 1930s. An early opportunity to join squads appeared when large numbers of men were deployed to fight World War I, leaving open spots that women were happy to fill.
When the men returned from war there was an effort to push women back out of cheerleading (some schools even banned female cheerleaders). The battle over whether women should be cheerleaders would go on for several decades. Argued one opponent in 1938:
[Women cheerleaders] frequently became too masculine for their own good… we find the development of loud, raucous voices… and the consequent development of slang and profanity by their necessary association with [male] squad members…**
Cheerleading was too masculine for women! Ultimately, the effort to preserve cheer as a men-only activity was unsuccessful. With a second mass deployment of men during World War II, women cheerleaders were here to stay.
The presence of women changed how people thought about cheering. Because women were stereotyped as cute instead of “valiant,” the reputation of cheerleaders shifted. Instead of a pursuit that “ranks hardly second” to quarterbacking, cheerleading’s association with women led to its trivialization. By the 1950s, the ideal cheerleader was no longer a strong athlete with leadership skills, but someone with “manners, cheerfulness, and good disposition.” In response, boys pretty much bowed out of cheerleading altogether. By the 1960s, men and megaphones had been mostly replaced by perky co-eds and pom-poms:
Cheerleading in the sixties consisted of cutesy chants, big smiles and revealing uniforms. There were no gymnastic tumbling runs. No complicated stunting. Never any injuries. About the most athletic thing sixties cheerleaders did was a cartwheel followed by the splits.***
Cheerleading was transformed.
Of course, it’s not this way anymore. Cultural changes in gender norms continued to shape cheerleading. Cheerleaders today, still mostly women, pride themselves on being both athletic and spirited, a blend of masculine and feminine traits that is now considered ideal for women.
In the 1900s and 1910s, gun advertising often simply touted the benefits of the gun itself, giving no indication of what the gun was for:
In the ’20s and ’30s, gun advertising more frequently involved a hunting or pest-reduction theme:
This theme continued through the ’40s, now joined by a new theme, war (i.e., World War II):
Then, in the 1960s, the war theme disappeared and the hunting theme continued, this time with a new twist: instead of hunting just for food (and sport) or to protect one’s property, ads now featured the hunting of exotic game solely for sport: