
Photo by William Brawley, Flickr CC

Employees under strict attendance policies face a difficult choice when they are “technically” physically able to be present at work, but may not feel healthy enough to perform their job well. Debating whether or not to call in for the day, employees ask themselves not only if they feel sick, but if they seem sick enough to convince their superiors and coworkers. Talcott Parsons’ classic work on “the sick role” helps us understand why. Sickness inhibits a person’s ability to perform as others expect them to. However, people in the sick role are excused if their symptoms seem to be beyond their control and if they try to get better. Whether or not a person is really sick, taking on the sick role requires those around them to be convinced, granting the sickness legitimacy. Social science research shows how the ease of attaining the legitimated sick role differs depending on gender and class.

More recent and critical research shows why taking the plunge and calling into work can be so difficult. First, a person has to ascertain whether they are truly sick by analyzing how their body feels, and whether or not certain symptoms constitute true illness in the eyes of others. The dripping nose and general malaise of a cold, for example, are never pleasant. Perhaps as children we might think it truly disrupts our daily routines. As we grow older, however, we learn what sociologists of health call illness behavior, which is how an individual interprets specific bodily symptoms (like those of a cold) and reacts to them. Adults learn that the proper reaction to a cold is taking some over-the-counter medicine and heading into work with a box of tissues. This means that learning to interpret the way your body feels is in large part a social process.
Many working-class jobs perpetuate “toughing it out” illness behavior; employees often attribute moral value to being “hard working” and going into work no matter what. This comes with a set of beliefs about what constitutes real illness and what is mere laziness. Research finds that this kind of labor market shapes working mothers’ illness behavior. After developing a worker identity, working mothers often recognize physical symptoms as relatively unimportant compared to building a good reputation with their superiors and defending themselves from job insecurity. The economy, then, is at play when they assess how they physically feel. They then encourage their children to “tough out” common health problems. This learned behavior does not end when a child leaves home either – they are socialized into this practice and are likely to continue “toughing it out” when they are adults.
Photo by fightlaunch, Flickr CC

They say there is a first time for everything, and after 20 long years, the time has finally arrived. With the state of New York’s legalization of mixed martial arts, New York City will host UFC 205, which has been deemed the greatest fight card in UFC history. Headliners Conor McGregor of Ireland and Joanna Jedrzejczyk of Poland are expected to attract large numbers of attendees and evoke an emotional connection among fans. Events such as the World Cup and Super Bowl are cultural symbols, and UFC 205 is shaping up to be no different.

The UFC is a major attraction in sports right now. The urge to emotionally connect with others in an excited crowd is a compelling reason for buying tickets to major sporting events like this. People who attend a big match-up often experience eustress — positive forms of stimulation that lead to elevated levels of excitement.

Randall Collins. 2004. “Interaction Ritual Chains.” Pp. 75-90 in Contemporary Sociological Theory, 3rd edition, edited by Craig Calhoun, Joseph Gerteis, James Moody, Steven Pfaff, and Indermohan Virk. John Wiley & Sons, Ltd.

Daniel L. Wann, Merrill J. Melnick, Gordon W. Russell, and Dale G. Pease. 2001. Sports Fans: The Psychology and Social Impact of Spectators. New York, NY: Routledge.

Porri and Billings have found that people are drawn to sporting events that differ from traditional, mainstream sports like baseball and football. MMA athletes’ unique personalities and combative skill sets generate interest in a different way than team sports do. Social acceptance and the mass popularity of attending an event also influence an individual’s decision to splurge and see the big fight. The bigger the event, the greater the desire to be part of the experience, especially if friends and family think it would be cool to attend.

Seungmo Kim, T. Christopher Greenwell, Damon P.S. Andrew, Janghyuk Lee, and Daniel F. Mahony. 2008. “An Analysis of Spectator Motives in an Individual Combat Sport: A Study of Mixed Martial Arts Fans.” Sport Marketing Quarterly 17: 109-119.

Sarah Porri and Andrew C. Billings. “No Limits: Sensation Seeking and Fandom in the Sport Culture of the X Games.” Pp. 91-100 in Sports Fans, Identity, and Socialization: Exploring the Fandemonium, edited by Adam C. Earnheardt, Paul M. Haridakis, and Barbara S. Hugenberg. Lanham, MD: Lexington Books.

As the sport of mixed martial arts continues to grow, so does research on this new sport and the spectators it captivates. Other lines of research into MMA investigate the reworking of masculine identity and the reasons why participants take up the sport, whether for competition or for health.

Kyle Green. 2015. “Tales from the Mat: Narrating Men and Meaning Making in the Mixed Martial Arts Gym.” Journal of Contemporary Ethnography 45 (4): 419-450.

Photo by Jon S, Flickr CC

The ways that non-Western victims of violence and poverty are portrayed in the news are problematic. For example, on the 6th of October this year, The New York Times had an above-the-fold image of migrants on its front page. The image was of several dead and dying African migrants on a boat and, troubling as this may be, the image was not an anomaly. Consider the images we have recently seen from Syria — from the drowned child on the beach to the dazed child covered in dust pulled out of a bombed building. Social scientists explain how the choice to use these kinds of images is neither an objective nor an accidental process.

News images are rarely meant to teach us something new; rather, they are meant to reaffirm what we already know while tugging at our heartstrings. Nowhere is this more evident than during instances of instability and violence in the Global South. Even in death and suffering, non-Western victims are denied their privacy; their pain is meant to be consumed by the audience while reaffirming real and symbolic differences.
Images of pain and suffering are less about an increase in “bad” things happening and more about how we consume the pain, suffering, and death of victims who are “Other.” They allow us to consume the pain of others from the comfort of our living rooms while reminding us of how “good” we have it.
In the case of Africa and Africans especially, the use of images has a long and troubled history. Research continually shows that images of Africans are often steeped in stereotypes of Africans as simplistic, tribal, “noble savages,” and primitive.
The defining images of 1960s Africa are of starving Biafran children. The image of the 1990s is Kevin Carter’s photograph of a vulture stalking an emaciated Sudanese child near the village of Ayod in South Sudan. Such images often reaffirm stereotypes of the continent and its peoples as “starving,” “chaotic,” or “sick.” This history makes it possible to plaster images of dead and dying migrants on a boat across the front page of an American newspaper with little to no discussion of the structural factors leading to their deaths.
Photo by Andres Juarez, Flickr CC

Marvel’s new series focusing on superhero Luke Cage debuted on Netflix in late September to critical acclaim. The show boasts a 95% rating on Rotten Tomatoes and was called “one of the most socially relevant and smartest shows on the small screen you will see this year” by Deadline.com’s Dominic Patten. Aside from its artistic merits, commentators also praise the prominence of Luke Cage as a “bulletproof black man in a hoodie,” with the show’s star Michael Colter telling The Huffington Post: “It’s a nod to Trayvon, no question … Trayvon Martin and people like him. People like Jordan Davis, a kid who was shot because of the perception that he was a danger. When you’re a black man in a hoodie all of a sudden you’re a criminal.”

Comic books and comic book culture have slowly become more diverse as companies like Marvel have begun prioritizing the inclusion of racial minorities in their stories. Kamala Khan, a Muslim teen, has replaced the white hero Carol Danvers as Ms. Marvel. The hero replacing Iron Man is a black teen named Riri Williams. And Miles Morales, a black Hispanic teen, replaced the white Peter Parker as Spider-Man. Yet despite its recent progressive slant, Marvel and other comic companies have had issues with racial stereotyping, particularly with their black heroes. Marc Singer describes how the medium of comics relies on racialized representations, with appearance being a major way to distinguish characters from one another. 
This is also heavily tied up in the portrayal of superheroes as super-masculine. When the racial aspect of this dynamic is uncovered, we see a complicated history. Rob Lendrum traces these heroes to the “blaxploitation” era of film and media in the 1970s, arguing that many black superheroes, including Luke Cage, were influenced by this culture. Jeffrey A. Brown sees these images as one-note and contrasts them with the comics of the black-owned Milestone Media Inc.
Nasty Woman Tote Bag

Donald Trump’s “nasty woman” comment during the third presidential debate has ignited a veritable “nasty woman economy.” Just two weeks later, there are numerous hashtags and a growing diversity of merchandise, including a tote bag, that reclaim “nasty woman” as a positive and empowering label. Elizabeth Warren capitalized on this at a recent Clinton rally when she said, “nasty women are tough, nasty women are smart, and nasty women vote.” As The Atlantic details in their feminist history of the word, “nasty” was reappropriated as a “badge of honor” some time ago, and they point to songs like Janet Jackson’s “Nasty” as an example of women using the word in a positive way. The reappropriation of stigmatized labels is not new, though social scientists find that this strategy has both strengths and weaknesses.

Psychologists have found that when a group reclaims a derogatory label, perceptions of that group’s power increase. And once a group is perceived as powerful, individuals feel more empowered to self-identify with that reappropriated label. However, this strategy only works for derogatory terms like “queer” and “bitch,” not for descriptive terms like “woman” or majority-group terms like “straight.”
Some sociologists argue that this power is merely a “false power.” The fact that terms like “bitch” are still sometimes used as derogatory terms, often by the very people who claim to be reappropriating them for good, leads some to the conclusion that reclaiming terms in this way only hides oppression by making it acceptable and keeping the term alive in the lexicon. Scholars like Mariam Frasier also point out that class, race, and gender inequality shapes if and when someone can identify with a reappropriated label.
This contested and often flexible nature of reappropriated labels is what others see as their strength. Generational and political differences often result in conflicts surrounding reappropriation of a term. These debates have been found among many groups, including feminists, atheists, and African Americans. But some social scientists argue that these negotiations and disagreements give members of stigmatized social groups the agency to evaluate their own labels and to make determinations about when and whether to accept or reject them on their own terms.
Photo by niteprowl3r, Flickr CC

In 1990, the popular rock music scene was in total disarray — not a single rock album topped the charts. By January 1992, Nirvana’s Nevermind had surpassed Michael Jackson’s Dangerous for the top spot on the Billboard 200, transforming the state of rock music forever and defining the 90s teen generation. On its recent 25th anniversary, the album remains one of the highest-selling rock albums of all time, and is thought to represent a shift not only in music, but in commercial entertainment as well.

Sociologist Ryan Moore notes that the rise of Nirvana and other “grunge” bands demonstrated to major music labels that notions of anti-corporatism, rebellion, and authenticity could be co-opted into a larger marketing campaign to sell a variety of products to youth. Bands like Nirvana were so successful because they personified a collective feeling in the 1990s and once advertisers and marketers capitalized on this notion, expressions of deviance permeated mainstream culture.
Why did bands like Nirvana resonate so well with teenagers during the 1990s? Musical tastes can serve as a form of identity construction and group exclusivity, and Nirvana’s image of rebellion was a resource for youth to distinguish themselves from other generations. Although grunge originally developed as an avant-garde, experimental genre, by the time it seeped out of the local Seattle music scene and evolved into its industry form, it was already well established in the collective identity of youth in the United States.
Why do albums like Nevermind still resonate today? Research shows that people tend to view their memories from adolescence as especially important. At the same time, representations of major events or famous people change or develop with each new generation. Abraham Lincoln, for example, was commemorated as a “self-made man” in the years following emancipation; yet, with the rise of the civil rights movement, newer generations viewed him as the “Great Emancipator.” Thus, we can expect Nirvana’s significance to be much different for teens today than in the early 1990s.
2014 FIFA World Cup in Brazil. Photo by paulisson miura, Flickr CC

Done and dusted. Brazil’s run of hosting global sporting events has officially ended. From the opening game of the FIFA Confederations Cup in 2013, to the closing ceremony of the Paralympics on September 18, it has been quite a ride. Spectacles on the grandest of scales were broadcast on television with mostly smiles and laughs, but also some sadness. In the eyes of the world, Brazil’s sporting exploits look to be a success, but was it worth it? Was the estimated $30 billion the Brazilian government spent hosting these events a good investment? It is probably too early to say for sure, but sociological research can give us a sense of what kind of analyses need to be done to find out.

In the past, economic growth has often been seen as the primary reason for developing nations to host major sporting events. But when even economists acknowledge that little economic revenue will be produced, we must look to other rationales. The hope of breaking into the upper echelon of nations through positive news coverage and prestige has emerged as one of the justifications for hosting global sports spectacles today. For example, South Korea, which co-hosted the 2002 World Cup, used the event to promote itself as a modern state, just like its neighbors Japan and China.

Other countries like Brazil, China, and Russia have used their recent Olympic and World Cup events to help build positive public opinion around the globe. It remains to be seen whether Brazil’s exploits will make a lasting impact on the world stage, but this criterion will surely be among the most important in how these Games are judged in the future.

Photo by Ryan Godfrey, Flickr CC

During a political season in which very little has gone according to script, one thing has been fairly predictable: the demand on all sides for “media objectivity.”

Advocates for objective political reporting are typically referring to journalistic conventions that include using direct quotes, presenting “both” sides of the story, and focusing on the presentation of “material facts.” These facts, we are often told, speak for themselves. But as intuitive and appealing as the call for neutral, unbiased reporting might sound, sociologists have been both cautious and critical.

One reason for sociological skepticism is that the notion of objectivity in political journalism is actually a fairly recent historical invention. It has less to do with balance or fairness than it does with ritualized procedures journalists use to protect themselves from the pressures they face in the day-to-day reporting of complex issues. Objectivity, in this sense, emerged as a kind of protective blanket for political journalists.
Not only are the ritualized practices of objectivity in political journalism relatively new; sociologists have also shown that they are fraught with problems and limitations. For example, basic standards of media objectivity are typically applied less consistently to female political candidates and candidates of color.
Another strand of sociological scholarship suggests that most standards of objectivity are strongly linked to social context, personal experiences, and the types of conversations that people have with their peers. In other words, journalists and media organizations tend to define objectivity in relation to their target audience and frame their coverage to appease this group. This approach suggests that although MSNBC and Fox News typify the seemingly bifurcated nature of political journalism in the United States, they epitomize two sides of the same coin and may represent the “new normal” in political journalism.
The new anti-bullying emoji.

Fans of the movie Mean Girls will vividly recall the scenes when Regina George’s friends banish her from the lunch table for wearing sweatpants and when she distributes the hurtful pages of the “Burn Book” through the halls of the school. Other movies such as Heathers, Carrie, and Dazed and Confused portray how kids at school can be cruel. However, there are some new measures being taken to curb bullying, both in person and online. A new app aimed at helping bullied students find a friendly place to sit in the cafeteria has launched just in time for National Bullying Prevention Month. And there is also a new emoji you can use when you witness bullying online.

It is estimated that over 3 million, or 30%, of middle and high school students experience bullying each year. Not surprisingly, Nansel and colleagues find that poor relationships with classmates and loneliness are associated with being bullied. Research from Miller shows that much of what teen girls call “drama” is actually bullying, although they tend to understand it as a regular part of life rather than as bullying. Girls’ bullying behavior is more likely to involve spreading sexual rumors, slut-shaming, and dishing out homophobic labels, and is less likely to involve physical violence.
Who gets bullied is tied closely to status in the social hierarchy, but not in the way most people expect. Faris and Felmlee find that youth with higher statuses and more network ties, the popular kids, are more likely to face bullying; that is, until they reach the very top of the social pyramid, where they find a sort of immunity to bullying. Rather than the popular mean girl picking on the nerd, bullying is more likely to happen within friend groups, particularly online. Attacks online may happen more frequently between friends or former friends because of competition around romantic partners.
Photo by Lee Coursey, Flickr CC

Last month marked the centennial of the National Park Service, which is tasked with preserving natural and cultural resources and protecting outdoor spaces for recreation, like Yellowstone, the Grand Canyon, and Yosemite. The most recently designated park is an ocean park, where 4,900 square miles of deep-sea volcanoes and canyons in the Atlantic Ocean are now protected from commercial fishing and other types of resource extraction. While the idea behind the national park system is that everyone should be able to enjoy nature, the reality is that working-class people and people of color are less likely to use national parks, and the history of the parks has involved the displacement and exclusion of Native American, African American, and immigrant communities.

Unequal access to resources – including money for entrance fees and transportation, equipment for exploring the parks, and leisure time – has resulted in race and class differences in who can actually enjoy the national parks.
Beyond access, there are a variety of cultural definitions of “the wilderness,” “the outdoors,” and recreation that are shaped by race. Racial norms and ideologies impact how people perceive leisure time and values of natural beauty, and activities like hiking and camping are often seen as “white hobbies.” Yet, these differences are largely due to a history of exclusion, discrimination, and segregation that kept people of color from using public outdoor space, particularly in the Jim Crow South.
The parks themselves were created through colonialism, as much of the land that is now “protected” was of course taken from Native Americans. The idea of a pristine wilderness is historically linked to white racial purity and the need for Europeans to save the land, which justified U.S. expansion into the West. The conservation movement was also led by white men, such as John Muir, who often overlooked the struggles of racial minorities and issues of equity.