Video imagery courtesy of Canva, Canva free media usage.

Originally posted on March 16, 2017

The United States and the United Nations have had a closely intertwined relationship since the organization’s founding in 1945. The UN deals with a broad range of issues around the globe, and its widespread influence is often controversial. Yet the United Nations remains instrumental in promoting crucial human rights causes, and the reach of its aid is arguably beyond compare. Despite its numerous shortcomings, the UN plays a crucial role in promoting human rights norms across the globe.

Throughout the 1990s in particular, the United Nations took on a central role in the global justice process. It organized and funded international courts following episodes of mass violence, such as the International Criminal Tribunal for Rwanda, and it made indictments for egregious crimes, including the crime of genocide, possible for the first time. Sociologists find that the existence of these courts has a global impact in providing justice, and the trials seem to have a positive effect in reducing human rights violations in the long run.
The judicial process alone cannot adequately address global human rights issues — humanitarianism and diplomacy also play key roles. The United Nations arguably plays the most dominant global role in these initiatives, with monumental campaigns addressing topics like hunger, refugee needs, and climate change. The UN has been criticized for showcasing Western ideals and failing to take cultural contexts into account, as in early endeavors to reduce female genital cutting. However, the UN has made improvements, and when programs are approached as opportunities for partnership rather than dominance, the outcomes can be quite positive. For example, the agency has taken great strides in promoting gender equality and access to education.
Image description: A blonde woman sits in a church pew, facing away from the camera. Image courtesy of Pixabay, Pixabay license.

This October, Pope Francis is kicking off a three-year synod, assembling leaders and laypeople to discuss issues of church doctrine and practice. One big question on the table: should the Catholic Church ordain women as deacons? Driven in part by a growing movement of women around the world who feel called to ordination, the case for ordaining women will likely be one of the most hotly debated issues among Catholics worldwide.

After hundreds of years of restricting the role of women in church leadership, how did the Church even get to this point? The story begins with changes in mainstream culture. Historically, changing norms around sex and gender have encouraged church leaders to reexamine their existing doctrines, particularly if church participation is declining. As mainstream culture changes, religious institutions face the challenge of “retraditioning” themselves for the future: adjusting their doctrines and practices to better align with mainstream culture. Religious leaders then debate proposed changes to church doctrines and practices, exactly the point the Catholic Church has reached today with the upcoming three-year synod.

What will happen at the end of the synod in 2024? Historical research suggests that after church leaders begin debating ideologies, changes to church policy often come through sheer luck, force, or the influence of powerful personalities. Only time will tell whether church leaders will respond to the calls of these Catholic women.

Image: A drag queen, dressed in a rainbow-sequined dress and pink wig, stands with arms raised, smiling. Image courtesy of Dany Sternfeld, CC BY-NC-ND 2.0.

In June, in celebration of pride month, members of LGBTQ+ communities and allies honored and reflected on hard-fought advancements for queer people, from civil rights like marriage equality and employment protections, to representation in positions of political prominence and mainstream culture. One area of change is the rising popularity of drag — an art form pioneered by queer people of color in clandestine ballrooms, now occupying prominent positions in gay bars, television competition programs, and mainstream films.

Drag began, like many parts of queer life, underground in urban nightlife spaces. In queer havens like San Francisco and New York City, drag performers have graced nightclub stages for over a century. As homosexuality grew more visible in the late twentieth century, drag performers were at the forefront of battles for liberation, political rights, and, later, for medical treatment during the years of the AIDS pandemic. In the 1960s, drag queens in Los Angeles and San Francisco pushed back against run-ins with law enforcement. Some in the queer community believe a drag queen, Marsha P. Johnson, threw the first brick at the Stonewall Inn in 1969, kicking off the now-infamous confrontation with the NYPD. In the years that followed, drag queens blended campy performance with activism, protesting governments unresponsive to those dying from AIDS, drug manufacturers, and anti-same-sex-marriage advocates.

Today, in light of the LGBTQ+ community’s social, political, and legal advances, drag enjoys unprecedented prominence in mainstream culture. Drag Story Hours, where performers, or queens, read storybooks to children, are commonplace in schools and libraries in twenty-six states and Puerto Rico. Performing drag, once a marginalized profession, is now a viable, if precarious, job prospect. The RuPaul’s Drag Race franchise boasts thirteen seasons (with six All-Stars seasons to boot) as well as international spin-offs in seven countries. The cultural significance and prominence of drag today raises questions: How does drag celebrate queerness and resist normative sexuality? How did drag find its footing in both pop culture and political circles? Significant research in the humanities and social sciences sheds light on this.

Performing Gender

Gender theorists have argued that gender is a performed identity, reproduced in daily social interactions. Like other social categories, gender is shaped by, and reshapes, relations of power. Drag involves stylistic and exaggerated gender performance. Drag queens were initially male-identifying performers who, unlike “crossdressers,” relied on exaggerated and parodied gender performance to both entertain and draw attention to political causes. Given the queer community’s social exclusion, drag performers have for decades formed close-knit communities, or “houses,” of mutual support and solidarity for performers often cut off from traditional familial networks.

Subverting Gender: Transgressive Tactics

Drag’s most enduring social impact has been calling into question popular conceptions of gender. Drag draws attention to the important differences between sexual orientation and gender, as well as among internal gender identity, external performance, and biological sex, all topics of ongoing discussion in academic and activist communities.
Drag queens use cultural tools like language and physical appearance to subvert and perform gender identities. Through parodying and imitating mainstream gender norms, drag queens reveal the arbitrariness, cultural origins, and performance inherent to all gender identities. In these communities, queens have developed coherent group identities by creating particular speech patterns and unique cultural cues. Values like not being too competitive or “hungry,” maintaining “sisterhood,” and exuding professionalism and humility are reinforced through language and cultural norms. Drag entertainers draw on appearances and practices situationally, in some cases displaying feminine sides in interactions with men while reverting to masculinity in situations that call for it. Lesbian drag kings – female-identifying performers presenting as men – similarly subvert gender roles by drawing on masculine practices in performance and, in some cases, more feminine practices in intimate settings.
While drag’s prominence today has prompted mainstream debates on gender norms, it has, at the same time, drawn criticism of some harmful aspects of drag performance, including the caricaturing of racial minorities and marginalized groups.

Art Form as Resistance

Since drag first became a commonplace – if clandestine – staple at gay bars and clubs, its performances have involved an inherent critique of dominant gender norms, presentation, and behavior. This resistance owes much to the repression and marginalization queer performers have faced in many aspects of their lives. Homophobic views often forced performers out of their homes, leading them to build bonds and kinship networks with other queer people in more accepting urban locales. Under precarious conditions, performers built community with other marginalized queer individuals and crafted a transgressive art form now seen as a cultural staple. Drag’s rise would not have been possible without changing gender norms and styles of self-expression. Birth control became publicly available in 1960, opening new possibilities for women beyond the home. In the postwar decades, artists like Esquerita, Little Richard, and Sylvester pushed the limits of accepted gender presentation, normalizing new portrayals of gender norms in their revolutionary performances. Cultural change was already well underway by the time the Stonewall riots kickstarted the national queer liberation movement. While evolving gender norms and the cultural movements of the 1960s did help the cause of queer liberation, fractures among LGBTQ+ activists kept drag in a marginal position within the movement.
Image: Three white-appearing healthcare workers, “Thank you – You are our heroes,” courtesy of 18371568 via Pixabay, CC0. This imagery suggests our heroes are white, even though around 25% of nurses in the U.S. are people of color. Furthermore, signage that says we “thank our heroes” does not match up with how frontline workers have been unsupported by leadership. Images like this mask structural inequality (pun intended) under the guise of all being “in this together.”

We have seen many things described as “unprecedented” as the year 2020 has steamrolled over many of us. Among them, the pandemic has given the world an unprecedented illustration of U.S. racial inequalities. For example, Black people are more likely to die from COVID-19 infections than are people in any other racial group, and this is true even after controlling for income, housing conditions, and underlying health conditions. Yet not all Americans are able to see the racial inequalities that have been unmasked.

Sociologist and race scholar Eduardo Bonilla-Silva insists that the key to understanding race and racism in the United States is understanding how color-blind ideals shape Americans’ thinking and public discourse. Examples of what Bonilla-Silva calls color-blind racism are phrases such as “We are all in this together” or “COVID is the great equalizer,” because they serve to draw attention away from the racial disparities that are otherwise so persistent and pronounced.

Color-blind racism is named after the hypothetical White observer who says they “do not see color” while, simultaneously, failing to see existing racial inequalities. In other words, color-blind framings mask deep, structural inequalities. People may feel like they are saying unifying things with these tropes, but this sort of “all in this together” messaging serves to hide the structural nature of racism.

Even more, color-blind racism tends to minimize racism itself and, when confronted with racial injustices, to construct and accept elaborate race-based explanations for racial inequality. For example, within a color-blind racism frame, Latinx workers might be said to be paid less than White workers because they do not work as hard, are unreliable workers, or are less qualified. White workers, in turn, are said to get more raises because they are smarter and work harder. With racial blinders on, anything that results from structural causes is explained by deficiency in the minoritized party and coincidental superiority in the privileged party. This negates the structural origins of inequality and allows the status quo to continue.
In terms of the COVID-19 mortality rate, the sometimes-spoken explanation (e.g., 1, 2, 3) is that Black people must be weak, prone to illness, or make unhealthy choices in general. That shift in focus, from talking about racial inequality in the mortality rate associated with a virus to, somehow, talking about Black people as deficient, weak, sick, and making poor choices, illustrates how color-blind racism is alive and well amidst this pandemic. Color-blind racism serves as a mask, preventing the public from seeing the structural causes of health disparities experienced by Black people and other people of color.
Three generations of an Asian family sit together on a couch, smiling up at a camera. Image via Anton Diaz, CC BY-NC 2.0.

With new COVID-19 cases at an all-time high, the coronavirus is front and center in the minds of many Americans. The Centers for Disease Control and Prevention also recently published a report indicating that household transmission of COVID-19 is frequent, from both adults and children. With the rise in infections, and concern about household transmission, it is worth thinking about who lives together under one roof and why. In particular, who lives in intergenerational households where young people might expose older adults to the virus? And how might larger groups of people living together increase the chance of viral spread? Sociological research offers a number of ways to think about the reasons intergenerational families live together that can inform our answers to these questions and help frame public health responses.

The Pew Research Center reports that a majority of young adults are living with their parents for the first time since the Great Depression. For the previous century and a half, fewer and fewer young people had lived with their parents. However, research shows that intergenerational bonds are of increasing importance. Older adults live longer, increasing the length of shared life between parents and children, and between grandparents and grandchildren. Young adults, particularly in the middle class, also need to rely on their parents’ financial support through a longer period of “transitioning to adulthood” that includes getting a college education. With a weak labor market, and many college courses online, it is no surprise that many young adults are remaining or returning home.
Although, overall, many more young adults are living with their parents, at least temporarily, there are important racial and ethnic differences in intergenerational households. White families are more likely to offer intergenerational financial support, while Black and Latinx families are more likely to help family members by providing housing or help with childcare or caregiving, for instance. Immigrant families are also more likely to live in intergenerational households.

In the context of COVID-19, living in intergenerational families can seem risky. These living arrangements can put older adults in close proximity with young people who may be leaving home to work each day. However, overall, intergenerational residence patterns are a way for families to share resources and develop resilience in the face of hardship. For immigrant families, intergenerational living arrangements can help create networks of support that ease the transition to a new country. For racial minorities, living together can be a way to pool money and provide support in the face of structural barriers such as disproportionate poverty or poor health. The widespread unemployment and disability brought on by the COVID-19 pandemic make these networks of support more crucial than ever.

Firefighter captures an image of a wildfire. Image via Creative Commons, CC PDM 1.0.

The 2020 wildfire season is the worst on record, with blazes ravaging portions of California, Oregon, Washington and Colorado.

In early October, California officials reported that more than 4 million acres had burned across the state this year, more than doubling the previous yearly record from two years ago. The August Complex fire alone surpassed one million acres – larger than the entire state of Rhode Island. Recently, the Cameron Peak Fire became the largest blaze in Colorado state history. The ramifications of these fires go beyond charred ground, with almost 40 people killed, thousands of homes and billions of dollars in property burned, and millions of people exposed to hazardous pollution levels.

A story that often goes untold, and that lies at the center of these fires, is that of the men and women putting out the blazes. These firefighters battle long hours, little sleep, and high stress. They can even lose track of time when the sun is obscured by smoke.

Research has found that these firefighters struggle with their psychological well-being, leading to increased depression, anxiety, suicidal tendencies and other mental health concerns. Firefighters are exposed to high-risk, low-control situations and regularly deal with death, including the suicide of other firefighters. 
Researcher Matthew Desmond paid his way through college while fighting fires in Arizona. He returned to the profession for some of his early sociological work, finding that firefighters often do not associate their careers with risk – “Risk? What risk?” – and that organizations recruit firefighters by downplaying the risk they will face in the field.
Other research has explored the intersection of punishment and rehabilitation among those in California’s prison fire camps. The findings point out that the fire camps are simultaneously prisons and nonprisons, and those participating are both inmates and heroes.
Many stacks of textbooks. Photo via Pixabay.

Textbooks are more prevalent in American history courses than in any other subject, and a recent article from The New York Times revealed how geography influences what U.S. students learn. Despite having the same publisher, textbooks in California and Texas (the two largest textbook markets in the U.S.) vary wildly in educational content. Researchers have also found numerous inconsistencies and inaccuracies in American history textbooks, resulting in the glorification of national figures and the spread of national myths.

Depictions of violence in textbooks are also highly politicized. Episodes of violence are often muted or emphasized based on a country’s role in the conflict. For example, conflicts with foreign groups or countries are more likely than internal conflicts to appear in textbooks. Additionally, American textbooks consistently fail to acknowledge non-American casualties in their depictions of war, portraying American soldiers as victims, rather than perpetrators, of the horrors of war. Depictions of conflicts also vary over time: as time passes, textbooks move away from nationalistic narratives to focus instead on individualistic narratives.
Public figures, like Helen Keller and Abraham Lincoln, tend to be “heroified” in American textbooks. Rather than treating these public figures as flawed individuals who accomplished great things, American textbooks whitewash their personal histories. For example, textbooks overlook Keller’s fight for socialism and support of the USSR, and Lincoln’s racist beliefs. The heroification of these figures is meant to inspire the myth of the American Dream — that if you work hard, you can achieve anything, despite humble beginnings.
Symbolic representation of the past is important in stratified societies because it affects how individuals think about their society. Emphasizing the achievements of individuals with humble beginnings promotes the belief among American students that if they work hard they can achieve their goals, despite overwhelming structural inequalities. Furthermore, as historical knowledge is passed down from one generation to the next, it becomes institutionalized and reified, making it more difficult to challenge or question.
Hand holding a diamond. Photo via pxfuel.

Over one million people will get engaged on Valentine’s Day, and as a result, diamond sales usually see an uptick around this time. Diamonds are both historical and cultural objects; they carry meaning for many — symbolizing love, commitment, and prestige. Diamonds are highly coveted, and scholars have found that about 90 percent of American women own at least one diamond. In the 1990s, war spread throughout West Africa over these precious pieces of carbon, as armed political groups vied for control over diamond mines and their profits.

Given their role in financing brutal West African civil wars, diamonds became associated with violence and international refugee crises, rather than financial prosperity and love. Diamonds became pejoratively known as blood diamonds, or conflict diamonds, and consumers became more likely to perceive diamonds as the product of large-scale violence and rape. As a result, major diamond producers have attempted to reconstruct the symbolic meaning of diamonds, turning them into symbols of international development and hope.
As the diamond trade came to be seen as immoral and socially unjust, new global norms emerged around corporate and consumer responsibility. Non-governmental organizations (NGOs) lobbied the diamond industry to change its practices and end its support of conflict mines, while simultaneously creating new global norms and expectations. In the early 2000s, international NGOs, governments, and the diamond industry came together to develop the Kimberley Process to stop the trade of conflict diamonds. Today, 75 countries participate, accounting for 99% of the global diamond trade.
Bieri & Boli argue that when NGOs urge companies to employ social responsibility in their commercial practice, they are mobilizing a global moral order. Diamonds provide an example of how symbols, products, and meaning are socially and historically constructed and how this meaning can change over time. The case of blood diamonds also illustrates how changing global norms about what is and is not acceptable can redefine the expectations of how industries conduct business.
A student takes notes by hand. Photo via Wikimedia Commons.

If you believe that taking notes longhand is better than typing (especially that typing leads to verbatim transcription while writing helps with processing ideas), you have probably seen a reference to Mueller and Oppenheimer (2014). We even featured it in a TROT last fall. The Pen Is Mightier Than the Keyboard has over 900 citations on Google Scholar and is a staple on Twitter, blogs, and news articles. According to Altmetric, it has been cited in 224 stories in 137 news outlets and linked to on Twitter almost 3,000 times in the two years since it was published.

But new research suggests that its fame was premature. How Much Mightier Is the Pen than the Keyboard for Note-Taking? A Replication and Extension of Mueller and Oppenheimer argues that research has not yet determined whether writing or typing is categorically better for class achievement. The new study (a direct replication of the original study) did find a slight advantage for those taking notes by hand, but unlike in the original study, the differences were not statistically significant. 
The new study also expanded the original by including a group with eWriters, an electronic form of notetaking that allows students to write on a screen. As our original blog noted, much of the research on laptops in the classroom revolves around their potential to be distractions, and research on notetaking needs to take into account advances in technology that could lead to better notetaking environments as well as assisting students with disabilities. Morehead, Dunlosky, and Rawson find that eWriters, although not in common use, could be a bridge between paper and the distraction-laden environment of laptops. 
A volunteer donates blood. Photo via pxfuel.

In the past year, the American Red Cross issued several statements regarding critical blood shortages in various locations throughout the United States. Blood shortages are not unique to the United States; a recent study by the World Health Organization found that 107 out of 180 countries have insufficient blood supplies. While organizations like the American Red Cross try to remedy blood shortages, sociologists have found that shortages are closely related to donors’ feelings of altruism and the existing structures of donor organizations.

Social psychologists have explained the decision to give blood in terms of altruism, acting in the interest of others, while sociologists tend to explain blood donation in terms of organizations and institutions. Voluntary donations have historically been portrayed as more desirable or as a civic duty, but scholars note that the most common reason for not giving blood is simply not being asked. They also find that personal motivations (such as a general desire to help, sense of duty, and empathy) are more likely to be strengthened with each donation, while external motivations (emergencies, peer pressure, etc.) are likely to decrease over time.
As a result, donation centers have been encouraged not to pay donors, for fear that this would discourage altruistic givers. Paying donors also raised other concerns, such as the belief that it would encourage exploitative relationships between economically unstable individuals and donation centers. There were also fears that paid blood was unsafe blood, as payment would motivate high-risk groups to lie about their status for money.
Altruism is not random or individual; it is driven by institutions. For example, in places where the Red Cross is prevalent, people involved in religious or volunteer organizations donate the most blood. In countries where independent blood banks operate, by contrast, this is not true. In fact, state systems, according to Healy, tend to have larger donor bases. Thus, organizational structures, rather than individual desires to give, largely drive blood donations.