
A young girl blows out a birthday candle with help from her grandmother and brother. In times of economic recession, many individuals experiencing hardship receive help from extended family members. Photo via Wallpaper Flare.

A few weeks ago, the Dow Jones fell 20% from its high. To many, this signals great uncertainty about the future of the market and could portend a coming recession, and that was even before the COVID-19 crisis. In these uncertain economic times, we look back at how the Great Recession of 2008 shaped families’ decision-making, including whether or not to have children, where to live, and how much to rely on family members for financial support. Perhaps this recent history can help us imagine what might lie ahead.

Demographers and other social scientists are interested in the fertility rate, or the average number of live births over a woman’s child-bearing years. After the Great Recession, the fertility rate fell below the replacement rate (roughly two children per woman) for the first time since 1987. Scholars attribute this change in fertility to increased economic uncertainty; people do not feel confident about having a child if they are not sure what will come next. In fact, the fertility rate fell furthest in the states facing the most uncertainty: those hit hardest by the recession and “red states” concerned about the economic future under Obama.
  • Schneider, Daniel. 2015. “The Great Recession, Fertility, and Uncertainty: Evidence From the United States.” Journal of Marriage and Family 77(5):1144–56.
  • Guzzo, Karen Benjamin, and Sarah R. Hayford. 2020. “Pathways to Parenthood in Social and Family Contexts: Decade in Review, 2020.” Journal of Marriage and Family 82(1):117–44.
  • Sobotka, Tomáš, Vegard Skirbekk, and Dimiter Philipov. 2011. “Economic Recession and Fertility in the Developed World.” Population and Development Review 37(2):267–306.
During the Great Recession there was also an increase in the number of young adults, both single and married, living with their parents. By 2006, rates of young married adults living with their parents had climbed back to 1900 levels, which is surprising considering that the century between 1900 and 2000 is considered the “age of independence,” when many more young people moved out and established households of their own. This effect was particularly strong for young adults with less education and those who had fewer resources to weather the storm of the recession on their own.
The economic challenges of the late 2000s also may have led to an increase in interpersonal conflict within families. In part, this may stem from the pressure family members feel to serve as a safety net for relatives who are struggling financially. For instance, Jennifer Sherman found that some individuals who were experiencing financial hardship withdrew from their extended family during the Great Recession rather than asking for support. This, along with findings that giving money to family members during the recession increased an individual’s chance of experiencing their own financial stress, raises questions about whether or not family networks can offer support in times of economic turmoil.
  • Sherman, Jennifer. 2013. “Surviving the Great Recession: Growing Need and the Stigmatized Safety Net.” Social Problems 60(4):409–32.
  • Pilkauskas, Natasha V., Colin Campbell, and Christopher Wimer. 2017. “Giving Unto Others: Private Financial Transfers and Hardship Among Families With Children.” Journal of Marriage and Family 79(3):705–22.
As the economic effects of COVID-19 are felt across the country, many Americans are preparing for another severe economic downturn. Understanding how the Great Recession influenced family structure and life is an important lens for considering how large economic events shape people’s everyday lived experiences.
Flyers at Hartsfield-Jackson Atlanta International Airport wearing facemasks. Photo by Chad Davis, Flickr CC

During times of crisis, existing prejudices often become heightened. Fears about the current coronavirus, or COVID-19, have revealed rampant racism and xenophobia against Asians. Anti-Asian discrimination ranges from avoiding Chinese businesses to direct bullying and assaults of people perceived to be Asian. This discriminatory behavior is nothing new. The United States has a long history of blaming marginalized groups for infectious disease, from Irish immigrants blamed for carrying typhus to “promiscuous women” blamed for spreading sexually transmitted infections.

Historically, the Chinese faced blame time and again. In the 19th century, public health officials depicted Chinese immigrants as “filthy” carriers of disease. These views influenced anti-Chinese policies and practices, including humiliating medical examinations at Angel Island — the entry port for many Chinese immigrants coming to America — and the violent quarantine and disinfection of San Francisco’s Chinatown in the early 20th century when a case of bubonic plague was confirmed there.
An advertisement for "Rough on Rats" rat poison. On the flyer there is an image of a stereotypically drawn "china man" eating a rat.
Late 19th century racist advertisement for rat poison

Discrimination against the Chinese is one example among many. Such discrimination had nothing to do with their actual hygiene and health, and everything to do with their social position relative to other racial groups. It’s easy to look back critically on the xenophobic U.S. policies and behavior of earlier times. Let’s not fall into the same patterns today.

For more on xenophobia and coronavirus, listen to Erika Lee on a recent episode of NPR’s podcast, Code Switch.

Many stacks of textbooks. Photo via Pixabay.

Textbooks are more prevalent in American history courses than in any other subject, and a recent article from The New York Times revealed how geography has influenced what U.S. students learn. Despite having the same publisher, textbooks in California and Texas (the two largest markets for textbooks in the U.S.) vary wildly in educational content. Researchers have also found numerous inconsistencies and inaccuracies in American history textbooks, resulting in the glorification of national figures and spread of national myths.

Depictions of violence in textbooks are also highly politicized. Episodes of violence are often muted or emphasized, based on a country’s role in the conflict. For example, conflicts with foreign groups or countries are more likely than internal conflicts to appear in textbooks. Additionally, American textbooks consistently fail to acknowledge non-American casualties in their depictions of war, citing American soldiers as victims, rather than perpetrators of the horrors of war. Depictions of conflicts also vary over time, and as time passes, textbooks move away from nationalistic narratives to focus instead on individualistic narratives.
Public figures, like Helen Keller and Abraham Lincoln, tend to be “heroified” in American textbooks. Rather than treating these public figures as flawed individuals who accomplished great things, American textbooks whitewash their personal histories. For example, textbooks overlook Keller’s fight for socialism and support of the USSR and Lincoln’s racist beliefs. The heroification of these figures is meant to inspire the myth of the American Dream — that if you work hard, you can achieve anything, despite humble beginnings.
Symbolic representation of the past is important in stratified societies because it affects how individuals think about their society. Emphasizing the achievements of individuals with humble beginnings promotes the belief among American students that if they work hard they can achieve their goals, despite overwhelming structural inequalities. Furthermore, as historical knowledge is passed down from one generation to the next, this knowledge becomes institutionalized and reified, making it more difficult to challenge or question.
Cartoon. Six blind men touch different parts of an elephant and each has a different idea of what the elephant is based on what they've touched

This post was created in collaboration with the Minnesota Journalism Center.

Objectivity and neutrality have been cornerstone norms of journalistic practice in democracies in the Western world for over a century. However, in recent years ideals of fairness, accuracy, and balance have come under increasing attack from many different and sometimes unexpected directions. 

Many beliefs about the need for media objectivity go back to Alexis de Tocqueville’s 19th-century argument that the circulation of newspapers is integral to fostering a functional and effective democracy. Indeed, objectivity became a news value in the mid-1800s, partly owing to the rise of the Associated Press (AP), created in 1848 by a group of New York newspapers that wanted to take advantage of the speed of the telegraph in transmitting news to multiple outlets. To serve outlets with a variety of political allegiances consistently, the AP had to maintain a sense of objectivity, keeping its copy relevant to as wide an audience and clientele as possible.
Cutting against these norms was the sensationalism of newspaper content in the late 19th century. While the use of emotion in reporting has often been connected to the commercialization and tabloidization of journalism, in recent years it has also appeared in coverage of disasters, crises, and human rights abuses — and has come to be seen as positive and valuable as well. The roles of objectivity and impartiality have always been contested within journalistic practice, so rather than seeing emotion as the opposite of objectivity, some scholars now argue it can come alongside and inform journalistic practice worldwide.
The role of objectivity has also come into question as a mechanism that can silence marginalized writers and populations. Relatedly, news can also reinforce institutions of power in society, for better or for worse. In populist countries including Argentina, Bolivia, Ecuador, Nicaragua, and Venezuela, “professional journalism” is often pitted against “militant journalism” promoted by neo-populist governments and their sympathizers — a movement that has critical implications for the freedom of the press in societies in the Global South. News media have also been found to portray protests and protesters negatively.

It’s Black History Month, and we at TSP have rounded up some of our favorite, timeless posts about the history, meaning, and importance of celebrating Black history. These #TSPClassics include articles about Black History Month itself, as well as articles about research related to racial identity, racism, and anti-racism. Read about Black scholars’ early contributions to the social sciences, recent innovations in scholarship about race, ongoing issues of racism and inequality, new strategies and actions in advocacy, and much more below. Happy Black History Month!

From Our Main Page

Did you know W.E.B. DuBois was a pioneering sociologist? Read more at “What Would W.E.B. DuBois Do?”
Photo of a mural honoring black history in Philadelphia. Photo by 7beachbum, Flickr CC.
Read about Black women’s early advances in sociology and social science at “Unearthing Black Women’s Early Contributions to Sociology.”
Read about why the idea of a “white history month” ignores the history of race and racism at “Why We Don’t Need a White History Month.”
“Black Panther,” one of the most successful movies in the Marvel universe, was a momentous film for black representation and imagery in Hollywood. Read more at “Black Panther as a Defining Moment for Black America.”
The word “racism” can mean a lot of different things in different contexts; read about different definitions, forms, and research traditions regarding “racism” in the USA at “Different Dimensions of Racism.” 
Even in the 21st century, Black Americans have to navigate racist stereotypes, imagery, and perceptions, and many learn such strategies at a young age. Read about related parenting strategies and challenges at “How Black Mothers Struggle to Navigate ‘Thug’ Imagery.”
Recent research about black identities, experiences, and community analyzes how themes studied by early sociologists of race relate to twenty-first century technology, such as social media platforms and digital communication. Read more about these and other new research directions at “A Thick Year For Tressie McMillan Cottom.” 
Tressie McMillan Cottom displays her essay collection Thick, which was named a National Book Award finalist. Photo via Wikimedia Commons.
Black athletes represent a new generation of leaders and anti-racist advocates; read more at “A New Era of Athlete Awareness and Advocacy.” 
Should educators promote colorblind rhetoric in the classroom? Read about problems with colorblind teaching practices at “Color-Blind Classrooms Socialize Students to Disregard History.”
Research shows that skin color intersects with race and racial identity in ways which perpetuate racial inequality. Read more at “Skin Color, Self-Identity, and Perceptions of Race.” 
Social norms, rules, and laws about mixed-race relationships have changed drastically across history, but many issues of inequality and identity remain for contemporary multiracial families. Read more at “Navigating Multiracial Identities.” 
Photo of a multiracial family by taylormackenzie, Flickr CC.

From Our Partners and Community Pages

Soc Images

Rural Appalachia is often discussed as a mainly-white region, but did you know about the richness of black history in the mountains? Read more at “Hidden Black History in Appalachia.”
Rural sharecroppers in Appalachia. Source: Wikimedia Commons
Sometimes, businesses, corporations, and groups try to celebrate Black History Month in ways which are tone-deaf, ignorant, and just plain racist. SocImages archives several cringeworthy incidents over the years at “From Our Archives: Black History Month.”

Contexts

The field of sociology studies racism, but we’re not above criticism; read about why social science must divest from whiteness and white-centric logic at “Yes, Sociology is Racist Too.”
Why don’t we make W.E.B. DuBois’s birthday a holiday? Read more at “A New Black Holiday, or Why W.E.B. DuBois’s 150th Birthday Matters.”
Well into the 21st century, discrepancies in the justice system are still a major site of racial inequality; read about racial inequality and policing at “Black and Blue.”
Hand holding a diamond. Photo via pxfuel.

Over one million people will get engaged on Valentine’s Day, and as a result, diamond sales usually tick up around this time. Diamonds are both historical and cultural objects; they carry meaning for many, symbolizing love, commitment, and prestige. Diamonds are highly coveted objects, and scholars have found that about 90 percent of American women own at least one diamond. In the 1990s, war spread throughout West Africa over these precious pieces of carbon, as armed political groups vied for control over diamond mines and their profits.

Given their role in financing brutal West African civil wars, diamonds became associated with violence and international refugee crises, rather than financial prosperity and love. Diamonds became pejoratively known as blood diamonds, or conflict diamonds, and consumers became more likely to perceive diamonds as the result of large-scale violence and rape. As a result, major diamond producers have attempted to reconstruct the symbolic meaning of diamonds, turning them into symbols of international development and hope.
As the diamond trade came to be seen as immoral and socially unjust, new global norms emerged around corporate and consumer responsibility. Non-governmental organizations (NGOs) lobbied the diamond industry to change its practices and end its support of conflict mines, while simultaneously creating new global norms and expectations. In the early 2000s, international NGOs, governments, and the diamond industry came together to develop the Kimberley Process to stop the trade of conflict diamonds. Today, 75 countries participate, accounting for 99% of the global diamond trade.
Bieri & Boli argue that when NGOs urge companies to employ social responsibility in their commercial practice, they are mobilizing a global moral order. Diamonds provide an example of how symbols, products, and meaning are socially and historically constructed and how this meaning can change over time. The case of blood diamonds also illustrates how changing global norms about what is and is not acceptable can redefine the expectations of how industries conduct business.
A student takes notes by hand. Photo via Wikimedia Commons.

If you believe that taking notes longhand is better than typing (especially that typing leads to verbatim transcription while writing helps with processing ideas), you have probably seen a reference to Mueller and Oppenheimer (2014). We even featured it in a TROT last fall. The Pen Is Mightier Than the Keyboard has over 900 citations on Google Scholar and is also a staple on Twitter, blogs, and news articles. According to Altmetric, it has been cited in 224 stories from 137 news outlets over the past two years and linked to on Twitter almost 3,000 times.

But new research suggests that its fame was premature. How Much Mightier Is the Pen than the Keyboard for Note-Taking? A Replication and Extension of Mueller and Oppenheimer argues that research has not yet determined whether writing or typing is categorically better for class achievement. The new study (a direct replication of the original study) did find a slight advantage for those taking notes by hand, but unlike in the original study, the differences were not statistically significant. 
The new study also expanded the original by including a group with eWriters, an electronic form of notetaking that allows students to write on a screen. As our original blog noted, much of the research on laptops in the classroom revolves around their potential to be distractions, and research on notetaking needs to take into account advances in technology that could lead to better notetaking environments as well as assisting students with disabilities. Morehead, Dunlosky, and Rawson find that eWriters, although not in common use, could be a bridge between paper and the distraction-laden environment of laptops. 
A volunteer donates blood. Photo via pxfuel.

In the past year, the American Red Cross issued several statements regarding critical blood shortages in various locations throughout the United States. Blood shortages are not unique to the United States; a recent study by the World Health Organization found that 107 out of 180 countries have insufficient blood supplies. While organizations like the American Red Cross try to remedy blood shortages, sociologists have found that blood shortages are closely related to donors’ feelings of altruism and the existing structures of donor organizations.

Social psychologists have explained the decision to give blood in terms of altruism, acting in the interest of others, while sociologists tend to explain blood donation in terms of organizations and institutions. Voluntary donations have historically been portrayed as more desirable or as a civic duty, but scholars note that the most common reason for not giving blood is simply not being asked. They also find that personal motivations (such as a general desire to help, sense of duty, and empathy) are more likely to be strengthened with each donation, while external motivations (emergencies, peer pressure, etc.) are likely to decrease over time.
As a result, donation centers have been encouraged not to pay donors due to a fear that this would discourage altruistic givers. Paying donors also raised other concerns, such as the belief that paying donors would encourage exploitative relationships between economically unstable individuals and donation centers. Additionally, there were also fears that paid blood was unsafe blood, as it would motivate high-risk groups to lie about their status for money. 
Altruism is not random or individual; it is driven by institutions. For example, in places where the Red Cross is prevalent, people involved in religious or volunteer organizations donate the most blood. In countries where independent blood banks operate, however, this is not true. In fact, state systems, according to Healy, tend to have larger donor bases. Thus, organizational structures, rather than individual desire to give, largely drive blood donations.
Photo by torbakhopper, Flickr CC

Originally published July 30, 2019.

As candidates gear up for this week’s Democratic debates, constituents continue to voice concerns about the student debt crisis. Recent estimates indicate that roughly 45 million Americans have incurred student loans during college. Democratic candidates like Senators Elizabeth Warren and Bernie Sanders have proposed legislation to relieve or cancel this debt burden. Sociologist Tressie McMillan Cottom’s congressional testimony on behalf of Warren’s student loan relief plan last April reveals the importance of sociological perspectives on the debt crisis. Sociologists have recently documented the conditions driving student loan debt and its impacts across race and gender.

In recent decades, students have enrolled in universities at increasing rates due to the “education gospel,” in which college credentials are touted as public goods and career necessities, encouraging students to seek credit. At the same time, student loan debt has rapidly increased, prompting students to ask whether the risks of loan debt during early adulthood outweigh the reward of a college degree. Student loan risks include economic hardship, mental health problems, and delayed adult transitions such as starting a family. Individual debt has also led to disparate impacts among students of color, who are more likely to hail from low-income families. Recent evidence suggests that Black students are more likely than their white peers to drop out of college due to debt and to return home after incurring more debt. Racial disparities in student loan debt continue into borrowers’ mid-thirties and contribute to the white-Black racial wealth gap.
Other work reveals gendered disparities in student debt. One survey found that while women were more likely to incur debt than their male peers, men with higher levels of student debt were more likely to drop out of college than women with similar amounts of debt. The authors suggest that women’s labor market opportunities — often more likely to require college degrees than men’s — may account for these differences. McMillan Cottom’s interviews with 109 students from for-profit colleges uncover how Black, low-income women in particular bear the burden of student loans. For many of these women, the rewards of college credentials outweigh the risks of high student loan debt.
Photo of a plaque commemorating Ida B. Wells. Photo by Adam Jones, Flickr CC

As Black History Month draws to a close, it’s important to celebrate the work of Black scholars who contributed to social science research. Although the discipline has begun to recognize the foundational work of scholars like W.E.B. DuBois, academia largely excluded Black women from public intellectual space until the mid-20th century. Yet, as Patricia Hill Collins reminds us, these women leave contemporary sociologists with a long and rich intellectual legacy. This week we celebrate the (often forgotten) Black women who continue to inspire sociological studies of Black feminist thought, critical race theory, and methodology.

Ida B. Wells (1862-1931) was a pioneering social analyst and activist who wrote and protested against many forms of racism and sexism during the late 19th and early 20th centuries. She protested Jim Crow segregation laws, founded a Black women’s suffrage organization, and became one of the founding members of the NAACP. But Wells is best known for her work on lynchings and her international anti-lynching campaign. While Wells is most commonly envisioned as a journalist by trade, much of her work has inspired sociological research. This is especially true of her most famous works on lynchings, Southern Horrors (1892) and The Red Record (1895).
In Southern Horrors (1892), Wells challenged the common justification for lynchings of Black men for rape and other crimes involving white women. She adamantly criticized white newspaper coverage of lynchings that induced fear-mongering around interracial sex and framed Black men as criminals deserving of this form of mob violence. Using reports and media coverage of lynchings – including a lynching of three of her close friends – she demonstrated that lynchings were not responses to crime, but rather tools of political and economic control by white elites to maintain their dominance. In The Red Record (1895), she used lynching statistics from the Chicago Tribune to debunk rape myths, and demonstrated how the pillars of democratic society, such as right to a fair trial and equality before the law, did not extend to African American men and women.
Anna Julia Cooper (1858-1964) was an avid educator and public speaker. In 1892, she published her first book, A Voice from the South: By a Black Woman of the South. It was one of the first texts to highlight the race- and gender-specific conditions Black women encountered in the aftermath of Reconstruction. Cooper argued that Black women’s and girls’ educational attainment was vital for the overall progress of Black Americans. In doing so, she challenged notions that Black Americans’ plight was synonymous with Black men’s struggle. While Cooper’s work has been criticized for its emphasis on racial uplift and respectability politics, several Black feminists credit her work as crucial for understanding intersectionality, a fundamentally important idea in sociological scholarship today.
As one of the first Black editors for an American Sociological Association journal, Jacquelyn Mary Johnson Jackson (1932-2004) made significant advances in medical sociology. Her work focused on the process of aging in Black communities. Jackson dismantled assumptions that aging occurs in a vacuum. Instead, her scholarship linked Black aging to broader social conditions of inequality such as housing and transportation. But beyond scholarly research, Jackson sought to develop socially relevant research that could reach the populations of interest. As such, she identified as both a scholar and activist and sought to use her work as a tool for liberation.

Together, these Black women scholars challenged leading assumptions regarding biological and cultural inferiority, Black criminality, and patriarchy from both white and Black men. Their work and commitment to scholarship demonstrates how sociology may be used as a tool for social justice. Recent developments such as the #CiteBlackWomen campaign draw long-overdue attention to their work, encouraging the scholarly community to cite Wells, Cooper, Jackson, and other Black women scholars in our research and syllabi.