COVID-19 may be bringing long-term changes to workplaces and leisure activities as people become more attuned to the threat of infectious disease. But our shock, surprise, and general inability to deal with the virus also tell us something about how much our relationship with disease has changed.

Graph showing the birth rate, death rate, and total population during each of the five stages of the epidemiological transition. Image via Wikimedia Commons.

What scientists call the “epidemiological transition” has drastically increased the typical age at death. In the first two phases of the transition, many people died young, often of infection. Advances in medicine and public health pushed the age of mortality back, and in later phases the biggest killers became degenerative diseases like heart disease and cancer. In phase four, our current phase, we have the technology to delay those degenerative diseases, and we occasionally fight emerging infections like AIDS or COVID-19. Of course, local context matters; although the general model above seems to fit the experience of many societies over a long period of time, it is not deterministic.


Even before the epidemiological transition, not everyone faced the same risk of contracting a deadly infection. Data from the urban U.S. shows that the level of mortality white Americans experienced during the 1918 flu (a level demographers consider a once-in-a-lifetime event) matched the level of mortality nonwhite Americans experienced in every county in every year prior to 1918.

Rise of new infectious diseases

Clearly, as we are seeing today, the epidemiological transition isn’t a smooth line. There is also considerable year-to-year and place-to-place variability, and new diseases can cause a sharp uptick in infectious disease deaths. For instance, the emergence of AIDS in the 1980s was responsible for a rise in infectious mortality and demonstrated the need to be prepared for new diseases. 

In just a few short weeks, COVID-19 became a leading cause of death in the United States. The pandemic is a reminder that, despite all of our health advances and the broader long-term shift from high rates of childhood mortality to high rates of degenerative disease among elders, we are not beyond the disruptions of infectious disease.

Astrological signs from alchemical text entitled “Opus medico-chymicum” published in 1618 by Johann Daniel Mylius, via Wikimedia Commons.

Astrology is on the rise, and a recent New Yorker article argues that 30% of Americans now believe in astrology. This spike in belief has been tied to astrology’s popularity on the internet and social media. Astrological apps like Co-star and Align have gained traction, achieving millions of downloads a year, and mystical services more generally are generating $2.2 billion annually. But why is astrology on the rise? And what does sociology have to say about its practice?

During the 1970s, astrology was marginalized and socially stigmatized, considered part of the American counter-culture. The rise of religious “nones” and the “spiritual but not religious” category have led scholars to consider how belief systems once considered alternative may be becoming more mainstream. Scholars have found that even spiritual beliefs that are not part of organized religion may be highly organized in generating meaning and community, particularly in unsettled times. Given stressors like global warming, economic instability, and the recent COVID-19 pandemic, millennials may be turning to belief structures once considered alternative to find community and to grapple with uncertainty.

A young girl blows out a birthday candle with help from her grandmother and brother. In times of economic recession, many individuals experiencing hardship receive help from extended family members. Photo via Wallpaper Flare.

A few weeks ago, the Dow Jones fell 20% from its high. To many, this was an indicator of great uncertainty about the future of the market and a possible sign of a coming recession. And this was even before the crisis of COVID-19. In these uncertain economic times, we look back at how the Great Recession of 2008 shaped families’ decision-making, including whether to have children, where to live, and how much to rely on family members for financial support. Perhaps this recent history can help us imagine what might lie ahead.

Demographers and other social scientists are interested in the total fertility rate, or the number of live births a woman has over her child-bearing years. After the Great Recession, the fertility rate fell below the replacement rate (roughly two children per woman) for the first time since 1987. Scholars attribute this change in fertility to increased economic uncertainty; people do not feel confident about having a child if they are not sure what will come next. In fact, fertility fell most in the states facing the greatest uncertainty: those hit hardest by the recession and “red states” concerned about the economic future under Obama.
  • Schneider, Daniel. 2015. “The Great Recession, Fertility, and Uncertainty: Evidence From the United States.” Journal of Marriage and Family 77(5):1144–56.
  • Guzzo, Karen Benjamin, and Sarah R. Hayford. 2020. “Pathways to Parenthood in Social and Family Contexts: Decade in Review, 2020.” Journal of Marriage and Family 82(1):117–44.
  • Sobotka, Tomáš, Vegard Skirbekk, and Dimiter Philipov. 2011. “Economic Recession and Fertility in the Developed World.” Population and Development Review 37(2):267–306.
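
The total fertility rate discussed above is conventionally computed by summing age-specific fertility rates across a woman’s reproductive years. A minimal sketch, using entirely hypothetical rates for illustration (not figures from the studies cited here):

```python
# Total fertility rate (TFR): sum of age-specific fertility rates
# (births per woman per year in each age group), weighted by the
# width of each age interval. All numbers below are hypothetical.

def total_fertility_rate(asfr, interval_width=5):
    """Sum age-specific fertility rates across 5-year age groups."""
    return interval_width * sum(asfr)

# Hypothetical rates for age groups 15-19, 20-24, ..., 45-49
asfr = [0.02, 0.08, 0.10, 0.10, 0.05, 0.01, 0.002]
tfr = total_fertility_rate(asfr)
print(round(tfr, 2))  # → 1.81, below the replacement level of about 2.1
```

The weighting by interval width matters: each woman spends five years in each age group, so a yearly rate of 0.10 in one group contributes half a child to lifetime fertility.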

During the Great Recession there was also an increase in the number of young adults, both single and married, living with their parents. After 2006, rates of young married adults living with their parents climbed to levels last seen around 1900, which is surprising considering that the century between 1900 and 2000 is considered the “age of independence,” when many more young people moved out and established households of their own. This effect was particularly strong for young adults with less education and fewer resources to weather the storm of the recession on their own.
The economic challenges of the late 2000s also may have led to an increase in interpersonal conflict within families. In part, this may stem from the pressure family members feel to serve as a safety net for relatives who are struggling financially. For instance, Jennifer Sherman found that some individuals experiencing financial hardship withdrew from their extended family during the Great Recession rather than asking for support. This, along with findings that giving money to family members during the recession increased an individual’s chance of experiencing financial stress of their own, raises questions about whether family networks can offer support in times of economic turmoil.
  • Sherman, Jennifer. 2013. “Surviving the Great Recession: Growing Need and the Stigmatized Safety Net.” Social Problems 60(4):409–32.
  • Pilkauskas, Natasha V., Colin Campbell, and Christopher Wimer. 2017. “Giving Unto Others: Private Financial Transfers and Hardship Among Families With Children.” Journal of Marriage and Family 79(3):705–22.

As the economic effects of COVID-19 are felt across the country, many Americans are preparing for another severe economic downturn. Understanding how the Great Recession influenced family structure and family life offers an important lens for considering how large economic events shape people’s everyday lived experiences.

Many stacks of textbooks. Photo via Pixabay.

Textbooks are more prevalent in American history courses than in any other subject, and a recent article from The New York Times revealed how geography influences what U.S. students learn. Despite having the same publisher, textbooks in California and Texas (the two largest textbook markets in the U.S.) vary wildly in content. Researchers have also found numerous inconsistencies and inaccuracies in American history textbooks, resulting in the glorification of national figures and the spread of national myths.

Depictions of violence in textbooks are also highly politicized. Episodes of violence are often muted or emphasized depending on a country’s role in the conflict. For example, conflicts with foreign groups or countries are more likely than internal conflicts to appear in textbooks. Additionally, American textbooks consistently fail to acknowledge non-American casualties in their depictions of war, portraying American soldiers as victims rather than as perpetrators of the horrors of war. Depictions of conflicts also vary over time: as time passes, textbooks move away from nationalistic narratives to focus instead on individualistic narratives.
Public figures, like Helen Keller and Abraham Lincoln, tend to be “heroified” in American textbooks. Rather than treating these public figures as flawed individuals who accomplished great things, American textbooks whitewash their personal histories. For example, textbooks overlook Keller’s fight for socialism and support of the USSR, as well as Lincoln’s racist beliefs. The heroification of these figures is meant to inspire the myth of the American Dream: that if you work hard, you can achieve anything, despite humble beginnings.
Symbolic representation of the past is important in stratified societies because it affects how individuals think about their society. Emphasizing the achievements of individuals with humble beginnings promotes the belief among American students that if they work hard they can achieve their goals, despite overwhelming structural inequalities. Furthermore, as historical knowledge is passed down from one generation to the next, it becomes institutionalized and reified, making it more difficult to challenge or question.

Hand holding a diamond. Photo via pxfuel.

Over one million people will get engaged on Valentine’s Day, and as a result, diamond sales usually tick up around this time. Diamonds are both historical and cultural objects; they carry meaning for many, symbolizing love, commitment, and prestige. Diamonds are highly coveted, and scholars have found that about 90 percent of American women own at least one diamond. In the 1990s, war spread throughout West Africa over these precious pieces of carbon, as armed political groups vied for control over diamond mines and their profits.

Given their role in financing brutal West African civil wars, diamonds became associated with violence and international refugee crises rather than with financial prosperity and love. They became pejoratively known as “blood diamonds” or “conflict diamonds,” and consumers became more likely to perceive diamonds as the products of large-scale violence and rape. As a result, major diamond producers have attempted to reconstruct the symbolic meaning of diamonds, turning them into symbols of international development and hope.
As the diamond trade came to be seen as immoral and socially unjust, new global norms emerged around corporate and consumer responsibility. Non-governmental organizations (NGOs) lobbied the diamond industry to change its practices and end its support of conflict mines, while simultaneously creating new global norms and expectations. In the early 2000s, international NGOs, governments, and the diamond industry came together to develop the Kimberley Process, a certification scheme designed to stop the trade of conflict diamonds. Today, 75 countries participate, accounting for 99% of the global diamond trade.
Bieri & Boli argue that when NGOs urge companies to employ social responsibility in their commercial practice, they are mobilizing a global moral order. Diamonds provide an example of how symbols, products, and meaning are socially and historically constructed and how this meaning can change over time. The case of blood diamonds also illustrates how changing global norms about what is and is not acceptable can redefine the expectations of how industries conduct business.

A student takes notes by hand. Photo via Wikimedia Commons.

If you believe that taking notes longhand is better than typing (especially that typing leads to verbatim transcription while writing helps with processing ideas), you have probably seen a reference to Mueller and Oppenheimer (2014). We even featured it in a TROT last fall. “The Pen Is Mightier Than the Keyboard” has over 900 citations on Google Scholar and is also a staple on Twitter, blogs, and news articles. According to Altmetric, it has been mentioned in 224 stories from 137 news outlets in the two years since it was published, and linked to on Twitter almost 3,000 times.

But new research suggests that its fame was premature. “How Much Mightier Is the Pen than the Keyboard for Note-Taking? A Replication and Extension of Mueller and Oppenheimer” argues that research has not yet determined whether writing or typing is categorically better for class achievement. The new study, a direct replication, did find a slight advantage for those taking notes by hand, but unlike in the original study, the differences were not statistically significant.
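As a rough illustration of what “a slight advantage that is not statistically significant” means, the sketch below computes Welch’s t-statistic for two small groups of entirely hypothetical quiz scores (these numbers are invented for illustration, not taken from either study). With small samples, a |t| below roughly 2 generally fails to reach significance at the conventional 0.05 level:

```python
# Hypothetical example: a small mean difference between groups can
# still fail to reach statistical significance.
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t-statistic for two independent samples."""
    se = (variance(a) / len(a) + variance(b) / len(b)) ** 0.5
    return (mean(a) - mean(b)) / se

longhand = [7, 8, 6, 9, 7, 8, 6, 7]  # hypothetical quiz scores
laptop = [6, 7, 7, 8, 6, 7, 5, 8]
t = welch_t(longhand, laptop)
print(round(t, 2))  # → 0.97, well below ~2, so not significant at 0.05
```

Here the longhand group scores half a point higher on average, yet the difference is easily attributable to chance with samples this small; that is the kind of result the replication reports.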
The new study also expanded the original by including a group with eWriters, an electronic form of notetaking that allows students to write on a screen. As our original blog noted, much of the research on laptops in the classroom revolves around their potential to be distractions, and research on notetaking needs to take into account advances in technology that could lead to better notetaking environments as well as assist students with disabilities. Morehead, Dunlosky, and Rawson find that eWriters, although not in common use, could be a bridge between paper and the distraction-laden environment of laptops.

A volunteer donates blood. Photo via pxfuel.

In the past year, the American Red Cross issued several statements regarding critical blood shortages in various locations throughout the United States. Blood shortages are not unique to the United States; a recent study by the World Health Organization found that 107 of 180 countries have insufficient blood supplies. While organizations like the American Red Cross try to remedy blood shortages, sociologists have found that shortages are closely related to donors’ feelings of altruism and the existing structures of donor organizations.

Social psychologists have explained the decision to give blood in terms of altruism, acting in the interest of others, while sociologists tend to explain blood donation in terms of organizations and institutions. Voluntary donations have historically been portrayed as more desirable or as a civic duty, but scholars note that the most common reason for not giving blood is simply not being asked. They also find that personal motivations (such as a general desire to help, sense of duty, and empathy) are more likely to be strengthened with each donation, while external motivations (emergencies, peer pressure, etc.) are likely to decrease over time.
As a result, donation centers have been encouraged not to pay donors for fear that this would discourage altruistic givers. Paying donors raised other concerns as well, such as the belief that payment would encourage exploitative relationships between economically unstable individuals and donation centers. There were also fears that paid blood was unsafe blood, as payment might motivate high-risk groups to lie about their status for money.
Altruism is not random or individual; it is driven by institutions. For example, in places where the Red Cross is prevalent, people involved in religious or volunteer organizations donate the most blood. In countries where independent blood banks operate, however, this is not the case. In fact, state systems, according to Healy, tend to have larger donor bases. Thus, organizational structures, rather than individual desire to give, largely drive blood donation.

Photo of a plaque commemorating Ida B. Wells. Photo by Adam Jones, Flickr CC

As Black History Month draws to a close, it’s important to celebrate the work of Black scholars who contributed to social science research. Although the discipline has begun to recognize the foundational work of scholars like W.E.B. DuBois, academia largely excluded Black women from public intellectual space until the mid-20th century. Yet, as Patricia Hill Collins reminds us, they leave contemporary sociologists with a long and rich intellectual legacy. This week we celebrate the (often forgotten) Black women who continue to inspire sociological studies regarding Black feminist thought, critical race theory, and methodology.

Ida B. Wells (1862-1931) was a pioneering social analyst and activist who wrote and protested against many forms of racism and sexism during the late 19th and early 20th centuries. She protested Jim Crow segregation laws, founded a Black women’s suffrage organization, and became one of the founding members of the NAACP. But Wells is best known for her work on lynchings and her international anti-lynching campaign. While Wells is most commonly remembered as a journalist by trade, much of her work has inspired sociological research. This is especially true of her most famous works on lynching, Southern Horrors (1892) and The Red Record (1895).
In Southern Horrors (1892), Wells challenged the common justification for lynchings of Black men for rape and other crimes involving white women. She adamantly criticized white newspaper coverage of lynchings that stoked fear-mongering around interracial sex and framed Black men as criminals deserving of this form of mob violence. Using reports and media coverage of lynchings, including the lynching of three of her close friends, she demonstrated that lynchings were not responses to crime, but rather tools of political and economic control wielded by white elites to maintain their dominance. In The Red Record (1895), she used lynching statistics from the Chicago Tribune to debunk rape myths and demonstrated how the pillars of democratic society, such as the right to a fair trial and equality before the law, did not extend to African American men and women.
Anna Julia Cooper (1858-1964) was an avid educator and public speaker. In 1892, she published her first book, A Voice from the South: By A Black Woman of the South. It was one of the first texts to highlight the race- and gender-specific conditions Black women encountered in the aftermath of Reconstruction. Cooper argued that Black women’s and girls’ educational attainment was vital for the overall progress of Black Americans. In doing so, she challenged notions that Black Americans’ plight was synonymous with Black men’s struggle. While Cooper’s work has been criticized for its emphasis on racial uplift and respectability politics, several Black feminists credit her work as crucial for understanding intersectionality, a fundamentally important idea in sociological scholarship today.
As one of the first Black editors for an American Sociological Association journal, Jacquelyn Mary Johnson Jackson (1932-2004) made significant advances in medical sociology. Her work focused on the process of aging in Black communities. Jackson dismantled assumptions that aging occurs in a vacuum. Instead, her scholarship linked Black aging to broader social conditions of inequality such as housing and transportation. But beyond scholarly research, Jackson sought to develop socially relevant research that could reach the populations of interest. As such, she identified as both a scholar and activist and sought to use her work as a tool for liberation.

Together, these Black women scholars challenged leading assumptions regarding biological and cultural inferiority, Black criminality, and patriarchy from both white and Black men. Their work and commitment to scholarship demonstrate how sociology may be used as a tool for social justice. Recent developments such as the #CiteBlackWomen campaign draw long-overdue attention to their work, encouraging the scholarly community to cite Wells, Cooper, Jackson, and other Black women scholars in our research and syllabi.

Student Athletes from the Sierra College Football team play in the pre-season football scrimmage at Sierra College in Rocklin, Calif. on August 20, 2016. (Photo by davidmoore326, licensed under CC BY-NC-ND 2.0)

Thanksgiving has NFL games, Christmas has the NBA, and New Year’s has college football. This season, as you sit down to watch bowl games or the College Football Playoff, check out some of the sociological college football research from our partner Engaging Sports.

Football can be a path toward economic opportunity, but scholars find race and class patterns in who takes this risky path. For example, Black players generally come from more disadvantaged areas while white players come from more advantaged ones, perhaps indicating that white players benefit from more training resources while financial necessity drives Black players.
Fans may not have a say in recruiting college athletes, but they certainly have strong opinions about the young athletes at their favored schools. Fans stay away from overtly racist language on message boards, but researchers found that a criminal record did affect fan support for prospective athletes.
Finally, both American football and the NCAA seem to constantly be dealing with scandal. Read the articles below for some context on current scandals within the NCAA and how the concussion crisis is affecting a number of sports. 

A woman walks alone in a dark alley. Photo by renee_mcgurk via Flickr.

While particular environments, situations, or objects may appear to be objectively dangerous or safe, sociologists argue otherwise. Instead, they find that beliefs about safety are subjective. While there is a physical reality of harm, beliefs about safety and danger spread through socialization rather than direct observation. For example, Simpson notes that snakes and turtles can both cause illness and death through the transmission of venom or bacteria, yet snakes are seen as dangerous and turtles as benign. In other words, danger and safety do not exist on their own; they are contextual.
Socialized beliefs about safety and danger are also raced, classed, and gendered. While statistics indicate that men are predominantly the victims of violent crime, women express greater fear of crime. This fear often acts as a form of social control by limiting women’s daily activities, like when they leave the house and what they wear. Furthermore, the construction of fear and crime is often tied to racist legacies. In the United States, white women express prejudicial fear about areas marked as “dangerous” or “sketchy,” due to the occupation of this space by men of color.
Safety and danger are also constructed at the international level, as national security is politicized. For example, instances of large-scale political violence, such as genocide, war, and acts of terrorism revolve around the social construction of an enemy. More generally, national enemies are constructed as dangerous and a threat to the safety of a nation’s people. This construction of the enemy and perception of fear can move people to join terrorist organizations, participate in genocidal regimes, and enlist in state militaries.