
A sheet with two eye holes cut out to resemble a ghost, sitting (or floating?) on a bed. Photo by Ryan Miguel Capili under Pexels license.

It’s a dark and stormy night, and the wind is howling as the trees tap, tap, tap along your window. Out of the night comes an unearthly noise, and an eerie feeling takes over you as the room becomes cold. Your mind begins to race as you feel a presence close by. It must be all in your head…mustn’t it?

Whether or not you believe in ghosts and the things that go bump in the night, the supernatural has proved to be an enduring source of intrigue for the public and for sociologists alike. We may not be able to prove the existence of the supernatural, but we can certainly look into the factors that shape and guide our experiences.

Cultic Milieu

Cultic or paranormal beliefs and experiences are both wide-ranging (including unorthodox science, magic, witchcraft, astrology, mysticism, healing practices, the occult, and more) and persistent across time. 

These beliefs are unified by the fact that they are typically viewed as deviant to the dominant culture, particularly traditional religion and mainstream science. This stimulates a tolerance for other belief systems and a sense of support, as believers in the cultic or paranormal share an experience of having to justify their beliefs to mainstream society. 

Colin Campbell argues that this cultic milieu is “an underground region where true seekers test hidden, forgotten, and forbidden knowledge.”

Supernatural Skepticism

Some scholars have also explored the process of how people come to develop cultic or paranormal beliefs. When patterns of strange and uncanny events occur, people often experience layers of doubt before concluding they have experienced a ghostly encounter. The will to believe battles against a desire to remain skeptical, especially in a highly rationalized, materialistic world. 

“Because a ghost seemingly defies rationality, the person who believes risks his or her credibility and stigmatization.”

Studying practitioners of ritual magic in London in 1983, Tanya Marie Luhrmann questioned why people practiced magic when – to the eye of the outside observer – it did not work. Luhrmann found that individuals engaged in unintentional interpretive drift (a slow shift in how someone interprets events, what events they find significant, and what patterns they notice). Over time, practitioners began to interpret events as results of their magical practice. For Luhrmann, such beliefs and practices are not so much exceptions to the modern quest for instrumental, scientific knowledge as a direct reaction to its limitations and shortcomings.

Hauntings

Some scholars focus on the social functions of ghostly hauntings. Hauntings may draw attention to loss (either of life or of opportunity) or reveal repressed or unresolved memories of individuals or communities (particularly memories of social violence). Ghosts can represent our hopes, fears, and values. Experiences with ghosts may spur action, and – whether they truly exist or not – have real effects on those who believe in them.

Gender and Race Belief Differences

Sociology has also found that social factors like race, education, and gender can influence someone’s perspective on the paranormal and supernatural, as one survey of American fears found. Women have been found to have higher rates of belief in things like ghosts, zombies, and supernatural powers, while men are more likely to believe in things like bigfoot or extraterrestrials. The results suggest the difference may lie in how material each entity is imagined to be and how closely it is tied to scientific inquiry.

Black people were found to have higher rates of belief in alien life and ghost encounters, while Asian Americans reported the greatest fear of zombies. White people were more likely to believe in UFOs and psychic healing. The cultural significance of religion or spirituality within different racial groups may help explain these findings. Education also shapes belief: those with a bachelor’s degree or higher were less likely to believe in aliens, bigfoot, ghost encounters/hauntings, and Atlantis. However, other supernatural beliefs – such as supernatural human abilities and zombies – were not affected by education.

The supernatural and paranormal have intrigued the public for centuries, and sociologists are no exception. Why and how people engage with the spooky aspects of life can often tell us more about the social world than we’d first think. As Avery Gordon writes, “To study social life one must confront the ghostly aspects of it.”

A police officer in uniform wearing a walkie-talkie and a square body-camera, by Sanderflight. Image from Wikimedia Commons is licensed under Creative Commons Attribution-ShareAlike 4.0 International.

The public is no stranger to “body-cams”. Images and videos from police body cameras are now a frequent feature in the media as a direct source of “what really happened” during contentious interactions between police and the public. But what have we learned from sociological research about body-cams? Who are they “for”?

Body-Cams

Law enforcement agencies have used recording technologies like dash-cams for years, but the rise of cell phone recordings and public demands for police accountability dramatically expanded the use of body-cams across the US. Body-cams, or cameras attached to the officer’s uniform, vary in the quality of video produced, the requirements regarding when to turn on the camera, and how the recordings are handled and by whom.

Research on Body-Cams

Police departments began implementing body-cams in the early 2010s. By 2015, a survey showed that 19% of police departments in the United States were using body-cams. One year later, in 2016, a different survey reported that 47% of police departments were using them – suggesting rapid growth in the use of body-cams during the mid-2010s. As of 2023, the use of body-cams is likely higher still, and many states have since enacted statewide body-cam requirements of varying scope for their law enforcement agencies.

Research has shown that officers’ perceptions of body-cams have been largely positive, since the cameras can aid and assist with arrests and investigations and can help address excessive use of force complaints (allegations that a police officer used more physical force than needed).

Arrests

Research on arrests and the use of body-cams shows promise. Studies have found that when body-cams are used, officers made fewer arrests but issued more citations (less serious charges than arrests). Researchers suggest that the reason for fewer arrests may be that both officers and citizens adjust their behavior because they are on camera and being recorded. This lower arrest rate reduces the burden on police and the criminal legal system and can reduce harm to the broader community.

Excessive Use of Force

The results are more mixed regarding excessive use of force: some research shows a decrease in excessive force after the implementation of body-cams and some studies show no difference. These mixed findings may be tied to whether officers have discretion, or choice, to activate their cameras. In studies that found no effect on excessive use of force, officers had high discretion and could choose when (or when not) to activate their body-cams. In short, if officers have the discretion to turn their body-cams on or off, they may be more likely to use excessive force than when they are required to turn on their body-cams during an interaction.

Complaints

For formal complaints made by community members, officers who used body-cams had fewer reported complaints against them than those who did not use body-cams. Several studies have shown a marked decline in complaints after body-cams are implemented, with one reporting a 90% reduction in complaints against police officers. Such complaints seem to decrease even when the body-cam was not turned on but was still physically visible on the officer. It appears that “being on camera” is again impacting the behavior of both officers and community members.

Future of Body-Cams

Across the board, there are fewer and fewer outright opponents of body-cams. Public discussion now centers on the accessibility of unedited recordings, limits on officer discretion over when to activate the body-cam, the privacy of bystanders to crimes, and the development of new laws regulating body-cam use. At the societal level, body-cams are generally considered an asset and a means to help both police and community members stay accountable and safe.

A child sitting at a round table, napping. Photo by RDNE Stock project is licensed under CC BY 2.0 via Pexels.

Related piece originally published February 17, 2022.

The impact of COVID-19 on parents and children has forced us to reconsider how the U.S. approaches traditional welfare supports. A major change that parents saw in July 2021 under the American Rescue Plan Act (ARPA) was an increase in the value of the child tax credit (CTC) and a monthly payout of half of that credit – $300 per month for each child under 6 and $250 per month for each child aged 6-17. Furthermore, the threshold for receiving the CTC was considerably raised – temporarily lifting millions of children above the poverty line. ‘Incrementally revolutionary’ for social welfare in the U.S., the extension and expansion of the CTC had the potential to strengthen the social safety net and have a broad social impact, but it has since been phased out (as of July 2025).

Then in 2025, the “Big Beautiful Bill” (BBB) increased the credit from $2,000 to $2,200 – of which $1,700 is refundable and will rise with inflation. Additionally, one parent must now have a work-authorized Social Security Number, a new condition for being eligible to receive the CTC. In addition to the CTC, the BBB also introduced new legislation intended to encourage Americans to have children, such as the Trump Child Savings Accounts.

With birth rates dropping to levels many consider unsustainable, new policies will be needed to continue to address the issue. And while it is uncertain whether the CTC increases the birth rate, we do know that the CTC has a number of other positive impacts on children and families.

CTC’s History and Impacts on Children

Passed into law with bipartisan support in 1997, the CTC originally served as a tax break for middle-class taxpayers. In 2001 and again in 2008, the CTC was made refundable and more accessible to lower-income families. Since the passage of the ARPA in 2021, the CTC has become more accessible and more generous than many other forms of welfare.

In measuring the social impact of the CTC, researchers have published ample evidence that it is a worthwhile investment. A nationwide study found that when parents received the CTC, their children were less likely to be physically injured and had fewer behavioral problems. Because children living in poverty are up to nine times more likely to fall victim to maltreatment and to suffer from poor overall health, the CTC provides valuable additional economic stability to lower-income parents.

International programs similar to the CTC have found that increased payments were associated with lower levels of ADHD, physical aggression, and maternal depression, and with better emotional/anxiety scores among children. Experts in the U.S. have predicted that an increased investment in the CTC would have similar individual and social health impacts, lift millions of children out of poverty, and save billions of dollars in the future.

A man sits in front of a document, cup of coffee, and laptop, his head resting in his hands. Sunlight streams through a window to the left. Image used under CC0

Originally published March 30, 2022.


Today “help wanted” signs are commonplace; restaurants, shops, and cafes have temporarily closed or have cut back on hours due to staffing shortages. “Nobody wants to work,” the message goes. Some businesses now offer higher wages, benefits, and other incentives to draw in low-wage workers. All the same, “the great resignation” has been met with alarm across the country, from the halls of Congress to the ivory tower.

In America, where work is seen as virtuous, widespread resignations are certainly surprising. How does the fact that so many are walking away from their jobs differ from what we’ve observed in the past, particularly in terms of frustrations about labor instability, declining benefits, and job insecurity? Sociological research on work, precarity, expectations, and emotions provides cultural context on the specificity and significance of “the great resignation.”

Individualism and Work

The importance of individualism in American culture is clear in the workplace. Unlike after World War II, when strong labor unions and a broad safety net ensured reliable work and ample benefits (for mostly white workers), instability and precarity are hallmarks of today’s workplace. A pro-work, individualist ethos values workers’ flexibility, adaptability, and “hustle.” When workers are laid off due to shifting market forces and the profit motives of corporate executives, they internalize the blame. Instead of blaming executives for prioritizing stock prices over workers, or organizing to demand more job security, the cultural emphasis on individual responsibility encourages workers to devote their energy to improving themselves and making themselves more attractive for the jobs that are available.

Expectations and Experiences

For many, the pandemic offered a brief glimpse into a different world of work with healthier work-life balance and temporary (if meaningful) government assistance. Why and how have American workers come to expect unpredictable work conditions and meager benefits? The bipartisan, neoliberal consensus that took hold in the latter part of the twentieth century saw a reduction in government intervention in the social sphere. At the same time, a bipartisan pro-business political agenda reshaped how workers thought of themselves and their employers. Workers became individualistic actors or “companies of one” who looked out for themselves and their own interests instead of fighting for improved conditions. Members of today’s “precariat” – the broad class of workers facing unstable and precarious work – weather instability by expecting little from employers or the government while demanding more of themselves.

Generational Changes

Researchers have identified generational differences in expectations of work. Survey data shows that Baby Boomers experience greater difficulty with workplace instability and the emerging individualist ethos. On the other hand, younger generations – more accustomed to this precarity – manage the tumult with greater skill. These generational disparities in how insecurity is perceived have real implications for worker well-being and family dynamics.

Emotions

Scholars have also examined the central role emotions play in setting expectations of work and employers, as well as the broad realm of “emotional management” industries that help make uncertainty bearable for workers. Instead of improving workplace conditions for workers, these “emotional management” industries provide “self-care” resources that put the burden of managing the despair and anxiety of employment uncertainty on employees themselves, rather than companies.

Mother’s Day is a good opportunity to surprise your mom with breakfast in bed, flowers, or a gift. It’s also a good opportunity to reflect on the challenges of motherhood, particularly in the United States, and to consider how both individual and social change can help all mothers continue to thrive. We’ve rounded up some TSP classics, along with some great scholarship on motherhood we haven’t covered, to put contemporary motherhood in context.

Moms do More at Home

Although gender norms in the United States have changed considerably over the past half century, moms are still primarily responsible for raising children. Most moms are expected to figure out how to balance full-time work and motherhood. Moms must make it work when these responsibilities conflict, like when the COVID-19 pandemic shut down public schools, leaving millions of children without daytime care.

Although gender norms in heterosexual couples are ostensibly changing, mothers spend more time caring for children and doing housework than their male partners, even when both partners work outside of the home. The “second shift” of work that moms do at home includes the “cognitive labor” of managing and scheduling family members’ time – scheduling vacations or doctors’ appointments, for instance.

Mothering Intensively and Alone

In the absence of public support for parenthood, it is particularly challenging for low-income moms to handle the responsibility of motherhood. The problem is not only that welfare support and childcare provisions are extremely limited in the United States; making matters worse, American culture tends to blame low-income moms for their poverty and heavily scrutinizes the parenting decisions of poor moms put in tough positions and struggling to make ends meet for their families.

Another factor that makes parenting challenging for all moms is the set of beliefs around “ideal motherhood.” Mothers are expected to mother “intensively,” devoting considerable time, energy, money, and emotion to their children. Although some parents wax nostalgic about their own childhoods, when they played independently with neighborhood children until the streetlights came on, or were “latch-key” kids free to play video games or watch television until their parents returned from work, they are now investing considerable amounts of time and energy in packed schedules of activities for their children and in discipline through negotiation.

Diverse Moms, Different Experiences

Sociological research has also shown that “intensive mothering” and a focus on nuclear two-parent households may not accurately reflect the experiences of all mothers. For instance, Patricia Hill Collins talks about “collective mothering,” or how Black women rely on communities of caregivers and the work of “other moms” to help raise their children in a hostile society. Dawn Marie Dow also emphasizes that Black motherhood is not necessarily incompatible with professional responsibilities, and Black mothers have long had to balance work outside of their own home with the responsibilities of motherhood.

Sociological research also shows that for some moms, the expectations that the institutions of social life have for “good motherhood” don’t fit with their reality. They experience challenging situations that require them to, for instance, prioritize the safety of their children or make tough decisions about what expenses they can cover for their child. Some moms use “inventive mothering” to figure out how to meet their children’s basic needs – diapers, for instance. Disabled moms and Black moms are particularly vulnerable to being seen as “risky” for failing to live up to the ideals of motherhood, experiencing increased surveillance and punishment from doctors’ offices, schools, and child welfare workers.

Black mothers, in particular, worry about the safety of their children in a world that often views Black children – particularly Black boys – as a threat. Black mothers’ worry about their children experiencing racism can negatively impact their health. Cynthia G. Colen and colleagues found that children’s experiences of discrimination harmed Black mothers’ health.

Gendered expectations of women also create challenges for women who cannot or do not want to become mothers. Women who experience infertility face stigma, or the sense that there is something marked or discrediting about them that contributes to others’ negative perception of them. Women who are “childfree by choice” also experience stigma.

Political and Personal Solutions?

Policy changes could ease some of the challenges mothers face. For instance, research shows that there is a smaller “happiness gap” between parents and non-parents in countries with more generous public support for raising children. Mothers also feel less guilt in countries with better social and economic support for parenthood. More generous welfare provisions could help working-class moms better meet their children’s basic needs. 

Within families, couples can work towards greater equality of responsibilities. However, studies show that most young people still expect mothers to do the majority of housework and childcare. Even when young women anticipate having more gender equality in household labor, actually implementing more egalitarian schedules proves difficult, particularly for working-class women. 

Image description: Mohammed, a Somali exile, sits in a chair on the right-hand side of the image. His children sit on the floor around him as they discuss art. Art covers the wall. Creating cultural products is one way that communities process trauma. Image courtesy of UNHCR, CC BY-NC 2.0.

Originally published April 12, 2021

Scientific developments in the field of epigenetics have called attention to intergenerational transfers of trauma. We now know that traumatic experiences can leave biological marks that are passed down to the children, and even grandchildren, of the survivors of horrific experiences like the Holocaust or American slavery. Sociology can help show how past trauma is also passed down through social ties, and what its effects are on current health and wellbeing. These social consequences of trauma could be even more powerful than the genetic impacts, affecting group dynamics, identity, history, and culture. In addition to documenting what is passed down, sociological research also provides examples of how groups are managing these social effects, in both helpful and harmful ways.

Cultural Trauma and Group Identity
Cultural sociologists assert that in addition to individual bodily and psychiatric trauma, there is also collective “cultural trauma” when groups experience horrific events. This collective trauma compounds and complicates individual effects. In order for the process of cultural trauma to occur, the group must first recognize that a great evil has been done to them, and construct a cohesive and shared narrative that includes perpetration and victimhood. Then this narrative becomes incorporated into that group’s collective memory as an enduring aspect of their identity, like the Holocaust has been for Jews or collective memory of slavery for Black Americans.
Both perpetrators and victims of violence must contend with the horrific event in some way, as it is now permanently associated with their group. This can occur either through avoidance of the difficult past, or stigma management practices like acknowledgment, denial, and silencing.

Cultural Trauma and Group Conflict: Violence Begets Violence

Sometimes, this cultural trauma process results in further violence. As the group comes to understand the harms they have suffered and assign responsibility, they can seek violent retaliation against the offending perpetrators. Examples include the bombing of Pearl Harbor (and subsequent Japanese internment and Hiroshima/Nagasaki bombings), and the 9/11 attacks leading to the U.S. War on Terror. In ex-Yugoslavia, ancient collective memories were stoked and reconstructed by elites to provoke inter-ethnic violence that led to ten years of war, genocide, and ethnic cleansing. In Hawai’i, Irwin and Umemoto trace the emotional and psychological effects of violent colonial subjugation, such as distress, outrage, and depression, to contemporary violence among Pacific Islander youth.

Memory Work: Social Solidarity and Empowerment

Sociological research also provides examples of people “making sense” of difficult pasts by doing “memory work,” which can include art, music, and other cultural production. For example, second-generation Sikhs in the U.S. are using internet spaces to challenge dominant narratives of the 1984 anti-Sikh violence in India, contributing to group solidarity, resilience, and identity within their communities here in the U.S. Similarly, the children of Vietnamese refugees are using graphic novels and hip-hop music to articulate how the Vietnam War contributes to current struggles in the Vietnamese community. This shared understanding and validation then empower communities to fight for recognition and social justice. 

When a group experiences a horrific event, the social effects live on to future generations. Understanding these effects is crucial for developing solutions to group suffering moving forward. Going through the cultural trauma process is necessary to overcome difficult pasts, but it is critical that this process occurs in a way that promotes justice and peace rather than further violence.

Video courtesy of Canva, Canva free media usage.

Originally posted November 14, 2019

As we prepare for Thanksgiving, many people look forward to sharing a warm meal with their family and friends. Others dread the holiday, gearing up to argue with their relatives or answer nosey questions. In the past, TSP has written about the political minefield that holiday meals can be. This year we want to point out that the roots of difficult dinners actually run deep in everyday family mealtime. Thanksgiving, like any family mealtime, has the potential for conflict.

Scholars have documented how important mealtime can be for families in terms of cultivating relationships and family intimacy. However, they also show that despite the widespread belief that families should share “happy meals” together, meals can be emotionally painful and difficult for some families and family members.

Disagreements between parents and children arise at mealtime, in part, because of the meal itself. Some caregivers go to battle with “picky eaters.” Migrant parents struggle to pass cultural food traditions to children born in the United States. Low-income parents worry that their children will not like or eat the food they can afford.

Family meals also reproduce conflict between heterosexual partners. Buying, preparing, and serving food are important ways that women fulfill gendered expectations. At family mealtimes, men continue to do less work but hold more power over how and when dinner is served.

Thanksgiving, or any big holiday meal, can involve disagreements. However, that is not altogether surprising considering that everyday family meals are full of conflicts and tension.

Image: A little white girl sits on an adult’s lap in front of a laptop, her small hands covering the adult’s as they use the computer. Image courtesy of Nenad Stojkovic, CC BY 2.0.

Democrats in Congress continue to work toward passing sweeping infrastructure legislation. Part of the infrastructure package would provide funding for childcare, including universal pre-K for three- and four-year-olds, aid for working families to pay for the costs of daycare, and paid family leave. Social science research helps place this current debate in perspective, connecting it to larger conversations about who is responsible for paying the costs of raising kids, the consequences for families of the private responsibility for childcare, and what international comparison can show us about alternatives.

Part of the question concerns whether we should think of raising children as a social, rather than individual, responsibility. Public investments in childcare, whether through public assistance to cover the cost of childcare or a public system of universal childcare, are one way that countries communicate who is responsible for reproductive labor: the work of having and caring for children. In the United States, this is often thought of as the responsibility of individual families and, historically, mothers. Feminist scholars, in particular, have critiqued the individualization of responsibility for raising children, emphasizing that the work of having and raising children benefits society across the board. Having kids creates the next generation of workers and taxpayers, carrying on both practical and cultural legacies. Scholars argue that because we all benefit from the work of reproducing the population, we should all share its costs and responsibilities.

Other wealthy Western nations handle childcare differently. For instance, in Sweden there is subsidized childcare available for all children that is considered high quality and is widely utilized. In Germany, there is greater availability of well-paying part-time jobs that can enable two-parent households to better balance the responsibilities of work with the demands of raising kids. In the United States, there is now virtually no public support for childcare. Parents are left to their own devices to figure out how to cover the time before the start of public school at age five, as well as childcare before or after school and during school vacations. The U.S. is not alone in expecting families to provide childcare; Italy, for instance, has a culture of “familialism” that expects extended family – and, in particular, grandparents – to provide free childcare for working families. However, as Caitlyn Collins writes, the combination of little support for families and cultural expectations that workers be fully devoted to their jobs makes raising a child particularly challenging in America.

There are two important consequences to the lack of public support for childcare in the United States. The first is economic. Mothers experience a “motherhood penalty” in overall lifetime wages when they exit the labor force to provide childcare, or they may be placed on “mommy tracks” in their professions, with lower-paying and less prestigious jobs that can better accommodate their caring responsibilities. Scholarship shows much smaller motherhood penalties in countries with more cultural and institutional support for childcare.

A second consequence of little support for caring responsibilities is emotional. As Caitlyn Collins writes, mothers in many nations feel guilt and struggle to balance the responsibility to care for their children and their jobs. However, in the United States this guilt and emotional burden is particularly acute because mothers are left almost totally on their own to bear both the practical and moral responsibility for raising children. The guilt parents feel, as well as the stress of balancing childcare responsibilities and full-time work, may be one reason that there is a larger “happiness gap” between parents and non-parents in the United States when compared to other wealthy nations that provide better public support for raising children.

The pandemic has brought a number of social realities into stark relief, including the fact that individual families have to navigate childcare on their own, never clearer than when school closings kept kids at home. As we imagine a post-pandemic future and the potential to “build back better,” we should consider what social research tells us about who should be responsible for caring for kids, the weight of that responsibility, and how public policy changes might provide better care for the nation’s youngest citizens.