We see the side of a person, a police radio and handcuffs looped onto a belt. They are wearing a blue shirt and blue pants. Image used via CC0.

Complaint Process
In recent years, many initiatives have worked to systematically track and analyze data on police complaints in jurisdictions such as Chicago. However, obtaining accurate data on police is notoriously difficult, because the primary mechanism for oversight is often “internal affairs” – the police themselves. In other words, if someone wants to voice a grievance, they are often required to make the complaint to the very organization that harmed them – an obvious conflict of interest.

When complaints are made, very few are “sustained” or deemed valid by colleagues of the police officer. Social scientists have found that between 2% and 28% of complaints are actually sustained, which may well be an overestimate. Moreover, complaints by Black citizens are even less likely to be sustained.

Bad Apples?

Is the solution as simple as removing “bad apples” with numerous police complaints from the police force? As is common when society faces a difficult problem, we tend to gravitate towards easy solutions – such as scapegoating. Research suggests that a small portion of officers (4% – 12%) were responsible for a relatively large share (20% – 41%) of filed complaints. Yet the majority of complaints are spread throughout the department. In other words, there are not just a few bad apples spoiling the bunch – the tree itself may be bearing rotten fruit.

Systemic Change

In recent decades, police departments have adopted initiatives, such as civilian review boards, that foster greater inclusion of the community in addressing complaints. However, these initiatives have had mixed results and have been criticized for their exclusion of racially marginalized community members.

Beyond civilian review boards, cities such as Baltimore, Los Angeles, New Orleans, New York, and Denver have taken action to hold spaces for direct, face-to-face dialogue between complainants and police. Both traditional and restorative justice models of mediation have led to greater satisfaction, in tune with the spirit of “community policing,” and have fostered healing.

As is the case with controlling crime more generally, this research shows that the problem is not as simple as identifying and tossing out a few bad apples – and that police, policy makers, and the community must look to system-level change rather than placing the entirety of blame on individual scapegoats.

A man sits in front of a document, cup of coffee, and laptop, his head resting in his hands. Sunlight streams through a window to the left. Image used under CC0.

Today “help wanted” signs are commonplace; restaurants, shops, and cafes have temporarily closed or have cut back on hours due to staffing shortages. “Nobody wants to work,” the message goes. Some businesses now offer higher wages, benefits, and other incentives to draw in low-wage workers. All the same, “the great resignation” has been met with alarm across the country, from the halls of Congress to the ivory tower.

In America, where work is seen as virtuous, widespread resignations are certainly surprising. How does the fact that so many are walking away from their jobs differ from what we’ve observed in the past, particularly in terms of frustrations about labor instability, declining benefits, and job insecurity? Sociological research on work, precarity, expectations, and emotions provides cultural context on the specificity and significance of “the great resignation.”

Individualism and Work

The importance of individualism in American culture is clear in the workplace. Unlike after World War II, when strong labor unions and a broad safety net ensured reliable work and ample benefits (for mostly white workers), instability and precarity are hallmarks of today’s workplace. A pro-work, individualist ethos values individuals’ flexibility, adaptability, and “hustle.” When workers are laid off due to shifting market forces and the profit motives of corporate executives, workers internalize the blame. Instead of blaming executives for prioritizing stock prices over workers, or organizing to demand more job security, the cultural emphasis on individual responsibility encourages workers to devote their energy to improving themselves and making themselves more attractive for the jobs that are available.

Expectations and Experiences

For many, the pandemic offered a brief glimpse into a different world of work, with healthier work-life balance and temporary (if meaningful) government assistance. Why and how have American workers come to expect unpredictable work conditions and meager benefits? The bipartisan, neoliberal consensus that took hold in the latter part of the twentieth century saw a reduction in government intervention into the social sphere. At the same time, a bipartisan pro-business political agenda reshaped how workers thought of themselves and their employers. Workers became individualistic actors or “companies of one” who looked out for themselves and their own interests instead of fighting for improved conditions. Members of today’s “precariat” – the broad class of workers facing unstable and precarious work – weather instability by expecting little from employers or the government while demanding more of themselves.

Generational Changes

Researchers have identified generational differences in expectations of work. Survey data shows that Baby Boomers experience greater difficulty with workplace instability and the emerging individualist ethos. On the other hand, younger generations – more accustomed to this precarity – manage the tumult with greater skill. These generational disparities in how insecurity is perceived have real implications for worker well-being and family dynamics.

Emotions

Scholars have also examined the central role emotions play in setting expectations of work and employers, as well as the broad realm of “emotional management” industries that help make uncertainty bearable for workers. Instead of improving workplace conditions for workers, these “emotional management” industries provide “self-care” resources that put the burden of managing the despair and anxiety of employment uncertainty on employees themselves, rather than companies.

Image: a young white boy faces the camera, held in the arms of a person whose face we cannot see. Image license CC0.

The impact of COVID-19 on parents and children has forced us to reconsider how the U.S. approaches traditional welfare supports. A major change that parents saw in July 2021 under the American Rescue Plan Act (ARPA) was an increase in the value of the child tax credit (CTC) and a monthly payout of half of that credit – $300 per month for each child under 6 years and $250 per month for each child aged 6-17. Furthermore, the threshold for receiving the CTC was considerably raised – temporarily lifting millions of children above the poverty line. ‘Incrementally revolutionary’ for social welfare in the U.S., the extension and expansion of the CTC had the potential to strengthen the social safety net and have a broad social impact. Now that expansions to the CTC have been rolled back, what do we know about the CTC and how a more permanent expansion could support families?

Passed into law with bipartisan support in 1997, the CTC originally served as a tax break for middle-class taxpayers. In 2001 and again in 2008, the CTC was made refundable and more accessible to lower-income families. Since the passage of the ARPA in 2021, the CTC is now more accessible and more generous than many other forms of welfare.

In measuring the social impact of the CTC, researchers have published ample evidence that it is a worthwhile investment. A nationwide study found that when parents received the CTC their children were less likely to be physically injured and had fewer behavioral problems. Because children living in poverty are up to nine times more likely to fall victim to maltreatment and suffer from poor overall health, the CTC provides additional economic stability to lower-income parents.

Studies of international programs similar to the CTC have found that increased payments were associated with lower levels of ADHD, physical aggression, and maternal depression, and with better emotional/anxiety scores among children. Experts in the U.S. have predicted that an increased investment in the CTC would have similar individual and social health impacts, lift millions of children out of poverty, and save billions of dollars in the future.

Today, with COVID-19 spurring conversations and the realization that U.S. welfare is in need of an update, policy makers have a “charcuterie board” of welfare reform choices. Of the more savory variety, there are work-oriented programs, which would moderately decrease poverty and decrease unemployment. Then there are some sweeter options that would dramatically reduce poverty but increase unemployment. Arraying these options, a nationwide, interdisciplinary committee of experts has made four recommendations based on changes in unemployment and child poverty. Regardless of policy makers’ palate preferences, increasing the CTC would both decrease poverty among families by over 9% and decrease unemployment by over half a million jobs – a sweet and savory option.

On December 15th, 2021, the monthly CTC payments directed to parents expired. In other words, parents in dire straits are no longer receiving necessary financial support. Congressional debate on the Build Back Better bill (BBB), which could extend the CTC, provide universal pre-K education, establish national paid leave for caregiving or illness, and make other social investments, has languished. However, for a brief period, we saw evidence of the power of expanding welfare provisions like the CTC.

Originally posted February 13, 2020.

Over one million people will get engaged on Valentine’s Day, and as a result, diamond sales usually tick up around this time. Diamonds are both historical and cultural objects; they carry meaning for many — symbolizing love, commitment, and prestige. Diamonds are highly coveted objects, and scholars have found that about 90 percent of American women own at least one diamond. In the 1990s, war spread throughout West Africa over these precious pieces of carbon, as armed political groups vied for control over diamond mines and their profits.

Given their role in financing brutal West African civil wars, diamonds became associated with violence and international refugee crises, rather than financial prosperity and love. Diamonds became pejoratively known as blood diamonds, or conflict diamonds, and consumers became more likely to perceive diamonds as the result of large scale violence and rape.  As a result, major diamond producers have attempted to reconstruct the symbolic meaning of diamonds, turning them into symbols of international development and hope.
As the diamond trade came to be seen as immoral and socially unjust, new global norms emerged around corporate and consumer responsibility. Non-governmental organizations (NGOs) lobbied the diamond industry to change its behavior and its support of conflict mines, while simultaneously creating new global norms and expectations. In the early 2000s, international NGOs, governments, and the diamond industry came together to develop the Kimberley Process — to stop the trade of conflict diamonds. Today, 75 countries participate, accounting for 99% of the global diamond trade.
Bieri & Boli argue that when NGOs urge companies to employ social responsibility in their commercial practice, they are mobilizing a global moral order. Diamonds provide an example of how symbols, products, and meaning are socially and historically constructed and how this meaning can change over time. The case of blood diamonds also illustrates how changing global norms about what is and is not acceptable can redefine the expectations of how industries conduct business.
Members of a trade union on strike in Sydney, Australia. Image courtesy of Stilgherrian, CC BY 2.0.

The United States is experiencing a growing number of strikes across the country. Record numbers of job openings, employee departures, and desperation among employers across sectors are empowering workers to push for change. But how are strikes today different from those in the recent past? And what might research predict about what comes next?

Economic strikes, when workers withhold their labor to pressure employers to raise pay or improve working conditions, are risky for workers. Employers hold the right to permanently replace striking workers, and “strike-breakers” (people hired to replace strikers) often gain legal protections in their new positions. However, strikers today seem to have the wind at their back. According to the Bureau of Labor Statistics, in recent months the U.S. has seen the highest rates of worker “quits” in decades. Furthermore, research suggests that workers are choosing not to return to work, even as COVID-19 unemployment benefits are reduced or eliminated.

Research on the nature of contemporary strikes has shown that they have been largely defensive, where workers were pushed to the breaking point and struck reactively. Offensive strikes, in contrast, arise during more opportune climates and are initiated by workers. Under these opportune conditions, with dwindling labor competition, workers gain some degree of leverage at the bargaining table with management.

Sociological research has tracked union membership and its effects on inequality. For example, Western and Rosenfeld report that between 1973 and 2007, US private sector union membership fell from 34 to 8 percent for men and from 16 to 6 percent for women. Numerous studies have tied this decline in unionization to wage inequality and earnings instability.

In recent years, unions have increasingly engaged with coalitions and community groups interested in social change. By organizing with other groups, workers connect and create networks that address mutual concerns. Social issues such as fair wages, organizational policies, and the exportation of jobs are then materialized and humanized during strikes – giving these issues a platform for societal discussion. This empowerment through favorable conditions, paired with heightened cooperation with social change coalitions, may be forming a perfect storm for worker-initiated strikes.

In today’s era of a globalized workforce, ongoing public health crises, social media, and strike activity, another wave of social change may be in the air.

A mother holds an infant in front of a set of curtains. The room is dark but there is light and the shadows of trees beyond the curtains. Image via pixabay, Pixabay License.

The new Netflix show, Maid, based on the best-selling memoir by Stephanie Land, chronicles a mother’s journey out of domestic violence and towards safety. The story offers an intimate portrait of the many barriers facing impoverished mothers, including the never-ending obstacles in securing government assistance.

Sociological research has consistently found that the welfare system inadequately serves the poor. From red tape to contradictory policies, government assistance is notoriously difficult to navigate. Further, welfare is highly stigmatized in the United States, with shame and coercion baked into its process.

Due to gendered expectations of parenting, mothers face increased scrutiny about their children’s well-being. In particular, mothers of low socioeconomic status are often harshly judged for their parenting without consideration of the structural inequities they face. Mothers seeking assistance from the welfare system are often judged because of cultural stereotypes about motherhood, poverty, and government assistance.

The U.S. welfare system has been a contentious subject for decades, with public perceptions of poverty influencing the social safety net. The infamous, derogatory image of the “welfare queen” – an allegedly lazy or irresponsible woman who exploits government programs – demonstrates how racist images of poverty and motherhood have directly impacted policy making. This body of work takes a historical perspective on welfare and motherhood to consider how gender and racial stereotypes influence public policies.

Much research directly contradicts the welfare queen trope, showing instead how impoverished families have fallen through the cracks of the welfare system. This work highlights the astounding income inequality in the contemporary United States, as well as the resourcefulness and resiliency of impoverished families and individuals as they struggle to survive on little-to-no resources.

Image description: Mohammed, a Somali exile, sits in a chair on the right-hand side of the image. His children sit on the floor around him as they discuss art. Art covers the wall. Creating cultural products is one way that communities process trauma. Image courtesy of UNHCR, CC BY-NC 2.0.

Originally published April 12, 2021

Scientific developments in the field of epigenetics have called attention to intergenerational transfers of trauma. We now know that traumatic experiences can be passed down through the genes to the children, and even grandchildren, of the survivors of horrific experiences like the Holocaust or American slavery. Sociology can help show how past trauma is passed down through social ties and reveal its effects on current health and wellbeing. These social consequences of trauma could be even more powerful than the genetic impacts, affecting group dynamics, identity, history, and culture. In addition to what is passed down, sociological research also provides examples of how groups are managing these social effects, in both helpful and harmful ways.

Cultural Trauma and Group Identity
Cultural sociologists assert that in addition to individual bodily and psychiatric trauma, there is also collective “cultural trauma” when groups experience horrific events. This collective trauma compounds and complicates individual effects. In order for the process of cultural trauma to occur, the group must first recognize that a great evil has been done to them, and construct a cohesive and shared narrative that includes perpetration and victimhood. Then this narrative becomes incorporated into that group’s collective memory as an enduring aspect of their identity, like the Holocaust has been for Jews or collective memory of slavery for Black Americans.
Both perpetrators and victims of violence must contend with the horrific event in some way, as it is now permanently associated with their group. This can occur either through avoidance of the difficult past, or stigma management practices like acknowledgment, denial, and silencing.

Cultural Trauma and Group Conflict: Violence Begets Violence

Sometimes, this cultural trauma process results in further violence. As the group comes to understand the harms they have suffered and assign responsibility, they can seek violent retaliation against the offending perpetrators. Examples include the bombing of Pearl Harbor (and subsequent Japanese internment and Hiroshima/Nagasaki bombings), and the 9/11 attacks leading to the U.S. War on Terror. In ex-Yugoslavia, ancient collective memories were stoked and reconstructed by elites to provoke inter-ethnic violence that led to ten years of war, genocide, and ethnic cleansing. In Hawai’i, Irwin and Umemoto trace the emotional and psychological effects of violent colonial subjugation, such as distress, outrage, and depression, to contemporary violence among Pacific Islander youth.

Memory Work: Social Solidarity and Empowerment

Sociological research also provides examples of people “making sense” of difficult pasts by doing “memory work,” which can include art, music, and other cultural production. For example, second-generation Sikhs in the U.S. are using internet spaces to challenge dominant narratives of the 1984 anti-Sikh violence in India, contributing to group solidarity, resilience, and identity within their communities here in the U.S. Similarly, the children of Vietnamese refugees are using graphic novels and hip-hop music to articulate how the Vietnam War contributes to current struggles in the Vietnamese community. This shared understanding and validation then empower communities to fight for recognition and social justice. 

When a group experiences a horrific event, the social effects live on to future generations. Understanding these effects is crucial for developing solutions to group suffering moving forward. Going through the cultural trauma process is necessary to overcome difficult pasts, but it is critical that this process occurs in a way that promotes justice and peace rather than further violence.

Video imagery courtesy of Canva, Canva free media usage.

Originally posted on March 16, 2017

The United States and the United Nations have had a closely intertwined relationship since the organization’s founding in 1945. The UN deals with a broad range of issues around the globe, and its widespread influence is often controversial. However, the United Nations continues to be instrumental in promoting crucial human rights causes, and the reach of its aid is arguably beyond compare. Despite its numerous shortcomings, the UN plays a crucial role in promoting human rights norms across the globe.

Throughout the 1990s in particular, the United Nations took on a central role in the global justice process. It organized and funded international courts following episodes of mass violence, such as the International Criminal Tribunal for Rwanda, and it made indictments for egregious crimes possible for the first time (including the crime of genocide). Sociologists find that the existence of these courts has a global impact in providing justice, and the trials seem to have a positive effect in reducing human rights violations in the long run.
The judicial process alone cannot adequately address global human rights issues — humanitarianism and diplomacy also play key roles. The United Nations arguably plays the most dominant global role in these initiatives, with monumental campaigns addressing topics like hunger, refugee needs, and climate change. The UN has been criticized for showcasing Western ideals and not taking into account cultural contexts, such as in early endeavors to reduce female genital cutting. However, the UN has made improvements, and when programs are approached as an opportunity for partnership rather than dominance, the outcomes can be quite positive. For example, the agency has taken great strides in promoting gender equality and access to education.
Video courtesy of Canva, Canva free media usage.

Originally posted November 14, 2019

As we prepare for Thanksgiving, many people look forward to sharing a warm meal with their family and friends. Others dread the holiday, gearing up to argue with their relatives or answer nosy questions. TSP has written in the past about the political minefield that holiday meals can be. This year we want to point out that the roots of difficult dinners actually run deep in everyday family mealtime. Thanksgiving, like any family mealtime, has the potential for conflict.

Scholars have documented how important meal time can be for families in terms of cultivating relationships and family intimacy. However, they also show that despite widespread belief that families should share “happy meals” together, meals can be emotionally painful and difficult for some families and family members.
Disagreements between parents and children arise at mealtime, in part, because of the meal itself. Some caregivers go to battle with “picky eaters.” Migrant parents struggle to pass cultural food traditions to children born in the United States. Low-income parents worry that their children will not like or eat the food they can afford.
Family meals also reproduce conflict between heterosexual partners. Buying, preparing, and serving food are important ways that women fulfill gendered expectations. At family mealtimes, men continue to do less work but hold more power over how and when dinner is served.
Thanksgiving, or any big holiday meal, can involve disagreements. However, that is not altogether surprising considering that everyday family meals are full of conflicts and tension.

Image: A little white girl sits on an adult’s lap in front of a laptop, her small hands covering the adults as they use the computer. Image courtesy of Nenad Stojkovic CC BY 2.0

Democrats in Congress continue to work toward passing sweeping infrastructure legislation. Part of the infrastructure packages would provide funding for childcare, including universal pre-K for three- and four-year-olds, aid for working families to pay for the costs of daycare, and paid family leave. Social science research helps place this current debate in perspective, connecting it to larger conversations about who is responsible for paying the costs of raising kids, the consequences for families of the private responsibility for childcare, and what international comparison can show us about alternatives.

Part of the question concerns whether we should think of raising children as a social, rather than individual, responsibility. Public investments in childcare, whether through public assistance to cover the cost of childcare or a public system of universal childcare, are one way that countries communicate who is responsible for reproductive labor: the work of having and caring for children. In the United States, this is often thought of as the responsibility of individual families and, historically, mothers. Feminist scholars, in particular, have critiqued the individualization of responsibility for raising children, emphasizing that the work of having and raising children benefits society across the board. Having kids creates the next generation of workers and taxpayers, carrying on both practical and cultural legacies. Scholars argue that because we all benefit from the work of reproducing the population, we should all share its costs and responsibilities.

Other wealthy Western nations handle childcare differently. For instance, in Sweden there is subsidized childcare available for all children that is considered high quality and is widely utilized. In Germany, there is greater availability of well-paying part-time jobs that can enable two-parent households to better balance the responsibilities of work with the demands of raising kids. In the United States, there is now virtually no public support for childcare. Parents are left to their own devices to figure out how to cover the time before the start of public school at age five, as well as childcare before and after school and during school vacations. The U.S. is not alone in expecting families to provide childcare; Italy, for instance, has a culture of “familialism” that expects extended family and, in particular, grandparents to provide free childcare for working families. However, as Caitlyn Collins writes, the combination of little support for families and cultural expectations that workers be fully devoted to their jobs makes raising a child particularly challenging in America.

There are two important consequences to the lack of public support for childcare in the United States. The first is economic. Mothers experience a “motherhood penalty” in overall lifetime wages when they exit the labor force to provide childcare, or they may be placed on “mommy tracks” in their professions, with lower-paying and less prestigious jobs that can better accommodate their caring responsibilities. Scholarship shows much smaller motherhood penalties in countries with more cultural and institutional support for childcare.

A second consequence of little support for caring responsibilities is emotional. As Caitlyn Collins writes, mothers in many nations feel guilt and struggle to balance the responsibility to care for their children and their jobs. However, in the United States this guilt and emotional burden is particularly acute because mothers are left almost totally on their own to bear both the practical and moral responsibility for raising children. The guilt parents feel, as well as the stress of balancing childcare responsibilities and full-time work, may be one reason that there is a larger “happiness gap” between parents and non-parents in the United States when compared to other wealthy nations that provide better public support for raising children.

The pandemic has brought a number of social realities into stark relief, including the fact that individual families have to navigate childcare on their own, never clearer than when school closings kept kids at home. As we imagine a post-pandemic future and the potential to “build back better,” we should consider what social research tells us about who should be responsible for caring for kids, the weight of that responsibility, and how public policy changes might provide better care for the nation’s youngest citizens.