Photo of flu shot clinic for veterans. Photo by Maryland GovPics, Flickr CC

After President Trump blamed California state officials for not doing enough to fight and prevent wildfires, civil servants seem to be fed up. Though often understood as emotionless state bureaucrats, frontline workers of the state — from firefighters to social workers — must often deal with suffering, emergencies, and disasters in the everyday operations of their agencies. Social science research helps us understand how state actors manage these roles and maintain their own emotional wellbeing.

The work of state agents entails balancing institutional rules and scarce state resources. Their everyday decisions are thus an essential component of administering and implementing public policy. Because they control the distribution of services, state officials can become policymakers with considerable discretion in the daily implementation of state activities. Their work not only influences how state operations impact citizens’ lives, but it also shapes citizens’ perceptions of state legitimacy.
Workers’ affective lives — their emotional challenges and commitments to institutions — impact the functioning of organizations. Unlike many politicians, scholars, or journalists, state bureaucrats have everyday contact with adversity, social problems, and vulnerable populations. For state officials who interact with the public, working with clients can be emotionally draining and even physically harmful, and civil servants suffer emotional and psychological distress as a result of their daily roles. Such exhausting interactions also harm the purposes of the organization, as bureaucrats may, in the course of routine operations, develop special preferences, antipathies, and discriminatory attitudes toward their clients.

State agents thus perform a complex balancing act, both for society and for themselves. Rather than relying on the stereotype of bureaucrats as vile and insensitive, public policy decisions must also account for organizational behavior and the struggles bureaucrats face in providing state services.

Photo of a Seminole man holding his child at an American Indian Heritage Month celebration. Photo by Los Angeles District, Flickr CC

After years of debate, the Indian Child Welfare Act (ICWA) — which sets minimum requirements for caseworkers handling state child custody proceedings involving Native children — was recently ruled unconstitutional by a Texas federal judge. The judge argued that ICWA violates constitutional rights to Equal Protection because it “elevates a child’s race over their best interest” — despite the fact that Native children are citizens of federally recognized tribes. Social science research helps us understand the historical context that necessitated ICWA’s creation: a long and troubling history of child removal from Native communities, shaped by racialized, gendered, and cultural ideas.

The ICWA was enacted in 1978, a time when Native children were being removed from their homes and placed in foster care in staggering numbers under the guise of protecting children. At that time, 25-35% of Native children were removed from their homes by state child welfare agencies or private adoption companies. And the majority (about 85%) of these children were placed outside of their families and communities, even when relatives were willing to take them. Today, despite the minimal protections offered by the ICWA, Minnesota places more Native children in foster care than any other state, where they make up 20% of children in the system.
The ICWA was not created solely in response to child removal through adoption, however. Even earlier, Native children were sent to government- or Christian-run boarding schools where teachers forced children to abandon their distinct tribal cultures — they cut Native children’s hair, forbade them from speaking their native languages or participating in cultural practices, and enforced strict discipline through corporal punishment. The boarding school era prevented generations of Native people from learning (and passing on) parenting tools. This separation of families, along with the disruption to Native cultural and spiritual practices, has been linked to symptoms similar to post-traumatic stress disorder, increased exposure to physical violence, and substance abuse in Native communities.
The removal of Native children is also couched in deep-seated racialized, gendered, and cultural notions of family, specifically the white, middle-class ideal of the nuclear family, characterized by two married parents and children. Non-Native supporters of these adoption practices often relied on stereotypes of Native women as sexualized, unmarried, and thus unfit, which pathologized Native families as neglectful. Supporters also argued that each child’s best interests should be considered on an individual basis, rather than acknowledging what tribes see as the importance of culture and identity, tribal rights, and belonging. In other words, supporters of Native adoption saw “disadvantaged” Native children who needed to be “rescued” by individual acts of goodwill (from white, middle-class Americans).

So what will legal reconsideration of the Indian Child Welfare Act bring? Many tribes fear that the Texas ruling sets a dangerous precedent that could dismantle the federal laws put in place to correct historical injustices like the boarding school system. Other tribal leaders see the ruling as an attempt to destroy their right to political and cultural survival through their children, while simultaneously compromising efforts to heal from the wrongdoings inflicted upon tribal communities. In the context of the current political division over the treatment of immigrant children separated from their parents at the U.S. border, such concerns warrant serious attention.

Photo by Tom Lee, Flickr CC

Originally posted October 18, 2017.

If you like Halloween, you know that witches are a popular costume choice and decoration this time of year. But the history of witches involves much more than bubbling cauldrons and flying broomsticks. Social science shows us that witchcraft has a long history of empowering marginalized groups, like women and sexual minorities, who question more traditional religious practices.

While popular images of witches often focus on magic spells, brooms, and pointed hats, witchcraft and other forms of neo-paganism have historically been used by women to push back against male-dominated religions. More traditional, hierarchical religions like Christianity and Islam often place women in a subordinate role to men, and research finds that many women are drawn to witchcraft and other alternative spiritualities because they emphasize female empowerment, embodied rituals, and sexual freedom.
People who practice witchcraft and neo-paganism typically see sexuality and gender as key sites for social transformation and personal healing, pushing back against the Christian idea that sex and bodies are sinful. Since neo-paganism values sexual freedom and sexual diversity, LGBTQ folks and people practicing polyamory often feel a sense of belonging that they don’t find in other religious spaces.
This has also been true for young adults. In general, young adults practice religion and spirituality differently than older generations do. For example, millennials are the least likely to participate in traditional religious institutions or identify with one single religious belief system, but many still desire some combination of spirituality and community. The increase in portrayals of witchcraft and other neo-pagan religions in popular media has exposed younger generations to these communities, and research finds that teens are more often drawn to these alternative spiritual practices as a means of self-discovery and community rather than for the promise of magical powers.

Photo of a protester holding a sign that says, “Stop deportation! I need my daddy,” with a picture of a child. Photo by Fibonacci Blue, Flickr CC

This post was created in collaboration with the Minnesota Journalism Center

“A Way Out of the Immigration Crisis.” “Like it or Not, Immigrant Children are Our Future.” “Trump Talks Tough on Immigration in Nevada but it Could Backfire.”

All three of these news headlines were published by mainstream news outlets in the United States in September 2018. All three portray immigration in a negative light. While much news and media coverage across the globe portrays immigration negatively, researchers who study this area also identify media outlets that include more positive coverage. 

One thing scholars have found is that immigration coverage has increased rapidly since 2004. In U.S. coverage, there is a heavy focus on “immigration reform” and “tougher border control” as solutions to the immigration “problem.” Across the pond in the United Kingdom, polls have ranked immigration as one of citizens’ top concerns, and coverage there is overwhelmingly negative and centered on conflict, relying on crime frames and descriptions of immigrants as “illegal” or “failed.”
But not all publications discuss immigration in the same way. For example, scholars have found that African-American media outlets publish immigration stories with racial frames that depict immigrants as allies, while other major mainstream media outlets frequently utilize crime frames that discuss the legal status of immigrants.
Research also suggests that conflict-oriented news coverage of immigration contributes to polarized public opinion about immigration policies in digital spaces, including the comment sections of mainstream news sites. Conversely, social and mobile media in Australia have been used as a platform for asylum-seekers in detention to share their experiences with the public directly. Journalists and citizens alike have collaborated with detained asylum-seekers to create journalistic narratives that raise awareness of human rights violations unfolding in offshore detainment centers. Though not embraced by all consumers, such stories about immigration appear to cultivate expressions of empathy from audiences.

Photo by PictureCapital, Flickr CC

People invent words and definitions to help navigate the world around them. Once created, such labels can have monumental impacts. The word “genocide” is one example of a term that holds meaning for victims, perpetrators, and those who watch violence unfold. Over the past few years, debates have raged as to whether or not to call the Burmese state’s violence against the Rohingya “genocide.” Such debates often privilege the label, rather than focusing on the everyday violence experienced by civilians. Sociologists seek to understand the meaning, use, and consequences of labels like genocide.

Individuals and groups use institutions to construct consensus about labels. But people use terms in different ways, and labels often change in meaning over time. The term “genocide” was coined by the lawyer Raphael Lemkin in the aftermath of the Holocaust. The United Nations adopted this term and formalized genocide as a crime in 1951, but the meaning of genocide continues to be contested. Some academics, for example, advocate for including political groups as recognized targets of genocide, alongside collectivities that are already included, like ethnic or religious groups.
Debates about labels have real effects. In the case of genocide, such implications are most directly felt by populations affected by violence. Victims can feel that their loss is recognized and mourned when appropriate labels are used, while an insufficient label may promote impunity for past crimes. Because some theorists argue that acknowledgement of past wrongdoing is central to healing, the use of fitting labels takes on even greater practical importance.
Despite these important considerations, some scholars and activists express concern that focusing on labels does more harm than good. The label of “genocide,” though critically important to survivors and advocates, does not come with legal obligations to intervene. While policymakers and activists discuss the relevance of the term “genocide” in Burma, atrocity crimes continue to unfold. As such, some scholars argue that the social importance of labels can distract from the immediate needs of victims of violence. From this vantage point, scholarly and advocacy attention is best directed towards serving the needs of those impacted by violence.

Photo of Indigenous women, some holding children, outside of a church in Chiapas, Mexico. Photo by Adam Jones, Flickr CC

More and more Americans have begun observing Indigenous Peoples Day, at least in part to push back against national narratives of “discovery” associated with Christopher Columbus and his commemoration. While this is a relatively recent development in the United States, other nations of the Americas have officially acknowledged the importance of their Indigenous heritage for much longer. For example, in Mexico, Día de la Raza or “The Day of the Race” was officially recognized back in 1928 and was part of a larger national project that emphasized that all Mexicans share a history of racial and cultural mixing — known as mestizaje — since the coming of the Spanish. Sociological research highlights how this recognition of Indigenous people as integral to the formation of the nation has had mixed consequences for Indigenous peoples and cultures of Mexico.

The notion of mestizaje emphasized that all Mexicans were fundamentally “mixed” individuals, or “mestizos.” It was adopted by the state in an effort to promote inclusion and national cohesion across racial lines — including Indigenous peoples — and even resulted in the removal of racial categories from the census after 1921. In this spirit, the Mexican government sought to “improve” Indigenous individuals through education, social integration, and economic development, assimilating them into the newly defined mestizo nation. While some benefited, others lost their language and cultural identity, and many more, especially those with darker skin, faced further marginalization and found their cultures pushed underground.
Due to internal and external political pressures in the 1990s, the Mexican government abandoned its assimilationist policies and began instead to protect and promote the languages and cultures of Indigenous peoples. These shifts appear to have contributed to greater ethnic pride and a greater likelihood of self-identifying as Indigenous, especially among more educated, wealthier, or urban populations. However, skin color continues to carry economic and social weight in Mexico. Non-white Mexicans tend to have lower levels of education and employment and are more likely to live in poverty than their lighter-skinned peers.

So, while Mexico may still celebrate a day to acknowledge its mixed racial heritage, it is worth wondering whether there might be other, better ways to recognize and address the challenges that actual Indigenous people in the country face on a day-to-day basis.

Photo by Becky Stern, Flickr CC

The newest Apple Watch can now warn users when it detects an abnormal heartbeat. While Apple may be on the cutting edge, many people have been using apps to track their food intake and exercise for some time. Social science research demonstrates that health-tracking technology reflects larger social forces and institutions.

These health-tracking apps are part of a larger trend in American medicine that researchers call “biomedicalization,” which includes a greater focus on health (as opposed to illness), risk management, and surveillance, along with a variety of technological advances.
Benefits of these apps include empowering patients and reducing their reliance on doctors for knowledge about their own bodies, which — as many of the apps advertise — may save users time and money by allowing them to avoid some doctor visits. However, self-tracking may put more onus on individuals to maintain their health on their own, leading to blame for those who do not take advantage of this technology. Further, using this technology could strain doctor-patient relationships if doctors believe patients are undermining their authority.

As more and more Americans use smartphones, the promise of digital technology, including health-tracking apps, for reducing existing health disparities grows. However, the Pew Research Center shows that large income and educational gaps still exist in smartphone use, meaning the health benefits of such technology for the broader population — as well as its potential downsides — may be a long way off.

Photo by Jeffrey, Flickr CC

This post was created in collaboration with the Minnesota Journalism Center

From FiveThirtyEight to the front page of the local paper, data journalism is on the rise at media outlets worldwide. As early as 2012, Columbia Journalism Review published reports featuring examples of local and regional outlets beginning to publish stories with graphs, charts, and visualizations online. In the case of New York City news media in particular, data, analytic, and platform-based positions now account for nine percent of all media jobs — marking considerable growth since 2010. Studies also show that, in today’s journalistic job market, entry-level journalists are often expected to have skills in data journalism, social media, and analytics in addition to traditional reporting and editing skills. Social science research shows how social forces contribute to this shift. 

Legacy media organizations including the Los Angeles Times and the Washington Post produce news at breakneck speed in a 24/7 news cycle, and are constantly innovating to find the most profitable and efficient methods to distribute news to the public. Not only are online media able to host the results and illustrations of large-scale data analysis, but some outlets also use computational or algorithmic programs that automatically convert batches of data into news stories for publication.
The rise of data journalism is also self-reinforcing: As data becomes a central fixture in newsrooms worldwide, higher education institutions are developing programs to train journalists in data, analytics, and programming. Established data journalism programs at higher education institutions include Columbia Journalism School and the University of Missouri’s M.S. in Data Science & Analytics, while other institutions offer data journalism courses taught by part-time, adjunct, and/or visiting instructors who specialize in the field. Not surprisingly, social scientists are beginning to track and analyze these programs.

Sign in a store that says “We Accept SNAP.” Photo by ajmexico, Flickr CC

Recently, Trump advisor Stephen Miller announced plans to bar documented immigrants from citizenship if they or their families have ever used social assistance programs such as food stamps or welfare. Such action reflects stereotypes about who uses social assistance — in the United States, people of color take the blame. Not only are these stereotypes often incorrect, they are also deeply rooted in a long history of race and racism in America.

It is important to understand that racial minorities and immigrants do not necessarily use more public resources than native-born whites. While these groups do tend to have lower incomes and levels of education than native-born whites, research shows that they do not use social assistance programs excessively when compared with other groups.
Americans’ attitudes towards welfare — particularly myths that certain groups overuse programs such as welfare and food stamps — are heavily rooted in the politics of race and racism. In fact, several scholars have illustrated how political and ideological opposition to social spending is shaped by racial appeals. Even in the post-Civil Rights era, political figures use implicit messaging and coded language to attack social spending programs and their recipients, subtly implying that racial minorities overuse such programs and thus perpetuating racist narratives.
Miller’s plan to bar citizenship for immigrants who have used social spending programs must also be understood as a consequence of historical racism in the American welfare state. During the 19th and 20th centuries, white working-class immigrants from a variety of European countries accessed social spending programs, opportunities for home ownership, and union membership due to their racial privilege. On the other hand, Blacks and other non-white groups — including non-white immigrants — were denied the same opportunities. This heightened racial inequality while simultaneously validating racist beliefs about minorities and immigrants. In short, while Miller’s plan seems to focus primarily on immigration, it is most certainly also about race.

Photo by Steven Depolo, Flickr CC

Originally posted September 14, 2016.

It’s September, which means students are zipping up their backpacks and sharpening their pencils for a new school year. For many kids, however, disciplinary actions like suspension and detention make school feel less like a place of learning and more like a minefield for getting into trouble. Some schools are experimenting with restorative justice practices to address disruptive behavior in place of more traditional measures that often involve missing class. These new policies tend to take a lot of time and effort to implement, and very little research has examined the efficacy of restorative justice initiatives. However, research points to an array of problems with the more traditional, exclusionary methods educators have relied on in the past.

Many schools have increased their use of punitive discipline and zero tolerance policies, despite drops in school-based delinquency. A shift in school disciplinary procedures would likely result in fewer days of missed tests and lectures for African American students, who are the most likely to receive suspension as a punishment, especially in more racially diverse schools. Research shows that black students receive the brunt of disciplinary action even when overall delinquent behavior in school is low, because teachers and administrators perceive them as threatening day-to-day proceedings.
Educators often evaluate certain behaviors and mannerisms, like punctuality, quiet voices, and particular styles of dress, as indicative of being good students. These perceptions of good behavior often stem from teachers’ raced and classed biases regarding what a model student looks like. But many of the characteristics that teachers think mark bad apples, like tardiness and inconsistent attendance, are in fact red flags that a student is at risk of dropping out of school. And new research finds that exclusionary punishments like detention and suspension lead to lower test scores and increased tensions between teachers and students.