Photo by PictureCapital, Flickr CC

People invent words and definitions to help navigate the world around them. Once created, such labels can have monumental impacts. The word “genocide” is one example of a term that holds meaning for victims, perpetrators, and those who watch violence unfold. Over the past few years, debates have raged over whether to call the Burmese state’s violence against the Rohingya “genocide.” Such debates often privilege the label, rather than focusing on the everyday violence experienced by civilians. Sociologists seek to understand the meaning, use, and consequences of labels like genocide.

Individuals and groups use institutions to construct consensus about labels. But people use terms in different ways, and labels often change in meaning over time. The term “genocide” was coined by the lawyer Raphael Lemkin in 1944, as the Holocaust was unfolding. The United Nations adopted the term in its 1948 Genocide Convention, which formalized genocide as a crime when it entered into force in 1951, but the meaning of genocide continues to be contested. Some academics, for example, advocate for including political groups as potential targets of genocide, in addition to collectivities that are already covered, like ethnic or religious groups.
Debates about labels have real effects. In the case of genocide, such implications are most directly felt by populations affected by violence. Victims can feel that their loss is recognized and mourned when appropriate labels are used, while an insufficient label may promote impunity for past crimes. As some theorists argue that an acknowledgement of past wrongdoing is central to healing, the use of fitting labels takes on an even more practical importance.
Despite these important considerations, some scholars and activists express concern that focusing on labels does more harm than good. The label of “genocide,” though critically important to survivors and advocates, does not come with legal obligations to intervene. While policymakers and activists discuss the relevance of the term “genocide” in Burma, atrocity crimes continue to unfold. As such, some scholars argue that the social importance of labels can distract from the immediate needs of victims of violence. From this vantage point, scholarly and advocacy attention is best directed towards serving the needs of those impacted by violence.
Photo of Indigenous Women, some holding children, outside of a Church in Chiapas, Mexico. Photo by Adam Jones, Flickr CC

More and more Americans have begun observing Indigenous Peoples Day, at least in part to push back against national narratives of “discovery” associated with Christopher Columbus and his commemoration. While this is a relatively recent development in the United States, other nations of the Americas have officially acknowledged the importance of their Indigenous heritage for much longer. For example, in Mexico, Día de la Raza or “The Day of the Race” was officially recognized back in 1928 as part of a larger national project that emphasized that all Mexicans share a history of racial and cultural mixing — known as mestizaje — since the coming of the Spanish. Sociological research highlights how this recognition of Indigenous people as integral to the formation of the nation has had mixed consequences for Indigenous peoples and cultures in Mexico.

The notion of mestizaje emphasized that all Mexicans were fundamentally “mixed” individuals, or “mestizos.” It was adopted by the State in an effort to promote inclusion and national cohesion across racial lines — including Indigenous peoples — and even resulted in the removal of racial categories from the census after 1921. In this spirit, the Mexican government sought to “improve” Indigenous individuals through education, social integration, and economic development, assimilating them into the newly defined mestizo nation. While some benefited, others lost their language and cultural identity, and many, especially those with darker skin, faced further marginalization as their identities were pushed underground.
Due to internal and external political pressures in the 1990s, the Mexican government abandoned its assimilationist policies and began instead to protect and promote the languages and cultures of Indigenous peoples. These shifts appear to have contributed to greater ethnic pride and a greater likelihood of self-identifying as Indigenous, especially among more educated, wealthier, or urban populations. However, skin color continues to carry economic and social weight in Mexico. Non-white Mexicans tend to have lower levels of education and employment and are more likely to live in poverty than their lighter-skinned peers.

So, while Mexico may still celebrate a day to acknowledge its mixed racial heritage, it is worth wondering if there might be other, better ways to recognize and address the challenges that actual Indigenous people in the country face on a day-to-day basis.

Protest calling to remove Fort Snelling in Minnesota. Photo by Fibonacci Blue, Flickr CC

In recent months, a homeless encampment of over 300 people — most of whom are American Indian — has formed along a highway noise wall in Minneapolis. Residents and Indigenous activists, who point out that much of Minneapolis is built on stolen Dakota land, have named the encampment the “Wall of Forgotten Natives.” Social and health service providers have mobilized around the encampment, and city officials have worked with community leaders to begin relocating people at the encampment to more stable housing on Red Lake Nation land. The wider context for the establishment of the camp, American Indian solidarity and resistance to disbanding the camp, and the government’s response all highlight the process of settler colonialism.

In the United States, settler colonialism refers to the control of land and its resources by white settlers who, like “regular” colonizers, seek political power and control in a new space, but who, unlike “regular” colonizers, use displacement and violence against Indigenous persons in order to eventually replace the Native population. Until recently, Indigenous people were largely absent from sociological research — an absence some have called sociology’s “complicity in the elimination of the native.” Scholars have begun to incorporate settler colonialism into research on the domination and dispossession of various racial and ethnic groups.
In Minnesota, American Indians face the consequences of settler colonialism every day: generational trauma from historical violence and boarding schools, alongside contemporary inequities between Natives and non-Natives in health, exposure to violence, and the foster care system. At the national level, the U.S. government’s urban relocation programs of the 1950s offer a further example of settler colonial logic, and help explain both contemporary homelessness among Minnesota’s urban Natives and their political response. Although these programs encouraged Natives to leave economically deprived reservations with promises of training and employment in urban areas, relocated Natives faced intense discrimination. By 1969, unemployment among urban Natives was nearly ten times the national average and Native incomes were less than half of the national poverty level.
After the U.S. government failed to assimilate Native people through relocation in the 1950s, its attempt to terminate the legal status of federally recognized tribes sparked American Indian resistance across the United States and fed into the broader social movements of the 1960s and 1970s. The American Indian Movement was founded in Minneapolis in 1968, and Minnesota remains a historically important site of Native resistance to settler colonialism. American Indians continue to resist settler colonial practices and beliefs today. One example includes Indigenous protests against federally recognized holidays like Columbus Day and Thanksgiving, which are embedded in settler colonial stories of the past that “whitewash” events and stereotype Indigenous people. Other acts of resistance include ceremonies acknowledging genocide and other violent acts by the U.S. government. Just last spring, Dakota activists protested the Walker Art Center’s decision to display “Scaffold,” a sculpture resembling the gallows on which 38 Dakota men were hanged following the U.S.-Dakota War of 1862.

The “Wall of Forgotten Natives” highlights both the settler colonial practices that make such a homeless encampment possible and the ways American Indians have continually resisted settler colonial ideas and actions.


The authors respectfully acknowledge that the University of Minnesota stands on Dakota and Ojibwe peoples’ traditional lands.  

Photo by Becky Stern, Flickr CC

The newest Apple Watch can now warn users when it detects an abnormal heartbeat. While Apple may be on the cutting edge, many people have been using apps to track their food intake and exercise for some time. Social science research demonstrates that health-tracking technology reflects larger social forces and institutions.

These health-tracking apps are part of a larger trend in American medicine that researchers call “biomedicalization,” which involves a greater focus on health (as opposed to illness), risk management, surveillance, and a variety of technological advances.
These apps can empower patients by reducing their reliance on doctors for knowledge about their own bodies, which — as many of the apps advertise — may save time and money by allowing users to avoid some doctor visits. However, self-tracking may put more onus on individuals to maintain their health on their own, leading to blame for those who do not take advantage of this technology. Further, using this technology could strain doctor-patient relationships if doctors believe patients are undermining their authority.

As more and more Americans use smartphones, so grows the promise of digital technology, including health-tracking apps, for reducing existing health disparities. However, the Pew Research Center shows that large income and educational gaps still exist in smartphone use, meaning the health benefits of such technology — as well as its potential downfalls — may be a long way off for much of the population.

Photo by Eva Cristescu, Flickr CC

Unaccompanied minors have been migrating to the United States from Central America for decades, but media coverage of this harrowing journey rarely focuses on the reasons behind migration. Though violence and economic peril tend to drive these migration patterns, the journey from Central America is itself dangerous, and the backgrounds of child migrants are not always well understood. Fortunately, sociological research on migration can provide context on the difficulty of the decisions and experiences involved in migration.

Structural conditions like violence and poverty do not, on their own, predict unaccompanied child migration. Recent analysis finds that children whose parents migrate are far more likely to follow than children whose parents did not; indeed, unaccompanied migrant children are most likely to migrate the same year as their parents. Gender is also a factor: among children whose parents migrated, girls are less likely than boys to follow, and even less likely when the parents’ trips were unauthorized.
Some unaccompanied minors make this dangerous journey in order to flee life-threatening gang violence at home, which leads many children to claim asylum or special immigrant status upon arrival in the United States. Upon arrival, however, they may be placed in a detention center or deported back to their countries of origin. Further, to cover the costs of the journey to the United States, Central American families pool their money and take on debt — and this all may be for naught if their child is apprehended and deported.
San Quentin State Prison, California. Photo by telmo32, Flickr CC

This past August, the Incarcerated Workers Organizing Committee, a labor union for prisoners, began a nationwide strike to protest against inhumane conditions, the use of solitary confinement, and precarious work in U.S. prisons. Fighting prison conditions and labor precarity has been a long-standing struggle for prisoners in the United States and around the world, and social science research explains the dynamics underlying this struggle.

As “total institutions,” prisons provide only poor substitutes for, or limited access to, basic items, services, and comforts like hygiene products, clean clothing, nutritious food, education, and health care. Consequently, prisons end up depriving inmates of their wellbeing, autonomy, sense of self-worth, and control over their fates. In the United States, the indignities of prison conditions range from major maltreatments, such as the abuse of long-term solitary confinement, to minor cruelties in overcrowded facilities, such as restricting the use of showers and toilet paper (imagine being limited to just one roll per month).
The specific conditions of prison labor reflect long-standing contradictions. On the one hand, social science evidence suggests that providing jobs to inmates has positive effects and can reduce future criminal involvement. On the other hand, prison labor has also led to abuse and exploitation. Correctional facilities use prison labor to serve private industries, to perform cleaning and maintenance functions within facilities, and even to repair public water tanks and fight wildfires. Prison labor has also served as an instrument of economic policy in the labor market: in the 1990s, for example, unemployment rates declined as a massive number of able-bodied, working-age men went to U.S. prisons.

Imprisonment often has devastating consequences for inmates, their families, communities, and society at large. Even though certain policies like prison labor may involve potential benefits, their positive effects only occur when there is a genuine effort to achieve inmates’ social inclusion. Inmates’ struggles to achieve effective changes in their living conditions therefore require sustained and special attention from the public and policy makers.

Photo by Jeffrey, Flickr CC

This post was created in collaboration with the Minnesota Journalism Center

From FiveThirtyEight to the front page of the local paper, data journalism is on the rise at media outlets worldwide. As early as 2012, Columbia Journalism Review published reports featuring examples of local and regional outlets beginning to publish stories with graphs, charts, and visualizations online. In the case of New York City news media in particular, data, analytic, and platform-based positions now account for nine percent of all media jobs — marking considerable growth since 2010. Studies also show that, in today’s journalistic job market, entry-level journalists are often expected to have skills in data journalism, social media, and analytics in addition to traditional reporting and editing skills. Social science research shows how social forces contribute to this shift. 

Legacy media organizations including the Los Angeles Times and the Washington Post produce news at breakneck speed in a 24/7 news cycle, and are constantly innovating to find the most profitable and efficient methods to distribute news to the public. Not only are online media able to host the results and illustrations of large-scale data analyses, but some media outlets utilize computational and/or algorithmic processes and programs that automatically convert batches of data into news stories for publication.
The rise of data journalism is also self-reinforcing: As data becomes a central fixture in newsrooms worldwide, higher education institutions are developing programs to train journalists in data, analytics, and programming. Established data journalism programs at higher education institutions include Columbia Journalism School and the University of Missouri’s M.S. in Data Science & Analytics, while other institutions offer data journalism courses taught by part-time, adjunct, and/or visiting instructors who specialize in the field. Not surprisingly, social scientists are beginning to track and analyze these programs.
Photo by Mike Baird, Flickr CC

The recent treatment of superstar tennis player Serena Williams provides plenty of material for discussions of discrimination against women of color in sports and, more broadly, in public. Even before the most recent incident, in which she received technical violations for supposedly “aggressive” behavior toward the match’s umpire, Williams received a violation at the French Open for her black athletic catsuit, despite numerous instances of white players sporting similar wear at the French Open in prior years. Serena Williams’ experience is familiar not only to adult women of color, but also to girls of color. Social science research highlights how enduring patterns of policing and regulating racial minorities begin at an early age — often within educational institutions.

Schools have long served as sites for social control and discipline that hinder the educational attainment of minority youth. Girls of color experience a unique set of institutional discriminatory practices that are coded in both racialized and gendered controlling images. All too often, these images depict girls of color as overly aggressive, hypersexual, and too adult-like. One two-year ethnographic study showed that while some teachers appreciated Black girls’ assertiveness in the classroom, by and large, teachers and administrators discouraged Black girls from talking in “loud” or aggressive manners, especially when such behaviors threatened teachers’ authority. They attempted to steer Black girls toward more quiet and docile behavior in order to achieve status as young “ladies” — a status shrouded in ideals of white female innocence.
Dress code enforcement serves as one of the primary ways educational institutions police girls of color. In one study, girls were told, “Don’t come in here with no hoochie-mama dress all tight up on your butt!” Similar remarks demanded that girls “Close [their] legs” and act like ladies. At times, girls resisted these policies and the racialized and gendered stereotypes that emerged alongside them. Still, girls also participated in the regulation of their female peers’ clothing by degrading those who wore “sleazy clothes.” Such policies and practices reinforce educational institutions as sites that perpetually reproduce simultaneous race, gender, sexuality, and class inequalities.
NASA image by Jeff Schmaltz, Flickr CC

Originally Posted October 19, 2016.

The rain and wind of Hurricane Matthew may have stopped, but much of North Carolina is still under water. The hard work of repairing and rebuilding has begun across the southeastern U.S. and the Caribbean, particularly in Haiti, which is still reeling from the 2010 earthquake. Hurricanes — so-called “natural disasters” — are not simply the result of the weather; they become “disasters” because of how society shapes people’s risks and how people prepare, adapt, and respond.

Extreme weather events like hurricanes often become problems because of the ways society has changed the environment, such as locating cities in areas at risk of flooding, filling in wetlands for development, and building homes on eroding coastlines. Government policies are also major factors in where, why, and when an event becomes a human disaster because development policies have contributed to creating risky areas while response plans are often inadequate.
The risks and burdens of disasters are not evenly distributed. Communities with the least economic, social, and political power often face the greatest threats from natural disasters and are also the least able to prepare, evacuate, and rebuild. Socio-economic status affects where people live and the quality of their housing. Poor and working-class communities also tend to bear greater physical, emotional, and psychological impacts of displacement and have fewer resources and less government support to rebuild and recover after a catastrophe.
Economic inequality and race also contribute to different levels of vulnerability and resiliency between countries around the globe. Communities of color are more likely to be threatened by environmental disasters and less prepared to face them, while government evacuation and reconstruction programs for these communities tend to be limited. Researchers who studied Hurricane Katrina point to how experiences of the storm were shaped by institutional racism and how the effects exacerbated racial and class inequalities. For example, government aid was slower to reach African American communities, which also spent more time in shoddy temporary housing and had more trouble rebuilding their neighborhoods.
Women and children also bear a greater burden and risk from disasters because they tend to have fewer resources. Women typically shoulder more of the responsibility of caring for children and aging relatives, yet they have also been leaders in the recovery process after countless disasters, organizing their communities to rebuild and demanding a government response. Natural disasters have a large impact on children due to the trauma, displacement, and disruption of their lives. Research found that children’s ability to respond and adapt after Katrina was related to their family’s race and class, with more vulnerable children experiencing greater detrimental effects on their well-being after the storm.
Sign in a store that says “We Accept SNAP.” Photo by ajmexico, Flickr CC

Recently, Trump advisor Stephen Miller announced plans to bar documented immigrants from citizenship if they or their families have ever used social assistance programs such as food stamps or welfare. Such action reflects stereotypes about who uses social assistance — in the United States, people of color take the blame. Not only are these stereotypes often incorrect, they are also deeply rooted in a long history of race and racism in America.

It is important to understand that racial minorities and immigrants do not necessarily use more public resources than native-born whites. Racial minorities and immigrants do tend to have lower incomes and levels of education than native-born whites, but research shows that they do not excessively use social assistance programs when compared with other groups.
Americans’ attitudes towards welfare — particularly myths that certain groups overuse programs such as welfare and food stamps — are heavily rooted in politics of race and racism. In fact, several scholars have illustrated how political and ideological opposition to social spending is shaped by racial appeals. Even in the post-Civil Rights era, political figures use implicit messaging and coded language to attack social spending programs and their recipients, subtly implying that racial minorities overuse such programs and thus perpetuating racist narratives.
Miller’s plan to bar citizenship for immigrants who have used social spending programs must also be understood as a consequence of historical racism in the American welfare state. During the 19th and 20th centuries, white working-class immigrants from a variety of European countries accessed social spending programs, opportunities for home ownership, and union membership due to their racial privilege. On the other hand, Blacks and other non-white groups — including non-white immigrants — were denied the same opportunities. This heightened racial inequality while simultaneously validating racist beliefs about minorities and immigrants. In short, while Miller’s plan seems to primarily focus on immigration, it is most certainly also about race.