
Photo of a march for Dia da Consciência Negra in São Paulo. Photo by Central dos Trabalhadores e Trabalhadoras do Brasil (CTB), Flickr CC

Historically, Brazil presented itself as a “racial democracy” where interaction between racial groups formed a utopian, raceless society. In recent decades, Brazil has come to acknowledge its underlying racism and resulting disparities, leading to the 2001 enactment of race-based affirmative action programs. Social scientific research can help us better understand the functions and necessity of these programs, which are likely to come under threat following the election of right-wing candidate Jair Bolsonaro, who has been openly critical of Black and LGBT communities and the policies that serve them.

In Brazil, skin color has been the defining mechanism for racial categorization and identity within a black-white binary. Some official methods, however, including the Brazilian Census, recognize multiracial identity and a number of racial categories. Multiple methods of classification can be tricky, and this can obscure deeper and more basic racial inequalities. Though pardos (a mixed-race category used in the census) can face social situations similar to those of dark-skinned Blacks (sometimes referred to as pretos), recent research finds that pardos actually experience less disadvantage than pretos, complicating decisions about who receives help from affirmative action programs.
Given this complexity, methods of determining who is Black and thus a possible beneficiary of the racial quota program have varied. Some universities have required applicants to be of African descent to qualify for racial quotas, which has caused complications since many Brazilians could claim African ancestry even though they may have light skin or are not seen as “Black.” In other cases, verification committees at different universities confirm whether an applicant should be considered Black and a beneficiary of the racial quota system. The quota program has now broadened significantly to include applicants from low-income families as well as Indigenous peoples. Independent of skin color or racial group, support for affirmative action programs and race-targeted public policies is strong. Research suggests, however, that the more education someone has, the less likely they are to support racial quotas. It is important to consider what factors affect support for and execution of these policies as opponents such as President-elect Bolsonaro attempt to dismantle them.


Originally posted November 22, 2016.

For many Americans, this weekend is the time for food, football, and family we don’t see often. Given the heightened tensions surrounding the presidential election, social media is teeming with advice on how to constructively engage with friends and family who have different political views. One strategy is avoidance, wine, and crying, but thinking about what family meals mean and actually engaging in constructive conversations about political issues may be more fruitful.

We often think of Thanksgiving as a time to have a family meal together and strengthen family bonds. But research shows that family dinner does not actually increase well-being in and of itself – it only works if the meal-time discussion is used to actually engage with those at the table and learn about their day-to-day lives. In other words, “polite” conversation may not be the best way to bring everyone together.
We know that people avoid talking politics because they want to seem polite and avoid conflict. But this does not necessarily mean they don’t have political views. In fact, being “not political” is a cultural performance that people enact in different styles. It takes work to not be political, and those strategies can be overcome without necessarily causing conflict. In fact, a recent study found that having a 10-minute canvassing conversation about trans-related issues was associated with reduced prejudice, at least in the short term.
For those of us who are academics, it is important to remember that engaging in these discussions does not mean spouting off your best summary of Gramsci’s theory of hegemony or Bonilla-Silva’s take on color-blind racism. We need to do as much, if not more, listening than we do talking, because listening to how others are thinking about and responding to the current political climate can help all of us better understand our shared situation. And if and when we do bring up social science theories and research, we should do it in a way that is approachable, not pedantic. As bell hooks argues, “Any theory that cannot be shared in everyday conversation cannot be used to educate the public.”
That’s not to say that academics cannot effectively draw on their experiences as teachers. There are many strategies we use in the classroom to teach things like race, gender, and class that can be useful outside of the classroom. Relying on personal examples and discussions about family histories instead of facts and figures is one example of how to do this. Focusing on experiences that you or your loved ones have had with racial discrimination, generational mobility, or gender role conflict can help them connect the social construction of race, class, and gender to concrete events and stories from their own lives.
Photo of a Seminole man holding his child at an American Indian Heritage Month celebration. Photo by Los Angeles District, Flickr CC

After years of debate, the Indian Child Welfare Act (ICWA) — which sets minimum requirements for caseworkers handling state child custody proceedings involving Native children — was recently ruled unconstitutional by a Texas federal judge. The judge argued that ICWA violates constitutional rights to Equal Protection because it “elevates a child’s race over their best interest” — despite the fact that Native children are actually citizens of federally recognized tribes. Social science research helps us understand the historical context that necessitated ICWA’s creation, particularly the problematic history of child removal from Native communities, shaped by racialized, gendered, and cultural ideas.

The ICWA was enacted in 1978, a time when Native children were being removed from their homes and placed in foster care in staggering numbers under the guise of protecting children. At that time, 25-35% of Native children were removed from their homes by state child welfare agencies or private adoption companies. And the majority (about 85%) of these children were placed outside of their families and communities, even when relatives were willing to take them. Today, despite the minimal protections offered by the ICWA, Minnesota places more Native children in foster care than any other state, with Native children making up 20% of children in the system.
The ICWA’s creation and implementation has not only been a response to child-removal through adoption, however. Even earlier, Native children were sent to government or Christian-run boarding schools where teachers forced children to abandon their distinct tribal cultures — they cut Native children’s hair, did not allow them to speak their native languages or participate in cultural practices, and enforced strict discipline through corporal punishment. The boarding school era prevented generations of Native people from learning (and passing on) parenting tools. This separation of families, along with the disruption to Native cultural and spiritual practices, has been linked to symptoms similar to post-traumatic stress disorder, increased exposure to physical violence, and substance abuse in Native communities.
The removal of Native children is also rooted in deep-set racialized, gendered, and cultural notions of family, specifically the white, middle-class ideal of the nuclear family, characterized by two married parents and their children. Non-Native supporters of these adoption practices often relied on stereotypes of Native women as sexualized, unmarried, and thus unfit, which pathologized Native families as neglectful. They also argued that each child’s best interests should be considered on an individual basis, rather than acknowledging what tribes see as the importance of culture and identity, tribal rights, and belonging. In other words, supporters of Native adoption saw “disadvantaged” Native children who needed to be “rescued” by individual acts of goodwill (from white, middle-class Americans).

So what will legal reconsideration of the Indian Child Welfare Act bring? Many tribes fear that the Texas ruling sets a dangerous precedent that could dismantle the federal laws put in place to correct historical injustices like the boarding school system. Other tribal leaders see the ruling as an attempt to destroy their right to political and cultural survival through their children, while simultaneously compromising efforts to heal from the wrongdoings inflicted upon tribal communities. In the context of the current political division over the treatment of immigrant children separated from their parents at the U.S. border, such concerns warrant serious attention.

Photo of a protester holding a sign that says, “we are all immigrants.” Photo by Alisdare Hickson, Flickr CC

Politicians, pundits, and critics in Germany, England, and the Netherlands have recently advocated for harsher restrictions on migrants’ access to social assistance in their countries. This has led scholars to evaluate whether increased immigration is eroding historically strong support for welfare in Europe.

Earlier in the decade, the answer seemed clear. Drawing on basic public opinion data from various European countries, scholars found that rising immigration levels preceded a spike in favorability for restrictive welfare laws. More recent and sophisticated analyses, however, suggest that a rise in restrictive welfare attitudes is not directly connected to increasing immigration. Rather, this relationship appears to be better explained by a combination of factors such as national economic conditions, political ideology, individuals’ self-interest, and prejudice towards racial and ethnic minorities.

This work shows that social attitudes about welfare are complex and linked to a variety of factors. Though critics of immigration in Europe have been vocal, it remains unclear exactly whether and how attitudes about immigration and migrants relate to beliefs about welfare.

Photo by Tom Lee, Flickr CC

Originally posted October 18, 2017.

If you like Halloween, you know that witches are a popular costume choice and decoration this time of year. But the history of witches involves much more than bubbling cauldrons and flying broomsticks. Social science shows us that witchcraft has a long history of empowering marginalized groups, like women and sexual minorities, who question more traditional religious practices.

While popular images of witches often focus on magic spells, brooms, and pointed hats, witchcraft and other forms of neo-paganism have historically been used by women to push back against male-dominated religions. More traditional, hierarchical religions like Christianity and Islam often place women in a subordinate role to men, and research finds that many women are drawn to witchcraft and other alternative spiritualities because they emphasize female empowerment, embodied rituals, and sexual freedom.
People who practice witchcraft and neo-paganism typically see sexuality and gender as key sites for social transformation and personal healing, pushing back against the Christian idea that sex and bodies are sinful. Since neo-paganism values sexual freedom and sexual diversity, LGBTQ folks and people practicing polyamory often feel a sense of belonging that they don’t find in other religious spaces.
This has also been true for young adults. In general, young adults practice religion and spirituality differently than do older generations. For example, millennials are the least likely to participate in traditional religious institutions or identify with one single religious belief system, but many still desire some combination of spirituality and community. The increase in portrayals of witchcraft and other neo-pagan religions in popular media has exposed younger generations to these communities, and research finds that teens are more often drawn to these alternative spiritual practices as a means of self-discovery and community, rather than the promise of magical powers.
Photo of a graffitied Star of David in the Palestinian city of Hebron. Photo by JD Lasica, Flickr CC

The Palestinian-Israeli conflict is a recurrent and divisive topic in international headlines. In the last year, the Trump administration has positioned itself as a firm supporter of Israel by recognizing Jerusalem as its capital and cutting aid to Palestinian territories. These moves have drawn praise and criticism in the United States and abroad, with commentators speculating what this might mean for an eventual peace treaty. Sociology can help explain why this conflict has persisted for so long and how a breakthrough might finally be achieved.

When violence is directed toward a group rather than an individual, suffering gains a social dimension. An identity of victimhood is constructed. Not all tragic pasts lead to this label; rather, groups that experienced violence and groups in power negotiate the use of the term “victim.” This identity is particularly strong among Palestinians and Israelis, with both groups highly conscious of the historical and contemporary harm they have suffered from the other. Collective victimhood functions to maintain a group’s moral self-image and to bolster in-group solidarity. It can also justify violence against the out-group motivated by “self-defense.”
Victim status is projected to third-parties to win sympathy and support. Resources for humanitarian aid are limited and are generally distributed to groups that the international community considers to be the primary or sole victim of a conflict, especially in recent decades. Accordingly, groups fight for control of the victim identity. This process, known as competitive victimhood, involves each side claiming that it has suffered more unjust violence than the other group. Competitive victimhood perpetuates the victim identity, which can make reconciliation more difficult.
Recognizing the negative impact of competitive victimhood can suggest a path toward peace. Adopting a common victimhood identity reduces competitive victimhood and increases willingness to forgive. One experiment showed that Israelis express less support for aggressive policies against the Palestinians if they read a narrative emphasizing suffering on both sides. The effectiveness of individual interventions is suggestive, but large-scale reconciliation requires the social construction of a common victimhood identity in public discourse.

Competitive victimhood results from the black-and-white categories used to distribute blame and sympathy in inter-group conflicts. Moving beyond this dichotomy may improve the odds of eventually securing peace between Israelis and Palestinians, as well as solving other intractable disputes.

Photo of a protester holding a sign that says, “Stop deportation! I need my daddy,” with a picture of a child. Photo by Fibonacci Blue, Flickr CC

This post was created in collaboration with the Minnesota Journalism Center.

“A Way Out of the Immigration Crisis.” “Like it or Not, Immigrant Children are Our Future.” “Trump Talks Tough on Immigration in Nevada but it Could Backfire.”

All three of these news headlines were published by mainstream news outlets in the United States in September 2018. All three portray immigration in a negative light. While much news and media coverage across the globe portrays immigration negatively, researchers who study this area also identify media outlets that include more positive coverage. 

One thing scholars have found is that immigration coverage has increased rapidly since 2004. In U.S. coverage, there is a heavy focus on “immigration reform” and “tougher border control” as solutions to the immigration “problem.” Across the pond in the United Kingdom, polls have ranked immigration as one of citizens’ top concerns, with coverage of immigration being overwhelmingly negative and centered on conflict, relying on crime frames and discussion of immigrants as “illegal” or “failed.”
But not all publications discuss immigration in the same way. For example, scholars have found that African-American media outlets publish immigration stories with racial frames that depict immigrants as allies, while other major mainstream media outlets frequently utilize crime frames that discuss the legal status of immigrants.
Research also suggests that conflict-oriented news coverage of immigration contributes to polarized public opinion about immigration policies in digital spaces, including the comment sections of mainstream news sites. Conversely, social and mobile media in Australia have been used as a platform for asylum-seekers in detention to share their experiences with the public directly. Journalists and citizens alike have collaborated with detained asylum-seekers to create journalistic narratives that raise awareness of human rights violations unfolding in offshore detainment centers. Though not embraced by all consumers, such stories about immigration appear to cultivate expressions of empathy from audiences.
Photo by PictureCapital, Flickr CC

People invent words and definitions to help navigate the world around them. Once created, such labels can have monumental impacts. The word “genocide” is one example of a term that holds meaning for victims, perpetrators, and those who watch violence unfold. Over the past few years, debates have raged as to whether or not to call the Burmese state’s violence against the Rohingya “genocide.” Such debates often privilege the label, rather than focusing on the everyday violence experienced by civilians. Sociologists seek to understand the meaning, use, and consequences of labels like genocide.

Individuals and groups use institutions to construct consensus about labels. But people use terms in different ways, and labels often change in meaning over time. The term “genocide” was first coined by a lawyer, Raphael Lemkin, in the aftermath of the Holocaust. The United Nations adopted this term and formalized genocide as a crime in 1951, but the meaning of genocide continues to be contested. Some academics, for example, advocate for the inclusion of political groups as targets for genocide in addition to collectivities that are already included, like ethnic or religious groups.
Debates about labels have real effects. In the case of genocide, such implications are most directly felt by populations affected by violence. Victims can feel that their loss is recognized and mourned when appropriate labels are used, while an insufficient label may promote impunity for past crimes. As some theorists argue that an acknowledgement of past wrongdoing is central to healing, the use of fitting labels takes on an even more practical importance.
Despite these important considerations, some scholars and activists express concern that focusing on labels does more harm than good. The label of “genocide,” though critically important to survivors and advocates, does not come with legal obligations to intervene. While policymakers and activists discuss the relevance of the term “genocide” in Burma, atrocity crimes continue to unfold. As such, some scholars argue that the social importance of labels can distract from the immediate needs of victims of violence. From this vantage point, scholarly and advocacy attention is best directed towards serving the needs of those impacted by violence.
Photo of Indigenous Women, some holding children, outside of a Church in Chiapas, Mexico. Photo by Adam Jones, Flickr CC

More and more Americans have begun observing Indigenous Peoples Day, at least in part to push back against national narratives of “discovery” associated with Christopher Columbus and his commemoration. While this is a relatively recent development in the United States, other nations of the Americas have officially acknowledged the importance of their Indigenous heritage for much longer. For example, in Mexico, Día de la Raza or “The Day of the Race” was officially recognized back in 1928 and was part of a larger national project emphasizing that all Mexicans share a history of racial and cultural mixing — known as mestizaje — since the coming of the Spanish. Sociological research highlights how this recognition of Indigenous people as integral to the formation of the nation has actually had mixed consequences for Indigenous peoples and cultures of Mexico.

The notion of mestizaje emphasized that all Mexicans were fundamentally “mixed” individuals, or “mestizos.” It was adopted by the State in an effort to promote inclusion and national cohesion across racial lines — including Indigenous peoples — and even resulted in the removal of racial categories from the census (after 1921). In this spirit, the Mexican government sought to “improve” Indigenous individuals through education, social integration, and economic development, assimilating them into the newly defined mestizo nation. While some benefited, others lost their language and cultural identity, and many, especially those with darker skin, faced further marginalization and found themselves pushed underground.
Due to internal and external political pressures in the 1990s, the Mexican government abandoned its assimilationist policies and began instead to protect and promote the languages and cultures of Indigenous peoples. These shifts appear to have contributed to greater ethnic pride and greater likelihood to self-identify as Indigenous, especially for more educated, wealthier, or urban populations. However, skin color continues to carry economic and social weight in Mexico. Non-white Mexicans tend to have lower levels of education and employment and are more likely to live in poverty than their lighter-skinned peers.

So, while Mexico may still celebrate a day to acknowledge its mixed racial heritage, it is worth wondering if there might be other, better ways to recognize and address the challenges that actual Indigenous people in the country face on a day to day basis.

Photo by Becky Stern, Flickr CC

The newest Apple Watch can now warn users when it detects an abnormal heartbeat. While Apple may be on the cutting edge, many people have been using apps to track their food intake and exercise for some time. Social science research demonstrates that health-tracking technology reflects larger social forces and institutions.

These health-tracking apps are part of a larger trend in American medicine that researchers call “biomedicalization,” which includes a greater focus on health (as opposed to illness), risk management, and surveillance, along with a variety of technological advances.
Benefits of using these apps include empowering patients and reducing their reliance on doctors for knowledge about their own bodies, which — as many of the apps advertise — may save time and money by allowing users to avoid doctor visits. However, self-tracking may put more onus on individuals to maintain their health on their own, leading to blame for those who do not take advantage of this technology. Further, using this technology could strain doctor-patient relationships if doctors believe patients are undermining their authority.

As more and more Americans use smartphones, the promise of digital technology, including health-tracking apps, for reducing existing health disparities grows. However, the Pew Research Center shows that large income and educational gaps still exist in smartphone use, meaning the health benefits of using such technology — as well as its potential downsides — for the greater population may be a long way off.