Photo by Mark Dixon, Flickr CC

Neo-Nazi swastikas, explicitly racist chants and slogans, and public demonstrations with hoods and torches, as seen recently in places like Charlottesville, are what signal white supremacy for many Americans. Yet, for over a decade, activists and policy makers have used the phrase “white supremacy” in different ways, moving beyond extremist ideologies and individuals’ bigoted beliefs to focus on the deep historical structure and institutional dimensions of racial inequality in social life. Perhaps not surprisingly, sociologists have been at the forefront of parsing out this broader usage and meaning of white supremacy.

Rather than focusing solely on explicit prejudice and organized hate groups, recent sociological uses of the term describe how the very nature of American society inherently privileges white people, white identities, and the status of whiteness. This includes how white people fare better economically, how they experience superior outcomes in areas such as education and health, and how these systemic inequalities operate through established institutional arrangements, cultural norms, and public policies. For scholars with this emphasis, America is a “white supremacist” nation not because individuals or the law are explicitly prejudiced, but because white privilege is central to American social life.
This is not to suggest that sociologists and other social scientists have neglected the study of extremist white groups like Neo-Nazis or the KKK. In fact, sociologists have continued to track how more traditional white supremacists have evolved alongside changing social and historical backdrops. These scholars have documented how white supremacist movements in the 21st century have been shaped by whites’ perceptions of victimhood following increased immigration, globalization, and diversity in America.

With all of these different strands of research and interpretations of white supremacy, it is imperative for all of us — activists and analysts alike, as well as everyone in between — to be thoughtful and cautious about how, when, and in what company we use the term “white supremacy.”

Photo by faungg’s photos, Flickr CC

As the fall final exam season creeps up, students are returning to their notes and, hopefully, recalling everything they learned this semester. But what kind of notes do they have, and will those notes be helpful? We wondered whether taking notes with pen and paper versus typing on a laptop makes a difference for students. Here’s what we found!

Technology isn’t going away in the classroom. School districts across the country are getting grants from governments and tech companies to expand their technology options, especially increasing access to technology for traditionally underserved populations and experimenting with new forms of content delivery. But researchers have looked into the negative effects of technology on learning, especially the multitude of potential distractions for students using laptops in class. They find that college students who use laptops in lectures are, on average, less engaged and less satisfied with their education, and they perform worse than other students.
In experimental studies, students who used laptops were more likely to write down exactly what was said, which involved less thinking and processing while taking notes. Students who took notes longhand were better prepared to answer conceptual questions about the content, even when laptop users, who took more extensive notes, were able to study those notes before the quiz.
On the other hand, researchers have argued that technological innovation and changes in classrooms may make notetaking an out-of-date skill altogether. This research focuses on inclusion and the potential for technology to assist students with physical or cognitive disabilities.
Photo by Mike Schmid, Flickr CC

Benefits for the wealthiest Americans in the Republicans’ proposed tax plan are causing alarm, especially because they risk widening an already large wealth gap. According to a recent analysis, the three richest Americans control more wealth than half of the United States’ population. Wealth is different from income because it takes into account assets like property, as well as debts, in addition to earnings, which means wealth inequality in the United States is much greater than income inequality. Social scientists demonstrate that the amount of wealth a person accumulates is associated with a variety of social advantages. And once someone has accumulated wealth, the benefits continue to build up over time and across generations.

Income matters for wealth accumulation, but that is not the only factor. Homeownership is particularly important, though age, wealth of parents, level of education, religion, race, and gender also influence the wealth a person acquires. For instance, unmarried women’s wealth on average is lower than men’s, and a significant gap exists between whites and Blacks in America — a gap that only gets wider in the top tax bracket.
Having wealth is an important indicator of future wealth and well-being. Parents’ wealth is often associated with greater well-being for their children, including higher educational and occupational attainment. Personal wealth also partially explains the gap in marriage rates between people with high and low education levels.
Los Angeles March for Immigrant Rights. Photo by Molly Adams, Flickr CC

The Trump administration recently discontinued an Obama administration policy known as Deferred Action for Childhood Arrivals (DACA). Established in 2012, DACA provided renewable protection from deportation and work authorization for undocumented immigrants who have lived in the United States since childhood. Conditions include proof of living in the United States since before age 16, criminal background checks, status as employed or a college student, and routine renewal, with a fee, every two years. Furthermore, DACA recipients are ineligible for federal welfare, student aid, and citizenship. Public figures, including pundit Ann Coulter and Attorney General Jeff Sessions, have expressed concerns about immigrants taking too many jobs, draining social support programs, and threatening American culture and ways of life. Research shows, however, that these individuals are not a threat to American culture and society; they are in fact a part of American culture and society.

Many DACA recipients are deeply enmeshed in American society, as virtually all have lived in the United States since childhood. Since DACA was established, beneficiaries have experienced better mental and physical health outcomes. Furthermore, the policy has offered people in precarious positions a way to become better incorporated into their society: nearly all are employed, speak English as a first language, and have family ties to the United States. Rescinding these protections could therefore adversely affect recipients’ well-being and livelihoods.
The Trump administration has given Congress time to either renew or repeal DACA. Even if DACA is continued, however, research has shown that state and municipal governments vary greatly in the support they provide to undocumented immigrants. So even if DACA survives at the federal level (which is by no means guaranteed), variation in state and local policies could lead to vastly different outcomes for people in different regions across the country.
Photo by Steve Snodgrass, Flickr CC

This past week, the Philadelphia Board of Pensions and Retirement voted to withdraw its investments in the for-profit prison industry. However, the prison industry depends on more than just investors to finance its operations. It also relies on resources from defendants, inmates, and their families. Social science research demonstrates the far-reaching consequences of the penal system’s money-leveraging strategies.

Federal and state criminal justice agencies and correctional institutions charge defendants and inmates for the costs of arrest, prosecution, conviction, incarceration, and supervision. For example, fees include the cost of electronic monitoring and registration for people convicted of sex offenses. In some states, defendants pay for their hearings (court fees). If found guilty, they also pay room-and-board fees while in prison (pay-to-stay fees).
As a form of punishment, judges impose monetary sanctions for misdemeanor and felony crimes alike. Monetary sanctions disproportionately disadvantage defendants from low-income communities through three different mechanisms: reducing family income, limiting their access to jobs or educational opportunities, and increasing the likelihood of ongoing criminal justice system involvement. These consequences challenge the assumption that monetary sanctions serve as a more favorable alternative to incarceration or supervision.
Correctional authorities outsource the operation and provision of services within correctional institutions to generate revenues for both public and private institutions. Contracts to run prison services – commissaries, telephone services, or online banking, for example – are based on commissions (what critics call “kickbacks”), which create incentives for corruption and disproportionate profit-making at the expense of inmates and their families, since companies can boost both their margins and their commissions by charging higher prices and fees.
WE ARE A NATION AND WE HAVE THE RIGHT TO DECIDE! Catalonian Independence Protest. Photo by Paco Rivière, Flickr CC

Recent events in Burma, the United States, and Spain have shown how appeals to nationalism can initiate or heighten violence. Nationalist ideologies, however, look quite different in each of these countries, and many countries with strong national identities do not experience these types of conflict at all. Sociological research helps explain how nationalism develops differently from one country to the next, and what consequences follow.

Nationalism is a particularly strong form of identification, as it can surpass personal connections and reinforce a shared bond among people throughout a nation. Social identity can help people define their place in the world, and nationalism can provide a positive way to do so. It can also be used to advocate for national-scale interests on a global level, promoting diverse perspectives in international institutions. This is especially true when a country was created through a more spontaneous process, where national identity develops simultaneously with the broader identity formation of groups already living in a particular area.
But the path to nationhood isn’t always so organic. Many nations were originally created through decades or centuries of violence and oppression. In other words, national identity works differently when it interacts with different kinds of state power. A majority of countries in the Global South began with ambiguously drawn borders created with the intent of domination. In such states, nationalism stems from (oftentimes violent) renegotiations of identity following foreign rule.
These different pathways to nationhood result in dramatically different forms of nationalism across the globe. Civic nationalism, for example, is based on citizenship as the root of belonging, while ethnic nationalism is grounded in ethnic identity. Ethnic nationalism tends to be more prominent in nations that have experienced more conflict over time. It can also be more exclusionary, with some studies finding lower tolerance for immigrants in more ethnically nationalist societies. These two forms can also blend together, as civic nationalism can carry quieter assumptions about ethnic belonging.
Shepard Fairey’s work on the streets of San Francisco. Photo by Michael Pittman, Flickr CC

Political observers anxiously await a final decision from the Supreme Court in the Wisconsin gerrymandering case, Gill v. Whitford. Gerrymandering occurs when legislators redraw voting districts in order to entrench their electoral dominance. This highly anticipated decision could stop gerrymandering practices and require courts around the country to search for bias in district maps. While voting is the cornerstone of democracy, social science research on gerrymandering suggests that democratic ideals may not match up with how voting works in practice.

Wisconsin redistricting plans ratified in 2011 gave Republicans an advantage over Democrats in translating votes into legislative seats. Computer simulations can diminish partisanship in district drawing, but it remains unclear how effective this would be in reducing political polarization in Congress. One study suggests that redistricting diminishes electoral competition but does not exacerbate polarization along party lines.
Political sociologists have shown that full voting rights are not as guaranteed in the United States as in many other major democracies, and gerrymandering is just one example of practices that lead to the under-representation of low-income voters and communities of color in the electoral process. In Chicago, for example, partisan gerrymandering reduced communication among ward residents, local nonprofits, and their political representatives. There is also evidence it changed voters’ choices in Georgia. In short, gerrymandering has real consequences for racial inequalities and representation in the United States.
Photo by Harold Navarro, Flickr CC

Immigration is a hot-button issue in American politics today. President Trump’s proposed border wall, rescinding of DACA, travel bans for multiple majority-Muslim countries, and increased detention and deportation have meant that the debate has focused almost exclusively on Hispanics and Muslims. This is the latest in a long history of misgivings towards immigrants that has obvious racial dimensions. It’s easy to forget that much anti-immigrant rhetoric is based on American attitudes about who is white, or who has the potential to become white. Social science research reminds us how certain groups who were once cast as racial outsiders eventually came to be seen as “white,” while others have been consistently denied white status and the full citizenship that comes with it.

The meaning of “white” has changed through the course of American history. From the 19th century into the early 20th century, “white” encompassed only Anglo-Saxon, Protestant Americans. American voters and policymakers were concerned that “non-white” immigrant groups such as the Irish, Poles, Jews, and Italians lacked the ability to assimilate into American society. Gradually, however, these immigrants became incorporated into the dominant racial category and were thus no longer considered outsiders.
This did not apply to all immigrant groups, however. Despite the historical flexibility of the category, whiteness never encompassed everybody. Courts, laws, and pseudoscience defined whiteness in ways that excluded some groups from full citizenship in America. Many immigrant communities—such as West Indians, Hispanics, and the Chinese—found themselves in racial categories that shaped their access to various socioeconomic opportunities, belonging, and citizenship.
Photo by stephalicious, Flickr CC

From sexual harassment to salary gaps, stories about gender inequality at work are all over the news. How does this happen? Social science research finds that people are often sorted into different jobs by gender, race, and class, and this sorting has consequences for inequality in earnings and career prestige. Just like a middle school dance where students congregate on opposite sides of the floor because of both self-sorting and social norms, gendered occupational segregation comes from a combination of choice and implicit discrimination based on workplace “fit.” Women often choose less prestigious occupations based on how they perceive their personalities and competence, and employers and colleagues tend to favor people like themselves when hiring, promoting, and collaborating.

When people choose their jobs, they often think about careers that match their personalities. Gender socialization and stereotypes about competence, personality traits, and innate abilities influence how women and men consider which jobs are right for them. Many women learn to perceive themselves as emotional, sympathetic, or people-oriented. They also tend to think they possess the right traits to work in female-dominated jobs like teaching and nursing. Women are more likely to think they will perform poorly at careers in science, technology, engineering, and math (STEM) because they have learned to think they are not “naturally” as good at these subjects as men are.
Outright gender-based discrimination in hiring and workplace practice is illegal, but it still occurs through implicit biases to the detriment of women. Employers often look for people who will blend well with their workplace’s culture, and this results in hiring candidates similar to themselves in terms of both gender and social class. Once hired, colleagues tend to collaborate and share resources with those they think are like them, often isolating women in male-dominated workplaces. As a result, many women leave highly paid, highly skilled positions in favor of less prestigious jobs with more women and friendlier environments.
Photo by Gage Skidmore, Flickr CC

Donald Trump recently became the first sitting president to address the Values Voter Summit in Washington, D.C., where he referenced “attacks” on Judeo-Christian values. But what does this “Judeo-Christian” buzzword really mean? Social science research shows that national identity is a style of political engagement that can change over time, and that these cultural changes have real stakes for the way Americans think about their fellow citizens. While the U.S. is becoming an increasingly racially and religiously diverse nation, this demographic change comes up against the persistent cultural assumption that Americans share a distinct Christian identity and heritage.

The meaning of “Judeo-Christian” has changed over time. Once associated with progressive political coalitions, it became a rallying cry for socially conservative positions in the “culture wars” of the 1980s and beyond. This case shows how nationalism is a cultural style composed of different beliefs and identities, meaning that political leaders and everyday citizens alike can draw on different styles of nationalism.
And these styles of nationalism have real stakes. An emerging trend in the public opinion literature shows that Christian nationalism in particular is a strong predictor of negative attitudes toward minority groups. For example, respondents who score high on this kind of nationalism are less likely to support interracial and same-sex marriage.