Compared to the less powerful, more powerful people feel more entitled to be treated fairly, are quicker to identify an instance in which they are mistreated, and are more likely to take action in response.

These are the findings of a new study by social psychologist Takuya Sawaoka and colleagues. They defined power as “disproportionate control over other individuals’ outcomes.” I imagine someone who is a boss, perhaps, or a police officer, a professor in a classroom, or the patriarch of a family, or even just people who are wealthy and can pretty much pay people to do anything they want.

The scholars review the literature showing that people with power feel entitled to a disproportionate share of resources and are more likely to cheat, steal, and lie. They hypothesized that this “individual variability in entitlement shapes people’s reactions to injustices that they experience” and designed a series of studies to test it.

In the first study, participants were primed to feel either powerful or powerless by being asked to write about and reflect on a situation in which they felt they had power over someone else or, alternatively, someone had power over them. They were then instructed to play a game with a partner (unknown to them, a confederate) who had ten tokens that they could divide up however they pleased. Participants who had been primed to feel low power expected to get less than half the tokens, but participants who had been primed to feel powerful expected a fair outcome.

They then tested individuals’ sensitivity to unfairness. They showed people primed to feel powerful and powerless distributions of tokens that looked like this, but with varying amounts, and asked them to indicate whether each distribution was fair or unfair.

[image: example token distributions shown to participants]

Their measure of sensitivity was how quickly the person identified the distribution as unfair. Their findings showed that, when they were the victim of unfairness (see the second pair of columns from the left), people feeling powerful were quicker to identify it as unfair (a lower bar = faster) than were people feeling powerless.

But, when they benefited from unfairness (see the pair of columns on the far right), people feeling powerful were slower to identify it as unfair than when they were the victims and slower than people who felt powerless.

[chart: speed of identifying distributions as unfair, by power condition and who benefited]

They had similar findings when people primed to feel powerful didn’t directly benefit, but simply observed other people being treated unfairly. And they tested whether their findings extended to interpersonal justice, too, by asking how people responded to being socially excluded. They found the same pattern.

Finally, they found that, when being treated unfairly, participants primed to feel powerful were quicker to take action than those primed to feel powerless. The two columns on the left below show that high-power participants quickly left a hypothetical employer for a different one if they were treated unfairly.

[chart: speed of leaving an unfair employer, by power condition]

So, to conclude, people who are primed to feel powerful feel entitled to fair treatment — both economically and socially — and are quick to recognize and correct it when they are treated unfairly, but they are significantly less likely to notice or care when the less powerful are injustice’s victims.

Lisa Wade is a professor at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. Find her on Twitter, Facebook, and Instagram.

Earlier this year a CBS commentator in a panel with Jay Smooth embarrassingly revealed that she thought he was white (Smooth’s father is black) and this week the internet learned that Rachel Dolezal was white all along (both parents identify as white). The CBS commentator’s mistake and Dolezal’s ability to pass both speak to the strange way we’ve socially constructed blackness in this country.

The truth is that African Americans are essentially all mixed race. From the beginning, enslaved and other Africans had close relationships with poor whites and white indentured servants; that’s one reason why so many black people have Irish last names. During slavery, sexual relationships between enslavers and the enslaved, occurring on a range of coercive levels, were routine. Children born to enslaved women from these encounters were identified as “black.” The one-drop rule — you are black if you have one drop of black blood — was an economic tool used to protect the institution of racialized slavery (by preserving the distinction between two increasingly indistinct racial groups) and enrich the individual enslaver (by producing another human being he could own). Those enslaved children grew up and had children with other enslaved people as well as with whites.

In addition, of course, voluntary relationships between free black people and white people were occurring all those years as well, and they have been happening ever since, both before and after such relationships became legal. And the descendants of those couplings have been having babies all these years, too.

We’re talking about 500 years of mixing between blacks, whites, Native Americans (who gave refuge to escaped slaves), and every other group in America. The continued assumption, then, that a black person is “black” and only “mixed race” if they claim the label reflects the ongoing power of the one-drop rule. It also explains why people with such dramatically varying phenotypes can all be considered black. Consider the image below, a collage of people interviewed and photographed for the (1)ne Drop project; Jay Smooth is the guy at the bottom left.

[image: collage of people photographed for the (1)ne Drop project]

My point is simply that of course Jay Smooth is sometimes mistaken for white and it should be no surprise to learn that it’s easy for a white person — even one with blond hair and green eyes — to pass as black (in fact, it’s a pastime). The racial category is a mixed race one and, more importantly, it’s more social than biological. Structural disadvantage, racism, and colorism are real. The rich cultural forms that people who identify as black have given to America are real. The loving communities people who identify as black create are real. But blackness isn’t, never was, and is now less than ever before.

Cross-posted at Pacific Standard.

Lisa Wade is a professor at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. Find her on Twitter, Facebook, and Instagram.

Singer-songwriter Hozier played “guess the man buns” on VH1, and Buzzfeed facetiously claimed they had “Scientific Proof That All Celebrity Men are Hotter with Man Buns.” Brad Pitt, Chris Hemsworth, and David Beckham have all sported the man bun. And no, I’m not talking about their glutes. Men are pulling their hair back behind their ears or on top of their heads and securing it into a well-manicured or, more often, fashionably disheveled knot. This hairstyle is everywhere now: in magazines, on designer runways, and on the red carpet. Even my neighborhood barista is sporting a fledgling bun, and The Huffington Post recently reported on the popular Man Buns of Disneyland Instagram account that documents how “man buns are taking over the planet.”


At first glance, the man bun seems a marker of progressive manhood. The bun, after all, is often associated with women—portrayed in the popular imagination via the stern librarian and graceful ballerina. In my forthcoming book, Styling Masculinity: Gender, Class, and Inequality in the Men’s Grooming Industry, however, I discuss how linguistic modifiers such as manlights (blonde highlights for men’s hair) reveal the gendered norm of a word. Buns are still implicitly feminine; it’s the man bun that is masculine. But in addition to reminding us that men, like women, are embodied subjects invested in the careful cultivation of their appearances, the man bun also reflects the process of cultural appropriation. To better understand this process, we have to ask: Who can pull off the man bun, and under what circumstances?

I spotted my first man bun in college. And it was not a blond-haired, blue-eyed, all-American guy rocking the look in an effort to appear effortlessly cool. This bun belonged to a young Sikh man who, on a largely white U.S. campus, received lingering stares for his hair, patka, and sometimes turban. His hair marked him as an ethnic and religious other. Sikhs often practice Kesh, letting their hair grow uncut in a tribute to the sacredness of God’s creation. He was marginalized on campus, and his appearance was seen by fellow classmates as the antithesis of sexy. In one particularly alarming case in 2007, a teenage boy in Queens was charged with a hate crime after he tore the turban off a young Sikh boy in order to forcibly shave his head.

A journalist for The New York Times claims that Brooklyn bartenders and Jared Leto “initially popularized” the man bun. It’s “stylish” and keeps men’s hair out of their faces when they are “changing Marconi light bulbs,” he says. In other words, it’s artsy and sported by hipsters. This proclamation ignores the fact that Japanese samurai long wore the topknot, or chonmage, which is still sported by sumo wrestlers.


Nobody is slapping sumo wrestlers on the cover of GQ magazine, though, and praising them for challenging gender stereotypes. And anyway, we know from research on men in hair salons and straight men who adopt a “gay” aesthetic that men’s careful coiffing does not necessarily undercut the gender binary. Rather, differences along the lines of class, race, ethnicity, and sexuality continue to distinguish the meaning of men’s practices, even if those practices appear to be the same. When a dominant group takes on the cultural elements of marginalized people and claims them as its own—making the man bun exalting for some and stigmatizing for others, for example—it becomes clear exactly who has power, and the harmful effects of cultural appropriation come into view.

Yes, the man bun can be fun to wear and even utilitarian, with men pulling their hair out of their faces to see better. And like the long hair of hippies in the 1960s and 1970s, the man bun has the potential to resist conservative values around what bodies should look like. But it is also important to consider that white Western men’s interest in the man bun comes from somewhere, and weaving a narrative about its novelty overlooks its long history among Asian men, its religious significance, and ultimately its ability to make high-status white men appear worldly and exotic. In the West, the man bun trend fetishizes the ethnic other at the same time as it can be used to further marginalize and objectify them. And so cultural privilege is involved in experiencing it as a symbol of cutting-edge masculinity.

Kristen Barber, PhD is a member of the faculty at Southern Illinois University. Her interests are in qualitative and feminist research and what gender-boundary crossing can teach us about the flexibility of gender, the mechanisms for reproducing gender hierarchies, and the potential for reorganization. She blogs at Feminist Reflections, where this post originally appeared.

Americans have a low opinion of Congress. Fewer than 10% of voters think that Congress is doing a good job. But their own representative . . . not so bad. A third of us think that our own rep deserves re-election (Rasmussen). Even that is low. Until recently, a majority of people approved of their own representative while disapproving of Congress in general. It’s been the same with crime. People feel safer in their own neighborhoods than elsewhere, even when those other neighborhoods have less crime.

Race relations too are bad . . . elsewhere. In the last year, the percentage of Americans saying that race relations in the country are “bad” roughly doubled, from about 30% to 60%. That’s understandable given the media coverage of Ferguson and other conflicts centered on race. But people take a far more sanguine view of things in their own community. Eighty percent rate local race relations as “good,” and that number has remained unchanged throughout this century. (See this post from last summer.)

Not surprising, then, that the problem with marriage in the US turns out to be about other people’s marriages. A recent survey asked people about the direction of their own marriage and of marriage in the US generally:

[chart: perceived direction of own marriage vs. marriage in the US]
Only a handful of people (5%) see marriage generally as getting stronger. More than eight times as many say that their own marriage has strengthened. The results for “weaker” are just the reverse: only 6% say that their own marriage has weakened, but 43% see marriage in the US as losing ground.

Why the “elsewhere effect”? One suspect is the media’s bias toward trouble. Good news is no news. News editors don’t give us many stories about good race relations, or about the 25-year drop in crime, or about the decrease in divorce. Instead, we get crime and conflict and a variety of other problems. Add to this the perpetual political campaign, with opposition candidates tirelessly telling us what’s wrong. Given this balance of information, we can easily picture the larger society as a world in decline, a perilous world so different from the one we walk through every day.

At first glance, seeing our own relationships as good and others’ as more strained seems like the opposite of the pluralistic ignorance found on college campuses. There, students often believe that things are better elsewhere, or at least better for other students. They think that most other students are having more sex, partying more heartily, and generally having a better time than they are themselves. But whether we see others as having more fun or more problems, the cause of the discrepancy is the same – the information we have. We know our own lives first hand. We know about those generalized others mostly from the stories we hear. And the people – whether news editors or students on campus – select the stories that are interesting, not those that are typical.

Originally posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

Flashback Friday.

A study by Dr. Ruchi Gupta and colleagues mapped rates of asthma among children in Chicago, revealing that they are closely correlated with race and income. The overall U.S. rate of childhood asthma is about 10%, but evidence indicates that asthma is very unevenly distributed. The researchers’ visuals show huge variations in the rates of childhood asthma among different neighborhoods:

[map: childhood asthma rates across Chicago neighborhoods]

The researchers looked at how the racial/ethnic composition of neighborhoods is associated with childhood asthma. They defined a neighborhood’s racial make-up by looking at those that were over 67% White, Black, or Hispanic. This graph shows the percent of such neighborhoods that fall into three categories of rates of asthma: low (less than 10% of children have asthma), medium (10-20% of children have it), and high (over 20% of kids are affected). While 95% of White neighborhoods have low or medium rates, 56% of Hispanic neighborhoods have medium or high rates. The really striking finding, however, is for Black neighborhoods: 94% have medium or high prevalence. And the racial clustering is even more pronounced if we look only at the high category, where only a tiny proportion (6%) of White neighborhoods fall but nearly half of Black ones do, a near mirror image of what we see for the low category:

[graph: asthma-rate categories by neighborhood racial composition]

Rates of asthma and racial/ethnic composition (the color of the circles) mapped onto Chicago neighborhoods (background color represents prevalence of asthma):

[map: asthma prevalence and racial/ethnic composition of Chicago neighborhoods]

Asthma rates don’t seem to be highly clustered by education, but are highly correlated with overall neighborhood incomes:

[graph: asthma rates by neighborhood income]

It’s hard to know exactly what causes higher rates of asthma in Black and Hispanic neighborhoods than in White ones. It could be differences in access to medical care. The researchers found that asthma rates are also higher in neighborhoods that have high rates of violence. Perhaps stress from living in neighborhoods with a lot of violence is leading to more asthma. The authors of the study suggest that parents might keep their children inside more to protect them from violence, leading to more exposure to second-hand smoke and other indoor pollutants (off-gassing from certain types of paints or construction materials, for instance).

Other studies suggest that poorer neighborhoods have worse outdoor environmental conditions, particularly exposure to industries that release toxic air pollutants or store toxic waste, which increase the risk of asthma. Having a parent with asthma increases a child’s chances of having it as well, though the nature of that connection is equally unclear: is there a genetic factor, or does it simply indicate that parents and children are likely to grow up in neighborhoods with similar conditions?

Regardless, it’s clear that some communities — often those with the fewest resources to deal with it — are bearing the brunt of whatever conditions cause childhood asthma.

Originally posted in 2010.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

I recently moved to a neighborhood that people routinely describe as “bad.” It’s my first time living in such a place. I’ve lived in working class neighborhoods, but never poor ones. I’ve been lucky.

This neighborhood — one, to be clear, that I had the privilege to choose to live in — is genuinely dangerous. There have been 42 shootings within one mile of my house in the last year. Often in broad daylight. Once the murderers fled down my street, careening by my front door in an SUV. One week there were six rapes by strangers — in the street and after home invasions — in seven days. People are robbed, which makes sense to me because people have to eat, but with a level of violence that I find confusing. An 11-year-old was recently arrested for pulling a gun on someone. A man was beaten until he was a quadriplegic. One day 16 people were shot in a park nearby after a parade.

I’ve lived here for a short time and — being white, middle-aged, middle class, and female — I am on the margins of the violence in my streets, and yet I have never been so constantly and excruciatingly aware of my mortality. I feel less of a hold on life itself. It feels so much more fragile, like it could be taken away from me at any time. I am acutely aware that my skin is but paper, my bones brittle, my skull just a shell ripe for bashing. I imagine a bullet shearing through me like I am nothing. That robustness that life used to have, the feeling that it is resilient and that I can count on it to be there for me, that feeling is going away.

So, when I saw the results of a new study showing that only 50% of African American teenagers believe that they will reach 35 years of age, I understood better than I have understood before. Just a tiny — a teeny, teeny, tiny — bit better.

[chart: percentage of teenagers who expect to die before age 35, by group]

I have heard this idea before. A friend who grew up the child of Mexican immigrants in a sketchy urban neighborhood told me that he, as a teenager, didn’t believe he’d make it to 18. I nodded my head and thought “wow,” but I did not understand even a little bit. He would fall between the first and second columns from the right: 54% of second-generation Mexican immigrants expect that they may very well die before 35. I understand him now a tiny — a teeny, teeny tiny — bit better.

Sociologists Tara Warner and Raymond Swisher, the authors of the study, make clear that the consequences of this fatalism are far-reaching. If children do not believe that they will live to see another day, what motivation can there possibly be for investing in the future, for caring for their bodies, for avoiding harmful habits or dangerous activities? Why study? Why bother to see a doctor? Why not do drugs? Why avoid breaking the law?

Why wouldn’t a person put their future at risk — indeed, their very life — if they do not believe in that future, that life, at all?

If we really want to improve the lives of the most vulnerable people in our country, we cannot allow them to live in neighborhoods where desperation is so high that people turn to violence. Dangerous environments breed fatalism, rationally so. And once our children have given up on their own futures, no teachers’ encouragement, no promise that things will get better if they are good, no “up by your bootstraps” rhetoric will make a difference. They think they’re going to be dead, literally.

We need to boost these families with generous economic help, real opportunities, and investment in neighborhood infrastructure and schools. I think we don’t because the people with the power to do so don’t understand — even a teeny, teeny tiny bit — what it feels like to grow up thinking you’ll never grow up. Until they do, and until we decide that this is a form of cruelty that we cannot tolerate, I am sad to say that I feel pretty fatalistic about these children’s futures, too.

Re-posted at Pacific Standard.

Lisa Wade is a professor at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. Find her on Twitter, Facebook, and Instagram.

This November, a wave of student activism drew attention to the problem of racism at colleges and universities in the US.  Sparked by protests at the University of Missouri, nicknamed Mizzou, we saw actions at dozens of colleges. It was a spectacular show of strength and solidarity and activists have won many concessions, including new funding, resignations, and promises to rename buildings.

Activists’ grievances are structural — aimed at how colleges are organized and who is in charge, what colleges teach and who does the teaching, and what values are centered and where they come from — but they are also interpersonal. Student activists of color talked about being subjected to overtly racist behavior and being on the receiving end of microaggressions, seemingly innocuous comments that remind them that they do not, as a Claremont McKenna dean so poorly put it, “fit the mold.” That dean lost her job after that comment. Many student activists seem to embrace the policing of offensive speech, both the hateful and the ignorant kind.

Negative reactions to this activism were immediate and widespread. Many of them served only to affirm the students’ claims: that we are still a racist society and that we, at best, tolerate our young people of color only if they stay “in their place.” Other reactions reflected confusion about the kind of world these young people seemed to want to live in. Why, some people asked, would anyone — especially a member of a marginalized population — want to shut down free speech?

Well, it may be that the American love of free speech is waning. The Pew Research Center released data measuring attitudes about censorship. They asked Americans whether they thought the government should be able to prevent people from saying things that are “offensive to minorities.” Millennials — that is, today’s college students — are significantly more likely than any other generation to say that it should.

In fact, the data show a steady decrease in the proportion of Americans who are eager to defend speech that is offensive to minorities. Only 12% of the Silent generation is in favor of censorship, compared to 24% of the Baby Boomers, 27% of Gen X, and 40% of Millennials. Notably, women, Democrats, and non-whites are all more likely than their counterparts to be willing to tolerate government control of speech.

[chart: support for government censorship of speech offensive to minorities, by generation]

Americans still stand out among their international peers. Among European Union countries, 49% of citizens are in favor of censorship, compared to 28% of Americans. If the Millennials have anything to say about it, though, that might be changing. Assuming this is a cohort effect and not an age effect (that is, assuming they won’t change their minds as they age), and given the demographic changes this country will see in the next few decades, we may very soon look more like Europe on this issue than we do now.

Re-posted at Pacific Standard.

Lisa Wade is a professor at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. Find her on Twitter, Facebook, and Instagram.

We often think that as long as a white person doesn’t fly the Confederate flag, use the n-word, or show up to a white supremacist rally, they aren’t racist. However, researchers at Harvard and the Ohio State University, among others, have shown that even whites who don’t endorse racist beliefs tend to be biased against non-whites. This bias, though, is implicit: it is subconscious, activated in judgments we make faster than our conscious mind can control.

You can test your own implicit biases here. Millions of people have.

But where do these negative subconscious attitudes come from? And when do they start?

The Kirwan Institute for the Study of Race and Ethnicity has found that we learn them early and often from the mass media. As an example, consider this seemingly harmless digital billboard for Hiperos, a company that works to protect clients against risk online. The ad implies that, as a business, you need to be leery of working with third parties. Of particular risk is exposure to bribery or corruption. Whom can you trust? Who are the people you should be afraid of? Who might be corrupt?

I took a photo of each of the ads as they cycled through. Turns out, the company portrays people you should be worried about as mostly non-white or not-quite-white.

[images: rotating Hiperos billboard ads depicting “risky” third parties]

Who is untrustworthy? Those who seem exotic: brown people, black people, Asian people, Latinos, Italian “mobsters,” foreigners.

There were comparatively few non-Hispanic whites represented.

Of course, this company’s advertising alone could not powerfully influence whom we consider suspicious, but stuff like this — combined with thousands of other images in the news, movies, and television shows — sinks into our subconscious, teaching us implicitly to fear some kinds of people and not others.

For more, see the original post on sociologytoolbox.com.

Todd Beer, PhD is an Assistant Professor at Lake Forest College, a liberal arts college north of Chicago. His blog, SOCIOLOGYtoolbox, is a collection of tools and resources to help instructors teach sociology and build an active sociological imagination.