In the U.S., we recognize two main party platforms: Republican and Democratic. Each party packages specific positions on economic and social issues together into ideologies we call conservative and liberal. The desire for small government, for example, is lumped with opposition to same-sex marriage, while belief in a larger role for government is lumped together with support for abortion rights.

Do all people neatly fit into these two packages? And, if not, what are the consequences for electoral outcomes?

In the American Journal of Sociology, Delia Baldassarri and Amir Goldberg use 20 years of data (1984–2004) from the National Election Studies to show that many Americans have consistent and logical political ideas that don’t align with either major party’s ideological package. These voters, whom the authors call alternatives, are socially liberal and economically conservative (or vice versa).

The images below show correlations between social and economic liberalism and between social and economic conservatism. Strong correlations are dark and weak ones are light. The top image shows the opinions of ideologues, those who adhere pretty closely to the existing liberal and conservative packages; the bottom image shows the opinions of alternatives.

[Heat maps: issue-opinion correlations among ideologues (top) and alternatives (bottom)]

In this data, being an alternative is not just about being unfocused or uncommitted. Baldassarri and Goldberg show that these voters’ positions are logical, reasoned, consistent, and steady over time. The study makes it clear that the ties between economic and social issues made by the left and the right, which many people see as normal or natural, represent just two among the many belief systems that Americans actually hold.

When it comes to the ballot box, though, alternatives usually vote Republican. The authors write that the most conservative among the alternatives’ views tend to hold sway when it comes to picking a party. It appears that the salience of moral issues is not the primary reason for Republicans’ electoral success. Instead, for as-yet unknown reasons, alternative voters follow their more conservative leanings at the ballot, whether economic or social.

Cross-posted at The Reading List.

Jack Delehanty is a graduate student in sociology at the University of Minnesota. His work is about how social movement organizations can reframe dominant social narratives about inequality. In his dissertation, he explores how white Protestant-influenced discourses of poverty, family, and individual choice are being critically reshaped in the public sphere today.

Chris Christie’s net worth (at least $4 million) is 50 times that of the average American. His household income of $700,000 (his wife works in the financial sector) is 13 times the national median.  But he doesn’t think he’s rich.
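The multiples above are easy to sanity-check. A minimal sketch, assuming rough mid-2010s baselines (the baseline figures below are my assumptions, not from the post):

```python
# Rough check of the "50 times" and "13 times" multiples above.
# Baseline figures are assumptions: a typical American household net worth
# of about $80,000 and a national median household income of about $54,000.
christie_net_worth = 4_000_000   # "at least $4 million"
christie_income = 700_000        # household income

typical_net_worth = 80_000       # assumed typical household net worth
median_income = 54_000           # assumed national median household income

print(round(christie_net_worth / typical_net_worth))  # 50
print(round(christie_income / median_income))         # 13
```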

I don’t consider myself a wealthy man. . . . and I don’t think most people think of me that way.

That’s what he told the Manchester Union-Leader on Monday when he was in New Hampshire running for president.

Of course, being out of touch with reality doesn’t automatically disqualify a politician from the Republican nomination, even at the presidential level, though misreading the perceptions of “most people” may be a liability.

But I think I know what Christie meant. He uses the term “wealth,” but what he probably has in mind is class.  He says, “Listen, wealth is defined in a whole bunch of different ways . . . ”  No, Chris. Wealth is measured one way – dollars. It’s social class that is defined in a whole bunch of different ways.

One of those ways is self-perception.

“If you were asked to use one of four names for your social class, which would you say you belong in: the lower class, the working class, the middle class, or the upper class?”

That question has been part of the General Social Survey since the start in 1972. It’s called “subjective social class.” It stands apart from any objective measures like income or education. If an impoverished person who never got beyond fifth grade says that he’s upper class, that’s what he is, at least on this variable. But he probably wouldn’t say that he’s upper class.

Neither would Chris Christie. But why not?

My guess is that he thinks of himself as “upper middle class,” and since that’s not one of the GSS choices, Christie would say “middle class.”  (Or he’d tell the GSS interviewer where he could stick his lousy survey. The governor prides himself on his blunt and insulting responses to ordinary people who disagree with him.)


This self-perception as middle class rather than upper can result from “relative deprivation,” a term suggesting that how you think about yourself depends on who you are comparing yourself with. So while most people would not see the governor as “deprived,” Christie himself travels in grander circles. As he says, “My wife and I . . . are not wealthy by current standards.” The question is “Which standards?” If the standards are those of the people whose private jets he flies on, the people he talks with in his pursuit of big campaign donations – the Koch brothers, Ken Langone (founder of Home Depot), Sheldon Adelson, Jerry Jones, hedge fund billionaires, et al. – if those are the people he had in mind when he said, “We don’t have nearly that much money,” he’s right. He’s closer in wealth to you and me and middle America than he is to them.

I also suspect that Christie is thinking of social class not so much as a matter of money as of values and lifestyle – one of  that bunch of ways to define class. To be middle class is to be one of those solid Americans – the people who, in Bill Clinton’s phrase, go to work and pay the bills and raise the kids. Christie can see himself as one of those people. Here’s a fuller version of the quote I excerpted above.

Listen, wealth is defined in a whole bunch of different ways and in the end Mary Pat and I have worked really hard, we have done well over the course of our lives, but, you know, we have four children to raise and a lot of things to do.

He and his wife go to work; if they didn’t, their income would drop considerably. They raise the kids, probably in conventional ways rather than sloughing that job off on nannies and boarding schools as upper-class parents might do. And they pay the bills. Maybe they even feel a slight pinch from those bills. The $100,000 they’re shelling out for two kids in private universities may be a quarter of their disposable income, maybe more. They are living their lives by the standards of “middle-class morality.” Their tastes too are probably in line with those of mainstream America. As with income, the difference between the Christies and the average American is one of degree rather than kind. They prefer the same things; they just have a pricier version. Seats at a football game, albeit in the skyboxes, but still drinking a Coors Light. It’s hard to picture the governor demanding a glass of Haut-Brion after a day of skiing on the slopes at Gstaad, chatting with (God forbid) Europeans.

Most sociological definitions of social class do not include values and lifestyle, relying on more easily measured variables like income, education, and occupation. But for many people, including the governor, morality and consumer preference may weigh heavily in perceptions and self-perceptions of social class.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

Sociologists are quite familiar with the combination of marginalized identities that can lead to oppression, inequalities, and “double disadvantages.” But can negative stereotypes actually have positive consequences?

Financial Juneteenth recently highlighted a study showing that gay black men may have better odds of landing a job and higher salaries than their straight, black, male colleagues. Led by sociologist David Pedulla, the study drew on resumes and a job description evaluated by 231 white individuals selected in a national probability sample. The experiment asked them to suggest starting salaries for the position and answer questions about the fictional prospective employee. To signal race and sexual orientation, resumes carried typically raced names (either “Brad Miller” or “Darnell Jackson”) and listed participation in a “Gay Student Advisory Council” half the time.

Pedulla found that straight black men were more likely to be perceived as threatening, measured with answers as to whether the respondent thought the applicant was likely to “break workplace rules,” make “female co-workers feel uncomfortable,” or “steal from the workplace.” In contrast, gay black men were considered by far the least threatening. Gay black men were also judged to be the most feminine, followed by gay white men.

Perhaps most surprisingly, the combination of being gay, black, and male attracted the highest salary suggestions. Gay black men were considered the most valuable employees overall. Straight white men were offered slightly lower salaries, and gay white men and straight black men were offered lower salaries still.

[Chart: suggested starting salaries by race and sexual orientation]

Pedulla’s findings have sparked a conversation among scholars and journalists about the complexity of stereotypes surrounding black masculinities and sexualities. Organizational behavior researcher and Huffington Post contributor Jon Fitzgerald Gates also weighed in on the findings, arguing that the effeminate stereotypes of homosexuality may be counteracting the traditional stereotypes of a dangerous and threatening black heterosexual masculinity.

Cross-posted at Citings and Sightings.

Caty Taborda is a graduate student in sociology at the University of Minnesota, where she’s on the Grad Editorial Board for The Society Pages. Her research concerns the intersection of gender, race, health, and the body. You can follow her on Twitter.

1,007,000 Americans working full-time earn the federal minimum wage of $7.25 per hour. All of that pay, to all of those people, for all of 2014 adds up to $14 billion. And that is less than half of what employees on Wall Street earned in bonuses alone.
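That $14 billion figure can be checked on the back of an envelope. A minimal sketch, assuming roughly 1,920 paid hours per year (48 weeks of 40-hour work; the hours figure is my assumption):

```python
# Back-of-envelope check of the $14 billion total above.
# The hours-per-year value is an assumption (about 48 weeks x 40 hours).
workers = 1_007_000      # full-time workers earning the federal minimum wage
wage = 7.25              # federal minimum wage, dollars per hour
hours_per_year = 1_920   # assumed paid hours per year

total = workers * wage * hours_per_year
print(f"${total / 1e9:.1f} billion")  # $14.0 billion
```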

This is your image of the week:

[Chart: total pay of full-time minimum wage workers vs. Wall Street bonuses]

Source: Institute for Policy Studies.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Flashback Friday.

Yesterday I went to Marshall’s and overheard a conversation between a teenager and her mother that perfectly illustrated what I was planning on posting about. The teen pulled her mom over to look at a purse she wanted for Christmas. It was $148, but she was making a case to her mom that it was actually a great buy compared to how much it would have been at the original price, which, as she pointed out to her mom, was listed as $368.

Ellen Ruppel Shell discusses this topic at length in Cheap: The High Cost of Discount Culture.

The price tags seem to indicate that you are getting a great deal by shopping at Marshall’s compared to the original price of the item.

Except that is not, in fact, what the tags are saying. The wording is “compare at…” The tags do not say “marked down from” or “original price” or “was.” There is a crucial difference: when a tag on a pair of $49 shoes tells you to “compare at $175,” the implication is that the shoes were originally $175, making them a super steal at $49. The “manufacturer’s suggested retail price” (MSRP) gives you the same impression.

But as Shell points out, these numbers are largely fictional. Marshall’s is not actually telling you that those shoes were ever sold for $175. You’re just supposed to “compare” $49 to $175. But $175 may be an entirely meaningless number. The shoes may never have been sold for $175 at any store; certainly no specifics are given. Even if they were, the fact that a large number of them ended up at Marshall’s would indicate that many customers didn’t consider $175 an acceptable price.

The same goes for the MSRP: it’s meaningless. Among other things, that’s not how pricing works these days for big retail outlets. The manufacturer doesn’t make a product and then tell the retailer how much they ought to charge for it. Retailers hold much more power than manufacturers; generally, they pressure suppliers to meet their price and to constantly lower costs, putting the burden on the suppliers to figure out how to do so (often by reducing wages). The idea that manufacturers are able to tell Macy’s or Target or other big retailers how much to charge for their items is ridiculous. Rather, the retailer usually tells the manufacturer what MSRP to print on the tags of items they’ll be purchasing (I saw some tags at Marshall’s that said MSRP but had no price printed on them).

So what’s the point of a MSRP on a price tag, or a “compare at” number? These numbers serve as “anchor” prices — that is, they set a high “starting” point for the product, so the “sale” price seems like a great deal in comparison. Except the “sale” price isn’t actually a discount at all — it’s only a sale price in comparison to this fictional original price that was developed for the sole purpose of making you think “Holy crap! I can get $175 shoes for just $49!”

The point is to redirect your thinking from “Do I think these shoes are worth $49?” to “I can save $126!” This is a powerful psychological motivator; marketing research shows that people are fairly easily swayed by perceived savings. A sweater we might not think is worth $40 if we saw it at Banana Republic suddenly becomes worth $50 if we see it at Marshall’s (or T.J. Maxx, an outlet mall, Ross, etc.) and are told it used to sell for $80. We focus not on the fact that we’re spending $50, but on the fact that we’re saving $30.

And that makes us feel smart: we’ve beat the system! Instead of going to the mall and paying $368 for that purse, we hunted through the discount retailer and found it for $148! We worked for it, and we were smart enough to not get conned into buying it at the inflated price. Shell describes research that shows that, in these situations, we feel like we didn’t just save that money, we actually earned it by going to the effort to search out deals. When we buy that $148 purse, we’re likely to leave feeling like we’re somehow $220 richer (since we didn’t pay $368) rather than $148 poorer. And we’ll value it more highly because we feel like we were smart to find it; that is, we’re likely to think a $148 purse bought on “sale” is cooler and better quality than we would the identical purse if we bought it at full price for $120.
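The reframing at work here can be sketched in a couple of lines (the function name is mine, purely illustrative):

```python
# A sketch of the anchor-price reframing described above: attention shifts
# from the money actually spent to the "savings" against an anchor price
# that may be entirely fictional.
def perceived_savings(anchor_price, sale_price):
    """What the shopper feels they 'earned' by finding the deal."""
    return anchor_price - sale_price

anchor = 368   # the "compare at" price on the purse
paid = 148     # what actually leaves the wallet

print(perceived_savings(anchor, paid))  # 220 -- feels like money earned
print(paid)                             # 148 -- the real cost
```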

And stores capitalize on these psychological tendencies by giving us cues that seem to indicate we’re getting an amazing deal. Sometimes we are. But often we’re being distracted with numbers that seem to give us meaningful information but are largely irrelevant, if not entirely fictional.

Originally posted in 2009.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

Every year the National Priorities Project helps Americans understand how the money they paid in federal taxes was spent. Here’s the data for 2014:

[Chart: where 2014 federal tax dollars went]

Since the 1940s, individual Americans have paid 40-50% of the federal government’s bills through taxes on income and investment. Another chunk (about a third today) is paid in the form of payroll taxes for things like Social Security and Medicare. This year, corporate taxes made up only about 11% of the federal government’s revenue; this is way down from a historic high of almost 40% in 1943.

[Chart: sources of federal revenue over time]

Visit the National Priorities Project here and find out where state tax dollars went, how each state benefits from federal tax dollars, and who gets the biggest tax breaks. Or fiddle around with how you would organize American priorities.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

The Numbers

Some History

The Winners and the Losers

Tax Cultures

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Within the last decade, the grain quinoa has emerged as an alleged “super food” in western dietary practices. Health food stores and upscale grocery chains have aisles dedicated to varieties of quinoa, packaged under many different brand labels, touting it as a nutritional goldmine. A simple Google search of the word returns pages of results with buzzwords like “healthiest,” “organic,” and “wholesome.” Vegan and health-enthusiast subcultures swear by this expensive food product, and the Food and Agriculture Organization (FAO) even declared 2013 the International Year of Quinoa, owing to the grain’s popularity.

The journey of the grain — as it makes it to the gourmet kitchen at upscale restaurants in countries like the United States — however, is often overlooked in mainstream discourse. It often begins in the Andes region of Bolivia, where the farmers who grow this crop have depended on it as almost a sole nutritional source for decades, if not centuries. The boom in western markets, with exceedingly high demand for this crop, has caused it to transition from a traditional food crop to a major cash crop.

While global organizations like the FAO have portrayed this as positive, they tend to discount the challenge of participating in a demanding global market. Within-country inequality, skewed export/import dynamics, and capitalist trade practices that favor the powerful player in these dynamics – the core consumer – create new and difficult problems for Bolivian farmers, like not being able to afford the food they have traditionally depended upon.


Meanwhile, growing such large amounts of quinoa has been degrading the Andean soil: even the FAO outlines concerns for biodiversity, while otherwise touting the phenomenon.


While farmer unions, cooperatives, and development initiatives have made efforts to mitigate some of the negative effects on the primary producers of quinoa, these have not been enough to protect the food security of Andean farmers. Increased consumer consciousness is therefore essential to ensuring that these farmers don’t continue to suffer because of Western dietary fads.

Cross-posted at Sociology Lens.

Aarushi Bhandari is a doctoral student at Stony Brook University interested in globalization and the impact of neoliberal policies on the developing world. She wants to study global food security within a global neoliberal framework and the world systems perspective.