1,007,000 Americans working full-time earn the federal minimum wage of $7.25 per hour. All of that pay, to all of those people, for all of 2014 adds up to $14 billion. And that is less than half of what employees on Wall Street earned in bonuses alone.
This is your image of the week:
Source: Institute for Policy Studies.

Lisa Wade is a professor of sociology at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. You can follow her on Twitter and Facebook.
Yesterday I went to Marshall’s to take some photos for this post and overheard a conversation between a teenager and her mother that perfectly illustrated what I was planning on posting about. The teen pulled her mom over to look at a purse she wanted for Christmas. It was $148, but she was making a case to her mom that it was actually a great buy compared to how much it would have been at the original price, which, as she pointed out to her mom, was listed as $368.
Ellen Ruppel Shell discusses this topic at length in Cheap: The High Cost of Discount Culture. Here’s a relevant photo I took:
It indicates that you are getting a great deal by shopping at Marshall’s compared to the original price of the item.
Except that is not, in fact, what they are saying. Look at the image again: the wording is “compare at…” The tags do not say “marked down from,” “original price,” or “was.” There is a crucial difference: those phrases would claim the shoes actually sold for $175 at some point, making them a super steal at $49. “Compare at” merely invites you to draw that conclusion without making the claim. The “manufacturer’s suggested retail price” (MSRP) works the same way.
But as Shell points out, these numbers are largely fictional. Marshall’s is not actually telling you that those shoes were ever sold for $175. You’re just supposed to “compare” $49 to $175. But $175 may be an entirely meaningless number. The shoes may never have been sold for $175 at any store; certainly no specifics are given. Even if they were, the fact that a large number of them ended up at Marshall’s would indicate that many customers didn’t consider $175 an acceptable price.
The same goes for the MSRP: it’s meaningless. Among other things, that’s not how pricing works these days for big retail outlets. The manufacturer doesn’t make a product and then tell the retailer how much to charge for it. Retailers hold much more power than manufacturers; generally, they pressure suppliers to meet their price and to constantly lower costs, putting the burden on the suppliers to figure out how to do so (often by reducing wages). The idea that manufacturers are able to tell Macy’s or Target or other big retailers how much to charge for their items is ridiculous. Rather, the retailer usually tells the manufacturer what MSRP to print on the tags of items they’ll be purchasing (I saw some tags at Marshall’s with an “MSRP” label but no price printed next to it).
So what’s the point of an MSRP on a price tag, or a “compare at” number? These numbers serve as “anchor” prices — that is, they set a high “starting” point for the product, so the “sale” price seems like a great deal in comparison. Except the “sale” price isn’t actually a discount at all — it’s only a sale price in comparison to this fictional original price that was developed for the sole purpose of making you think “Holy crap! I can get $175 shoes for just $49!”
The point is to redirect your thinking from “Do I think these shoes are worth $49?” to “I can save $126!” This is a powerful psychological motivator; marketing research shows that people are fairly easily swayed by perceived savings. A sweater we might not think is worth $40 at Banana Republic suddenly becomes worth $50 at Marshall’s (or T.J. Maxx, an outlet mall, Ross, etc.) once we’re told it used to sell for $80. We focus not on the fact that we’re spending $50, but on the fact that we’re saving $30.
And that makes us feel smart: we’ve beat the system! Instead of going to the mall and paying $368 for that purse, we hunted through the discount retailer and found it for $148! We worked for it, and we were smart enough not to get conned into buying it at the inflated price. Shell describes research showing that, in these situations, we feel like we didn’t just save that money, we actually earned it by going to the effort of searching out deals. When we buy that $148 purse, we’re likely to leave feeling like we’re somehow $220 richer (since we didn’t pay $368) rather than $148 poorer. And we’ll value it more highly because we feel like we were smart to find it; that is, we’re likely to think a $148 purse bought on “sale” is cooler and better quality than we would an identical purse bought at full price for $120.
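The anchor-price trick described above reduces to simple arithmetic: the shopper’s attention shifts from the price actually paid to the gap between the (possibly fictional) anchor and the sale price. A minimal sketch of that framing, using the shoe and purse numbers from the post (the function name is my own, for illustration):

```python
def perceived_savings(anchor_price: float, sale_price: float) -> float:
    """The 'savings' a shopper perceives: the gap between the
    (possibly fictional) anchor price and the sale price."""
    return anchor_price - sale_price

# The shoes: "compare at" $175, priced at $49.
shoes_saved = perceived_savings(175, 49)    # 126

# The purse: "compare at" $368, priced at $148.
purse_saved = perceived_savings(368, 148)   # 220

# The reframing: we think about the right-hand numbers, not the left.
print(f"Shoes: spend $49, but feel like we 'saved' ${shoes_saved}")
print(f"Purse: spend $148, but feel ${purse_saved} 'richer'")
```

Note that the perceived savings is entirely controlled by the anchor, which the retailer sets; nothing in the calculation depends on what the item ever actually sold for.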
And stores capitalize on these psychological tendencies by giving us cues that seem to indicate we’re getting an amazing deal. Sometimes we are. But often we’re being distracted with numbers that seem to give us meaningful information but are largely irrelevant, if not entirely fictional.
Originally posted in 2009.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.
Every year the National Priorities Project helps Americans understand how the money they paid in federal taxes was spent. Here’s the data for 2014:
Since the 1940s, individual Americans have paid 40-50% of the federal government’s bills through taxes on income and investment. Another chunk (about a third today) is paid in the form of payroll taxes for things like Social Security and Medicare. This year, corporate taxes made up only about 11% of the federal government’s revenue, way down from a historic high of almost 40% in 1943.
Visit the National Priorities Project here and find out where state tax dollars went, how each state benefits from federal tax dollars, and who gets the biggest tax breaks. Or fiddle around with how you would organize American priorities.
The Winners and the Losers
Within the last decade, the grain quinoa has emerged as an alleged “super food” in Western dietary practices. Health food stores and upscale grocery chains have aisles dedicated to varieties of quinoa, packaged under many different brand labels, touting it as a nutritional goldmine. A simple Google search of the word returns pages of results with buzzwords like “healthiest,” “organic,” and “wholesome.” Vegan and health-enthusiast subcultures swear by this expensive food product, and the Food and Agriculture Organization (FAO) even declared 2013 the International Year of Quinoa, owing to the grain’s popularity.
The journey of the grain — as it makes its way to the gourmet kitchens of upscale restaurants in countries like the United States — is often overlooked in mainstream discourse. It usually begins in the Andean highlands of Bolivia, where the farmers who grow this crop have depended on it as almost their sole nutritional source for decades, if not centuries. The boom in Western markets, with its exceedingly high demand for the crop, has caused quinoa to transition from a traditional food crop to a major cash crop.
While global organizations like the FAO have portrayed this as positive, they tend to discount the challenge of participating in a demanding global market. Within-country inequality, skewed export/import dynamics, and capitalist trade practices that favor the powerful player in these dynamics – the core consumer – create new and difficult problems for Bolivian farmers, such as not being able to afford the food they have traditionally depended upon.
Meanwhile, growing such large amounts of quinoa has been degrading the Andean soil: even the FAO, while otherwise touting the phenomenon, outlines concerns about biodiversity.
While efforts have been put in place by farmer unions, cooperatives and development initiatives to mitigate some negative effects on the primary producers of quinoa, they have not been enough to protect the food security of these Andean farmers. Increased consumer consciousness is therefore essential in ensuring that these farmers don’t continue to suffer because of Western dietary fads.
Cross-posted at Sociology Lens.
Aarushi Bhandari is a doctoral student at Stony Brook University interested in globalization and the impact of neoliberal policies on the developing world. She wants to study global food security within a global neoliberal framework and the world systems perspective.
In the working- and middle-class neighborhoods of many Southern cities, you will find rows of “shotgun” houses. These houses are long and narrow, consisting of three or more rooms in a row. Originally, there would have been no indoor plumbing — they date back to the early 1800s in the U.S. — and, so, no bathroom or kitchen.
Here’s a photograph of a shotgun house I took in the 7th ward of New Orleans. It gives you an idea of just how skinny they are.
In a traditional shotgun house, there are no hallways, just doors that take a person from one room to the next. Here’s my rendition of a shotgun floor plan; doors are usually all in a row:
At nola.com, Richard Campanella describes the possible origins and sociological significance of this housing form. He follows folklorist John Michael Vlach, who has argued that shotgun houses are indigenous to Western and Central Africa, arriving in the American South via Haiti. Campanella writes:
Vlach hypothesizes that the 1809 exodus of Haitians to New Orleans after the St. Domingue slave insurrection of 1791 to 1803 brought this vernacular house type to the banks of the Mississippi.
In New Orleans, shotgun houses are found in the parts of town originally settled by free people of color, people who would have identified as Creole, and a variety of immigrants. Outside of New Orleans, we tend to see shotgun houses in places with large black populations.
The house, though, doesn’t just represent a building technique, it tells a story about how families were expected to interact. Shotgun houses offer essentially zero privacy. Everyone has to tromp through everyone’s room to get around the house. There’s no expectation that a child won’t just walk into their parents’ room at literally any time, or vice versa. There’s no way around it.
“According to some theories,” then, Campanella says:
…cultures that produced shotgun houses… tended to be more gregarious, or at least unwilling to sacrifice valuable living space for the purpose of occasional passage.
Cultures that valued privacy, on the other hand, were willing to make this trade-off.
Sure enough, in the part of New Orleans settled by people of Anglo-Saxon descent, shotgun houses are much less common and, instead, homes are more “privacy-conscious.”
Over time, as even New Orleans became more and more culturally Anglo-Saxon — and as the housing form increasingly became associated with poverty — shotguns fell out of favor. They’re enjoying a renaissance today but, as Campanella notes, many renovations of these historic buildings include a fancy, new hallway.
Cross-posted at A Nerd’s Guide to New Orleans.