
Flashback Friday.

In the contemporary U.S., individuals choose who to marry based on personal preference, but there is a specific script by which those choices become a wedding day. Not everyone follows the script, but everyone knows it: the man decides to ask the woman to marry him, he buys a ring, he arranges a “special” event, he proposes, and she agrees. Many of us grow up dreaming of a day like this.

But this isn’t the only possible way to decide to marry. A reverse script might involve female choice. We can imagine a world in which, instead of hoping to be chosen, women decide to propose and men can only marry if they get asked. Another alternative script might involve no proposal at all, one in which two people discuss marriage and come to a decision together without the pop question and uncertain answer.

Of course, many couples essentially decide to marry through months or years of discussion, but these couples frequently act out the script anyway because, well, it’s so romantic and wonderful.

Or is it?

Andre M. sent in a clip of John Preator, a finalist on a previous season of American Idol. In the clip, he proposes to his girlfriend Erica on Main Street at a Disneyland Resort. The clip exaggerates the patriarchal underpinnings of both marriage and the marriage proposal. It may or may not be real, but it doesn’t really matter for our purposes.

Here it is:

First, Andre says, the spectacle is a shining testament to our commitment to the idea of marriage as an ideal state. Everyone loves marriage! As Andre writes:

A whole rainbow of characters come out of the shadows to push her towards yes, from the smiling Asian janitor, to the African American guy knighted by our hero and his plastic phallus, to the disabled woman who wishes to trade her fate with the bride-to-be.

We are supposed to think: “How wonderful! How sweet! How perfect!” What is made invisible is the fact that, in addition to being a potential site of wedded bliss, marriage is a site of the reproduction of patriarchal privilege (especially through women’s disproportionate responsibility for housework and childcare) and of heterosexist privilege (still excluding same-sex couples). But the audience knows that they are supposed to feel elated for the couple and privileged to witness their special moment (whether they feel these things or not).

Second, the public nature of the proposal put a lot of pressure on her to say “yes.” The audience is asked to participate in urging her to agree to marry him (“come on folks, how about a little encouragement?!”). And the performers, as well as the performance itself, create conditions that look a lot like coercion. Could she have said “no” if she wanted to? As if breaking his heart wouldn’t have been deterrent enough, saying “no” would have disappointed the onlookers and ruined the performance. He put so much work into scripting the proposal and it was very clear what her line was. How many women, with less pressure, have nonetheless felt it difficult or impossible to say “no”?

Okay, so let’s assume that Erica did want to marry John and that they will live happily ever after. And let’s also assume that most marriage proposals in the U.S. do not come with this degree of pressure. The clip is still a nice reminder of (1) just how taken-for-granted marriage is as an ideal state (can you imagine her saying, “I love you more than life itself and I want to be with you forever, but marriage, no thanks!”) and (2) the way that the proposal script puts men in the position of getting to choose and women in the position of having to agree or go off script.

Originally posted in 2010.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

I wake up at 4:55 AM each and every morning. Why? Well, in part, because I can, because I have the freedom to choose at what time I’m going to start my day. This is not true of every day, mind you, as many things can change an individual’s schedule or routine. That said, I get up that early, again in part, because when my door most often unlocks, at about 5:15 AM, I don’t want to be in the cell where I’ve been for the last however many hours any longer.

I most often choose to eat plain oatmeal with peanut butter (unless it’s Sunday, when the chow hall typically serves eggs, potatoes and toast), in part because I don’t want to experience any more of the chow hall than I reasonably have to, and because I can afford to eat oatmeal (at $1.00 per pound) and peanut butter (at $2.15 per 16 oz. container) for breakfast.

Work starts at 6:00 AM and I count myself as extremely fortunate to have what we call an industries job. This is an 8-hour a day, 5-days a week job in the penitentiary’s industrial laundry. We process linen from the surrounding hospitals, colleges, institutions, etc. Between 1 million and 1.5 million pounds of linen gets processed through our facility per month. I work in the maintenance department, which is responsible for keeping the equipment running smoothly, maintaining operation of the machinery, scheduling down time for repairs, etc. This job also pays exceedingly well (comparatively speaking): instead of the average monthly income of around $45.00, I earn roughly $150.00 monthly. This has allowed me to maintain regular contact with family through phone calls at $0.16 per minute ($4.80 for a 30-minute phone call), to purchase some items that make life more livable by supplementing the food provided from the chow hall with items from the canteen / commissary, and to pay off my restitution and court fees of roughly $15,000.00 over the last 17 years, so that should I one day regain an opportunity to live in the community, I’ll be able to start that life without monetary debt.

Typically, around noon I’ll have lunch, which most often gets eaten in that place I’d rather not frequent, the chow hall. Our menu rotates every 3 months (by season) with few exceptions, and while that isn’t horrible for a couple of years, when you start passing decades by, it gets redundant and the desire to consume food outside of what gets offered day in and day out grows. I’ve come to think of what I eat as simply fuel.

Between 1 o’clock and 2 o’clock I’m off work and might try to get outside for some sunshine if I’m lucky enough, maybe some exercise, jog around the track or just walk some laps with someone who I need to catch up with for however long. Otherwise it’s reading, studying for work, educational purposes, etc.

Dinner is around 5 PM, in that same chow hall that I’d most often rather not go to. I don’t want to suggest that the food is so bad that we can’t eat it, because that’s not the case; many here are well overweight. It comes down to the choices those individuals make in how and what they consume, what level of activity they participate in (whether due to their abilities or basic drive), and what medical conditions may exist in their lives.

During the evening hours I try to write letters, read, call family and friends, maybe attend a function or fundraiser if I’m fortunate enough to be involved in something of that nature, educational opportunities, youth outreach programs, etc. For many, however, it’s nothing more than watching TV or staring at a blank wall. Again, I’m fortunate, both in my personal agency and my outlook on life.

When I’m asked about “what prison is like” I offer that it is an extremely lonely place, where every moment of every day is dictated for you, and where there are tremendous opportunities for self-reflection. In the movies, on TV, and through media coverage, you see individuals who get swept up into the justice system and there’s this emphasis on the crime, the trial, entry into prison…then there are a few scenes of prison life as it’s portrayed: walking the yard with the tough guys, pumping iron, watching your back in the shower room, etc., and lastly this great experience of being released from prison, back to spending time with family and friends, BBQs in the summer-time, and so on and so forth. All very “event oriented,” without the day-to-day experiences put on display. In part that’s because you can’t show the day-to-day loneliness, the feelings of exclusion, the feelings of shame and cowardice that accompany an individual’s incarceration. The realization that we’ve not only victimized our actual victims through whatever offense(s) we’ve committed, but we’ve additionally victimized our own families, the community, society as a whole, our friends and loved ones, everyone in fact that we come in contact with. The courts, lawyers, judges, prosecutors, juries, corrections officers, police, detectives… and the list goes on and on!

So what do I hope to get across here? For starters, we as prisoners are human beings, individuals who have failed society for whatever reasons, and though no excuse relieves us from our poor life decisions, without hope and help to be better people, without redemption, society is all but lost in its entirety through our bad behaviors. In a discussion group with college students not long ago, after describing some of the opportunities available here in the penitentiary in which I reside, one student asked me if we as prisoners deserved such opportunities. I paused before answering that society deserves us to have such opportunities, because if we do not come out of prison with more skills and a more productive mindset than we came in with, we are destined to once again fail society.

This is a day in the life of a prisoner… one who considers himself extremely fortunate in countless ways and for just as many reasons.

Cross-posted at Public Criminology and Rise Up

Trevor is the current President of the Lifers Unlimited Club and a leader of RISE UP! (Reaching Inside to See Everyone’s Unlimited Potential), a youth empowerment program at the Oregon State Penitentiary. To see more writing/advice from the men in RISE UP!, please check out the program’s blog at www.riseuposp.com and feel free to comment there.  They would love to hear from you.

I was on jury duty this week, and the greatest challenge for me was the “David Brooks temptation” to use the experience to expound on the differences in generations and the great changes in culture and character that technology and history have brought.

I did my first tour of duty in the 1970s. Back then you were called for two weeks. Even if you served on a jury, after that trial ended, you went back to the main jury room. If you were lucky, you might be released after a week and a half. Now it’s two days.

What struck me most this time was the atmosphere in the main room. Now, nobody talks. You’re in a large room with maybe two hundred people, and it’s quieter than a library. Some are reading newspapers or books, but most are on their laptops, tablets, and phones. In the 1970s, it wasn’t just that there was no wi-fi; there was no air conditioning. Remember “12 Angry Men”? We’re in the same building. Then, you tried to find others to talk to. Now you try to find a seat near an electric outlet to connect your charger.


I started to feel nostalgic for the old system. People nowadays – all in their own narrow, solipsistic worlds, nearly incapable of ordinary face-to-face sociability. And so on.

But the explanation was much simpler. It was the two-day hitch. In the old system, social ties didn’t grow from strangers seeking out others in the main jury room. It happened when you went to a courtroom for voir dire. You were called down in groups of forty. The judge sketched out the case, and the lawyers interviewed the prospective jurors. From their questions, you learned more about the case, and you learned about your fellow jurors – neighborhood, occupation, family, education, hobbies. You heard what crimes they’d been a victim of. When the judge called a break for bathroom or lunch or some legal matter, you could find the people you had something in common with. And you could talk with anyone about the case, trying to guess what the trial would bring. If you weren’t selected for the jury, you went back to the main jury room, and you continued the conversations there. You formed a social circle that others could join.

This time, on my first day, there were only two calls for voir dire, the clerk as bingo-master spinning the drum with the name cards and calling out the names one by one. My second day, there were no calls. And that was it. I went home having had no conversations at all with any of my fellow jurors. (A woman seated behind me did say, “Can you watch my laptop for a second?” when she went to the bathroom, but I don’t count that as a conversation.)

I would love to have written 800 words here on how New York character had changed since the 1970s.  No more schmoozing. Instead we have iPads and iPhones and MacBooks destroying New York jury room culture – Apple taking over the Apple. People unable or afraid to talk to one another because of some subtle shift in our morals and manners. Maybe I’d even go for the full Brooks and add a few paragraphs telling you what’s really important in life.

But it was really a change in the structure. New York expanded the jury pool by eliminating most exemptions. Doctors, lawyers, politicians, judges – they all have to show up. As a result, jury service is two days instead of two weeks, and if you actually are called to a trial, once you are rejected for the jury or after the trial is over, you go home.

The old system was sort of like the pre-all-volunteer army. You get called up, and you’re thrown together with many kinds of people you’d never otherwise meet. It takes a chunk of time out of your life, but you wind up with some good stories to tell. Maybe we’ve lost something. But if we have lost valuable experiences, it’s because of a change in the rules, in the structure of how the institution is run, not because of a change in our culture and character.

Cross-posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

The governors of Virginia and South Carolina have now taken stands against the Confederate battle flag. So have honchos at Wal*Mart, Sears, Target, and NASCAR.

NASCAR! How could this cascade of reversals have happened so rapidly? Did these important people wake up one morning this week and say to themselves, “Gee, I never realized that there was anything racist about the Confederacy, and never realized that there was anything wrong with racism, till that kid killed nine Black people in a church”?

My guess is that what’s going on is not a sudden enlightenment or even much of a change in views about the flag. To me it looks more like the process of “pluralistic ignorance.” What these people changed was not their ideas about the Confederacy or racism but their ideas about other people’s ideas about these matters. With pluralistic ignorance (a term coined by Floyd Allport nearly a century ago) everyone wants X but thinks that nobody else does. Then some outside factor makes it possible for people to choose X, and everyone does. Everyone is surprised – “Gee, I thought all you guys wanted Y, not X.” It looks like a rapid change in opinion, but it’s not.

A few years ago in Ireland and elsewhere in Europe, people were surprised at the success of new laws banning smoking in pubs and restaurants. “Oh, the smokers will never stand for it.” But it turned out that the smokers, too, were quite happy to have rooms with breathable air. It’s just that before the laws were passed, nobody knew that’s how other people felt because those people kept smoking.

The same thing happened when New York City passed a pooper-scooper law. “The law is unenforceable,” people said. “Cops will never see the actual violation, only its aftermath. And do you really think that those selfish New Yorkers will sacrifice their own convenience for some vague public good?” But the law was remarkably effective. As I said in this post from 2009:

Even before the new law, dog owners had probably thought that cleaning up after their dogs was the right thing to do, but since everyone else was leaving the stuff on the sidewalk, nobody wanted to be the only schmuck in New York to be picking up dog shit. In the same way that the no-smoking laws worked because smokers wanted to quit, the dog law in New York worked because dog owners really did agree that they should be cleaning up after their dogs. But prior to the law, none of them would speak or act on that idea.

In South Carolina and Georgia and Bentonville, Arkansas and elsewhere, the governors and the CEOs surely knew that the Confederacy was based on racist slavery; they just rarely thought about it. And if the matter did come up, as with the recent Supreme Court decision about license plates, they probably assumed that most of their constituents and customers were happy with the flag and that the anti-flaggers were a cranky minority.

With the support for letting that flag fade into history, it looks as though for a while now many Southerners may have been uncomfortable with the blatant racism of the Confederacy and the post-Reconstruction era. But because nobody voiced that discomfort, everyone thought that other Southerners still clung to the old mentality. The murders in the Charleston church and the subsequent discussions about retiring the flag may have allowed Southerners to discover that their neighbors shared their misgivings about the old racism. And it allowed the retail giants to see that they weren’t going to lose a lot of money by not stocking the flag.

Cross-posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

Flashback Friday.

I’ve posted about the use of apparent discounts as a marketing tool and about the rise of the shopping cart. Since I’m on a little marketing-related posting trend, I figured I might as well post about restaurant menus. New York Magazine recently provided an analysis of menus and how things such as placement, images, and so on influence purchases.

Here’s the menu analyzed in the article:


Some of the most interesting elements numbered on the menu:

1. Pictures of food on menus are tricky. They can convince people to buy a dish, but more expensive restaurants don’t want to be associated with low-cost places like Denny’s or Applebee’s. In general, the more expensive the restaurant, the less likely there are to be images of food, and if there are, they’re drawings, not color photos. And, apparently, the upper right corner is where customers’ eyes go first, so you need to make good use of that section.

2 and 3. You list something expensive (like a $115 seafood dish) in a prominent spot to serve the same function as a “manufacturer’s suggested retail price” on a sales tag at a retail store: to set an anchor price that makes other prices look like a bargain in comparison. The $70 seafood dish listed next to the $115 one seems way more reasonable than it would if it were listed without the comparison anchor price.

5. Listing dishes in a column encourages customers to skim down the list, making it more likely that they’ll be focusing on the column of prices rather than the dishes themselves, and will pick from among the cheapest things on the menu. If the dish names are connected by a line of dots or dashes to specific prices, this is even more pronounced.

8. Restaurants often use “bracketing”:

…the same dish comes in different sizes. Here, that’s done with steak tartare and ravioli — but because “you never know the portion size, you’re encouraged to trade up,” Poundstone says. “Usually the smaller size is perfectly adequate.”

Notice the same things I mentioned in my post about meaningless discounts: high prices used to set an anchor that makes everything else look cheap and an emphasis on apparent savings to distract the customer from how much they’re spending.

And the bracketing thing is marketing genius: the larger portion is usually just a little bit more expensive, so the customer is likely to focus on the fact that the additional amount is actually a bargain, but you usually have very little information about how much bigger it actually is.

Knowledge is power! And now you know.

Originally posted in 2009.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

African Americans are less healthy than their white counterparts. There are lots of causes for this: food deserts, lack of access to healthcare, an absence of recreational opportunities in low income neighborhoods, and more. Arguably, these are indirect effects of racist individuals and institutions, leading to the disinvestment in predominantly black neighborhoods and the economic disempowerment of black people.

This post, though, is about a direct relationship between racism and health mediated by stress. Experiencing discrimination has been shown to have both acute and long-term effects on the body. Being discriminated against changes the biometrics that indicate stress and personal reports of stress (anxiety, depression, and anger). Bad health outcomes are the result.

A new study, published in PLOS One, adds another layer to the accumulating evidence. To get a strong measure of “area racism” — the prevalence of racist beliefs in a specific geographic area — epidemiologist David Chae and his colleagues counted how often internet users searched for the “n-word” on Google (ending in -er or -ers, but not -a or -as). This, they argued, is a good measure of the likelihood that an African American will experience discrimination. Here are their findings for area racism:


They then measured the rate at which black people over 25 in those areas die and the death rate from the four most common causes of death for that population: heart disease, cancer, stroke, and diabetes. They also included a series of control variables to attempt to isolate the predictive power of area racism.

The resulting data offer support for the idea that area racism increases mortality among African Americans. Chae and his colleagues summarize, saying that areas in which Google searches for the n-word are one standard deviation above the mean have an 8.2% increase in mortality among Blacks. The searches were related, also, to an increase in the rates of cancer, heart disease, and stroke. “This,” they explain, “amounts to over 30,000 [early] deaths among Blacks annually nationwide.”

When they controlled for area level demographics and socioeconomic variables, the magnitude of the effect dropped from 8.2% to 5.7%. But these factors, they argued, “are also influenced by racial prejudice and discrimination and therefore could be on the causal pathway.” In other words, it’s not NOT racism that’s making up that 2.5% difference.

Directly and indirectly, racism kills.

H/t to Philip Cohen for the link. Cross-posted at Pacific Standard.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Chris Christie’s net worth (at least $4 million) is 50 times that of the average American. His household income of $700,000 (his wife works in the financial sector) is 13 times the national median.  But he doesn’t think he’s rich.

I don’t consider myself a wealthy man. . . . and I don’t think most people think of me that way.

That’s what he told the Manchester Union-Leader on Monday when he was in New Hampshire running for president.

Of course, being out of touch with reality doesn’t automatically disqualify a politician from the Republican nomination, even at the presidential level, though misreading the perceptions of “most people” may be a liability.

But I think I know what Christie meant. He uses the term “wealth,” but what he probably has in mind is class.  He says, “Listen, wealth is defined in a whole bunch of different ways . . . ”  No, Chris. Wealth is measured one way – dollars. It’s social class that is defined in a whole bunch of different ways.

One of those ways is self-perception.

“If you were asked to use one of four names for your social class, which would you say you belong in: the lower class, the working class, the middle class, or the upper class?”

That question has been part of the General Social Survey since the start in 1972. It’s called “subjective social class.” It stands apart from any objective measures like income or education. If an impoverished person who never got beyond fifth grade says that he’s upper class, that’s what he is, at least on this variable. But he probably wouldn’t say that he’s upper class.

Neither would Chris Christie. But why not?

My guess is that he thinks of himself as “upper middle class,” and since that’s not one of the GSS choices, Christie would say “middle class.”  (Or he’d tell the GSS interviewer where he could stick his lousy survey. The governor prides himself on his blunt and insulting responses to ordinary people who disagree with him.)


This self-perception as middle class rather than upper can result from “relative deprivation,” a term suggesting that how you think about yourself depends on who you are comparing yourself with.* So while most people would not see the governor as “deprived,” Christie himself travels in grander circles. As he says, “My wife and I . . . are not wealthy by current standards.” The question is “Which standards?” If the standards are those of the people whose private jets he flies on, the people he talks with in his pursuit of big campaign donations – the Koch brothers, Ken Langone (founder of Home Depot), Sheldon Adelson, Jerry Jones, hedge fund billionaires, et al. – if those are the people he had in mind when he said, “We don’t have nearly that much money,” he’s right. He’s closer in wealth to you and me and middle America than he is to them.

I also suspect that Christie is thinking of social class less as a matter of money than as a matter of values and lifestyle – one of that bunch of ways to define class. To be middle class is to be one of those solid Americans – the people who, in Bill Clinton’s phrase, go to work and pay the bills and raise the kids. Christie can see himself as one of those people. Here’s a fuller version of the quote I excerpted above.

Listen, wealth is defined in a whole bunch of different ways and in the end Mary Pat and I have worked really hard, we have done well over the course of our lives, but, you know, we have four children to raise and a lot of things to do.

He and his wife go to work; if they didn’t, their income would drop considerably. They raise the kids, probably in conventional ways rather than sloughing that job off on nannies and boarding schools as upper-class parents might do. And they pay the bills. Maybe they even feel a slight pinch from those bills. The $100,000 they’re shelling out for two kids in private universities may be a quarter of their disposable income, maybe more. They are living their lives by the standards of “middle-class morality.” Their tastes too are probably in line with those of mainstream America. As with income, the difference between the Christies and the average American is one of degree rather than kind. They prefer the same things; they just have a pricier version. Seats at a football game, albeit in the skyboxes, but still drinking a Coors Light. It’s hard to picture the governor demanding a glass of Haut-Brion after a day of skiing on the slopes at Gstaad, chatting with (God forbid) Europeans.

Most sociological definitions of social class do not include values and lifestyle, relying on more easily measured variables like income, education, and occupation. But for many people, including the governor, morality and consumer preference may weigh heavily in perceptions and self-perceptions of social class.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

Flashback Friday.

Yesterday I went to Marshall’s and overheard a conversation between a teenager and her mother that perfectly illustrated what I was planning on posting about. The teen pulled her mom over to look at a purse she wanted for Christmas. It was $148, but she was making a case to her mom that it was actually a great buy compared to how much it would have been at the original price, which, as she pointed out to her mom, was listed as $368.

Ellen Ruppel Shell discusses this topic at length in Cheap: The High Cost of Discount Culture.

Price tags at stores like Marshall’s often include a second, higher number next to the actual price: $175, for instance, on a pair of shoes selling for $49. It indicates that you are getting a great deal by shopping at Marshall’s compared to the original price of the item.

Except that is not, in fact, what they are saying. The wording is “compare at…” The tags do not say “marked down from” or “original price” or “was.” There is a crucial difference: when you are told to “compare at,” the implication is that the shoes were originally $175, making them a super steal at $49. The “manufacturer’s suggested retail price” (MSRP) gives you the same info.

But as Shell points out, these numbers are largely fictional. Marshall’s is not actually telling you that those shoes were ever sold for $175. You’re just supposed to “compare” $49 to $175. But $175 may be an entirely meaningless number. The shoes may never have been sold for $175 at any store; certainly no specifics are given. Even if they were, the fact that a large number of them ended up at Marshall’s would indicate that many customers didn’t consider $175 an acceptable price.

The same goes for the MSRP: it’s meaningless. Among other things, that’s not how pricing works these days for big retail outlets. The manufacturer doesn’t make a product and then tell the retailer how much they ought to charge for it. Retailers hold much more power than manufacturers; generally, they pressure suppliers to meet their price and to constantly lower costs, putting the burden on the suppliers to figure out how to do so (often by reducing wages). The idea that manufacturers are able to tell Macy’s or Target or other big retailers how much to charge for their items is ridiculous. Rather, the retailer usually tells the manufacturer what MSRP to print on the tag of items they’ll be purchasing (I saw some tags at Marshall’s that said MSRP but had no price printed on them).

So what’s the point of a MSRP on a price tag, or a “compare at” number? These numbers serve as “anchor” prices — that is, they set a high “starting” point for the product, so the “sale” price seems like a great deal in comparison. Except the “sale” price isn’t actually a discount at all — it’s only a sale price in comparison to this fictional original price that was developed for the sole purpose of making you think “Holy crap! I can get $175 shoes for just $49!”

The point is to redirect your thinking from “Do I think these shoes are worth $49?” to “I can save $126!” This is a powerful psychological motivator; marketing research shows that people are fairly easily swayed by perceived savings. A sweater we might not think is worth $40 if we saw it at Banana Republic suddenly becomes worth $50 if we see it at Marshall’s (or T.J. Maxx, an outlet mall, Ross, etc.) and are told it used to sell for $80. We focus not on the fact that we’re spending $50, but on the fact that we’re saving $30.

And that makes us feel smart: we’ve beat the system! Instead of going to the mall and paying $368 for that purse, we hunted through the discount retailer and found it for $148! We worked for it, and we were smart enough to not get conned into buying it at the inflated price. Shell describes research that shows that, in these situations, we feel like we didn’t just save that money, we actually earned it by going to the effort to search out deals. When we buy that $148 purse, we’re likely to leave feeling like we’re somehow $220 richer (since we didn’t pay $368) rather than $148 poorer. And we’ll value it more highly because we feel like we were smart to find it; that is, we’re likely to think a $148 purse bought on “sale” is cooler and better quality than we would the identical purse if we bought it at full price for $120.

And stores capitalize on these psychological tendencies by giving us cues that seem to indicate we’re getting an amazing deal. Sometimes we are. But often we’re being distracted with numbers that seem to give us meaningful information but are largely irrelevant, if not entirely fictional.

Originally posted in 2009.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.