Flashback Friday.

The AP has an interesting website about wildfires from 2002 to 2006. Each year, most wildfires occurred west of the Continental Divide. Many of these areas are forested. Others are desert or shortgrass prairie.

There are a lot of reasons for wildfires: climate and ecology, periodic droughts, and humans. The U.S. Fish and Wildlife Service reports that in the Havasu National Wildlife Refuge, the “vast majority” of wildfires are due to human activity. Many scientists expect climate change to increase wildfires.

Many wildfires affect land managed by the Bureau of Land Management. For most of the 1900s, the BLM had a policy of total fire suppression to protect valuable timber and private property.

Occasional burns were part of forest ecology. Fires came through, burning forest litter relatively quickly, then moving on or dying out. Healthy taller trees were generally unaffected; their branches were often out of the reach of flames and bark provided protection. Usually the fire moved on before trees had ignited. And some types of seeds required exposure to a fire to sprout.

Complete fire suppression allowed leaves, pine needles, brush, fallen branches, etc., to build up. Wildfires then became more intense and destructive: they were hotter, flames reached higher, and thicker layers of forest litter meant the fire lingered longer.

As a result, an uncontrolled wildfire was often more destructive. Trees were more likely to burn or to smolder and reignite a fire several days later. Hotter fires with higher flames are more dangerous to fight, and can also more easily jump naturally occurring or artificial firebreaks. They may burn a larger area than they would otherwise, and thus do more of the damage that total fire suppression policies were supposed to prevent.

In the last few decades the BLM has recognized the importance of occasional fires in forest ecology. Fires are no longer seen as inherently bad. In some areas “controlled burns” are set to burn up some of the dry underbrush and mimic the effects of naturally occurring fires.

But it’s not easy to undo decades of fire suppression. A controlled burn sometimes turns out to be hard to control, especially with such a buildup of forest litter. Property owners often oppose controlled burns because they fear the possibility of one getting out of hand. So the policy of fire suppression has in many ways backed forest managers into a corner: it led to changes in forests that make it difficult to change course now, even though doing so might reduce the destructive effects of wildfires when they do occur.

Given this, I’m always interested when wildfires are described as “natural disasters.” What makes something a natural disaster? The term implies a destructive situation that is not human-caused but rather emerges from “the environment.” As the case of wildfires shows, the situation is often more complex than this, because what appear to be “natural” processes are often affected by humans… and because we are, of course, part of the environment, despite the tendency to think of human societies and “nature” as separate entities.

Originally posted in 2010.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

Flashback Friday.

My great-grandma would put a few drops of turpentine on a sugar cube as a cure-all for any type of cough or respiratory ailment. Nobody in the family ever had any obvious negative effects from it as far as I know. And once when I had a sinus infection my grandma suggested that I try gargling kerosene. I decided to go to the doctor for antibiotics instead, but most of my relatives thought that was a perfectly legitimate suggestion.

In the not-so-distant past, lots of substances we consider unhealthy today were marketed and sold for their supposed health benefits. Joe A. of Human Rights Watch sent in these images of vintage products that openly advertised that they contained cocaine or heroin. Perhaps you would like some Bayer Heroin?

[Image: Bayer Heroin, via Flickr Creative Commons, dog 97209]

The Vapor-ol alcohol and opium concoction was for treating asthma:

Cocaine drops for the kids:

A reader named Louise sent in a recipe from her great-grandma’s cookbook. Her great-grandmother was a cook at a country house in England. The recipe is dated 1891 and calls for “tincture of opium”. The recipe (with original spellings):

Hethys recipe for cough mixture

1 pennyworth of each
Antimonial Wine
Acetic Acid
Tincture of opium
Oil of aniseed
Essence of peppermint
1/2lb best treacle

Well mix and make up to Pint with water.

As Joe says, it’s no secret that products with cocaine, marijuana, opium, and other now-banned substances were at one time sold openly, often as medicines. The change in attitudes toward these products, from entirely acceptable and even beneficial to inherently harmful and addictive, is a great example of social construction. While certainly opium and cocaine have negative effects on some people, so do other substances that remained legal (or were re-legalized, in the case of alcohol).

Racist and anti-immigrant sentiment often played a role in changing views of what are now illegal controlled substances. For instance, the association of opium with Chinese immigrants contributed to increasingly negative attitudes toward it, since anything associated with Chinese immigrants was stigmatized, particularly in the western U.S. This combined with a push by social reformers to prohibit a variety of substances, leading to the Harrison Narcotic Act. The act, passed in 1914, regulated the production and distribution of opium but, in its application, eventually criminalized it in practice.

Reformers pushing for cocaine to be banned suggested that its effects led Black men to rape White women, and that it gave them nearly super-human strength that allowed them to kill Whites more effectively. A similar argument was made about Mexicans and marijuana:

A Texas police captain summed up the problem: under marijuana, Mexicans became “very violent, especially when they become angry and will attack an officer even if a gun is drawn on him. They seem to have no fear, I have also noted that under the influence of this weed they have enormous strength and that it will take several men to handle one man while under ordinary circumstances one man could handle him with ease.”

So the story of the criminalization of some substances in the U.S. is inextricably tied to various waves of anti-immigrant and racist sentiment. Some of the same discourse resurfaced as crack cocaine emerged in the 1980s and was perceived as the drug of choice of African Americans: the “super criminal” who is impervious to pain and therefore especially violent and dangerous, and the addicted mother who harms and even abandons her child to prostitute herself as a way to get drugs.

Originally posted in 2010.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

Flashback Friday.

Back when I was in high school and college, I learned that one of the major things that separated humans from other species was culture. The ability to develop distinct ways of living that include an understanding of symbols, language, and customs unique to the group was a specifically human trait.

And, ok, so it turned out that other species had more complex communication systems than we thought they did, but still, other animals were assumed to behave according to instinct, not community-specific cultures.

But as with so many things humans have been convinced we alone possess, it’s turning out that other species have cultures, too. One of the clearest examples is the division of orcas into two groups with distinct customs and eating habits; one eats mammals while the other is pescetarian, eating only fish. Though the two groups regularly come in contact with each other in the wild, they do not choose to intermingle or mate with one another. Here’s a video:

[Video]

Aside from the obvious implications for our understanding of culture, this brings up an issue in terms of conservation. Take the case of orcas. Some are suggesting that they should be on the endangered species list because the population has declined. What do we do if it turns out at some point that, while the overall orca population is not fully endangered, one of the distinct orca cultural groups is? Is it enough that killer whales still exist, or do we need to think of the cultures separately and try to preserve sufficient numbers of each? In addition to being culturally different, they are functionally non-interchangeable: each group has a different effect on food chains and ecosystems.

Should conservation efforts address not just keeping the overall population alive and functioning, but also ensuring that the range of cultural diversity within a species is protected? If this situation occurred, should we declare one orca culture endangered but not the other? Are both ecological niches important?

I love these questions. If we recognize that creatures can have cultures, it challenges our sense of self, but also brings significantly more complexity to the idea of wildlife preservation.

Originally posted in 2010.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

Flashback Friday.

I recently came upon the Jewish greeting card section at Target, way down on the bottom row. I could tell it was the Jewish section because all of the dividers that tell you what kind of card is in that slot (birthday, anniversary, etc.) had a Star of David on them.

I was interested in what a specifically Jewish birthday card might look like, so I picked this one up. It draws on the idea that Jewish people are particularly prone to feeling guilty.

[Image: the front of the card]

The inside said:

…but is cake and ice cream mentioned anywhere? I think NOT! It’s your day! Enjoy! Enjoy!

Mary Waters found that people often believe that ethnicity explains all types of behaviors that are in fact very widespread. She interviewed White ethnics in the U.S.; they often attributed their families’ characteristics to their ethnicity. Take the idea of the loud, boisterous family, often including a mother who is constantly trying to get the kids to eat more of her home-cooked meals and worrying that they aren’t married yet. Many individuals described their family this way and claimed that their ethnicity was the reason.

People who identified their background as Italian, Greek, Jewish, Polish, and others all believed that the way their family interacted was a unique custom of their ethnic group. Yet they all described pretty much the same characteristics. The cardmakers’ (and others’) allusion to guilt to signify Jewishness seems to me to fall into this category: take out the Stars of David and I bet a range of religious/ethnic groups would think it was tailored to them specifically.

So you take a card, mention guilt in it, add a Star of David, and you’ve got a Jewish card. Take out the Star of David and maybe it’s a Catholic card, especially if you added a cross, since Catholics are often portrayed as feeling a lot of guilt. I’ve had friends who grew up Southern Baptist or Pentecostal joke about having felt guilty about everything, so you could market the card to them, too! I think it’s a good example of how we often treat characteristics or behaviors as somehow meaningfully connected to a specific ethnic background rather than as a pretty common way that people in general, across ethnic lines, behave.

Originally published in 2010.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

Flashback Friday.

I’ve posted about the use of apparent discounts as a marketing tool and about the rise of the shopping cart. Since I’m on a little marketing-related posting trend, I figured I might as well post about restaurant menus. New York Magazine recently provided an analysis of menus and how elements such as placement and images influence purchases.

Here’s the menu analyzed in the article:

[Image: the Balthazar menu]

Some of the most interesting elements numbered on the menu:

1. Pictures of food on menus are tricky. They can convince people to buy a dish, but more expensive restaurants don’t want to be associated with low-cost places like Denny’s or Applebee’s. In general, the more expensive the restaurant, the less likely the menu is to include images of food, and when it does, they’re drawings, not color photos. And, apparently, the upper right corner is where customers’ eyes go first, so you need to make good use of that section.

2 and 3. You list something expensive (like a $115 seafood dish) in a prominent spot to serve the same function as a “manufacturer’s suggested retail price” on a sales tag at a retail store: to set an anchor price that makes other prices look like a bargain in comparison. The $70 seafood dish listed next to the $115 one seems way more reasonable than it would if it were listed without the comparison anchor price.

5. Listing dishes in a column encourages customers to skim down the list, making it more likely that they’ll be focusing on the column of prices rather than the dishes themselves, and will pick from among the cheapest things on the menu. If the dish names are connected by a line of dots or dashes to specific prices, this is even more pronounced.

8. Restaurants often use “bracketing”:

…the same dish comes in different sizes. Here, that’s done with steak tartare and ravioli — but because “you never know the portion size, you’re encouraged to trade up,” Poundstone says. “Usually the smaller size is perfectly adequate.”

Notice the same things I mentioned in my post about meaningless discounts: high prices used to set an anchor that makes everything else look cheap and an emphasis on apparent savings to distract the customer from how much they’re spending.

And the bracketing thing is marketing genius: the larger portion is usually just a little bit more expensive, so the customer is likely to focus on the fact that the additional amount is actually a bargain, but you usually have very little information about how much bigger it actually is.

Knowledge is power! And now you know.

Originally posted in 2009.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

Flashback Friday.

Yesterday I went to Marshall’s and overheard a conversation between a teenager and her mother that perfectly illustrated what I was planning on posting about. The teen pulled her mom over to look at a purse she wanted for Christmas. It was $148, but she was making a case to her mom that it was actually a great buy compared to how much it would have been at the original price, which, as she pointed out to her mom, was listed as $368.

Ellen Ruppel Shell discusses this topic at length in Cheap: The High Cost of Discount Culture.

Take a typical tag at Marshall’s: a pair of shoes priced at $49, with $175 printed alongside. The tag seems to indicate that you are getting a great deal by shopping at Marshall’s compared to the original price of the item.

Except that is not, in fact, what the tag says. The wording is “compare at…” The tags do not say “marked down from” or “original price” or “was.” That is a crucial difference, even though the implication for the shopper is the same: that the shoes were originally $175, making them a super steal at $49. A “manufacturer’s suggested retail price” (MSRP) creates the same impression.

But as Shell points out, these numbers are largely fictional. Marshall’s is not actually telling you that those shoes were ever sold for $175. You’re just supposed to “compare” $49 to $175. But $175 may be an entirely meaningless number. The shoes may never have been sold for $175 at any store; certainly no specifics are given. Even if they were, the fact that a large number of them ended up at Marshall’s would indicate that many customers didn’t consider $175 an acceptable price.

The same goes for the MSRP: it’s meaningless. Among other things, that’s not how pricing works these days for big retail outlets. The manufacturer doesn’t make a product and then tell the retailer how much to charge for it. Retailers hold much more power than manufacturers; generally, they pressure suppliers to meet their price and to constantly lower costs, putting the burden on the suppliers to figure out how to do so (often by reducing wages). The idea that manufacturers are able to tell Macy’s or Target or other big retailers how much to charge for their items is ridiculous. Rather, the retailer usually tells the manufacturer what MSRP to print on the tags of items they’ll be purchasing (I saw some tags at Marshall’s that said MSRP but had no price printed on them).

So what’s the point of a MSRP on a price tag, or a “compare at” number? These numbers serve as “anchor” prices — that is, they set a high “starting” point for the product, so the “sale” price seems like a great deal in comparison. Except the “sale” price isn’t actually a discount at all — it’s only a sale price in comparison to this fictional original price that was developed for the sole purpose of making you think “Holy crap! I can get $175 shoes for just $49!”

The point is to redirect your thinking from “Do I think these shoes are worth $49?” to “I can save $126!” This is a powerful psychological motivator; marketing research shows that people are fairly easily swayed by perceived savings. A sweater we might not think is worth $40 if we saw it at Banana Republic suddenly becomes worth $50 if we see it at Marshall’s (or T.J. Maxx, an outlet mall, Ross, etc.) and are told it used to sell for $80. We focus not on the fact that we’re spending $50, but on the fact that we’re saving $30.
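To make the two framings concrete, here is a minimal sketch in Python, using the $49/$175 shoe prices from above (the function names are just mine for illustration, not anything from Shell’s book):

```python
# Toy illustration of the reframing described above, using the shoe
# prices from the post ($49 sale price, $175 "compare at" anchor).
# Nothing here is real retail data.

def spending_frame(sale_price: int) -> str:
    """The question the shopper would ask without an anchor."""
    return f"Do I think these shoes are worth ${sale_price}?"

def savings_frame(anchor_price: int, sale_price: int) -> str:
    """The question the 'compare at' anchor invites instead."""
    return f"I can save ${anchor_price - sale_price}!"

print(spending_frame(49))      # Do I think these shoes are worth $49?
print(savings_frame(175, 49))  # I can save $126!
```

Note that the “savings” figure is computed entirely from the anchor, which may never have been a real selling price.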

And that makes us feel smart: we’ve beaten the system! Instead of going to the mall and paying $368 for that purse, we hunted through the discount retailer and found it for $148! We worked for it, and we were smart enough not to get conned into buying it at the inflated price. Shell describes research showing that, in these situations, we feel like we didn’t just save that money, we actually earned it by going to the effort of searching out deals. When we buy that $148 purse, we’re likely to leave feeling like we’re somehow $220 richer (since we didn’t pay $368) rather than $148 poorer. And we’ll value it more highly because we feel like we were smart to find it; that is, we’re likely to think a $148 purse bought on “sale” is cooler and better quality than we would an identical purse bought at full price for $120.

And stores capitalize on these psychological tendencies by giving us cues that seem to indicate we’re getting an amazing deal. Sometimes we are. But often we’re being distracted with numbers that seem to give us meaningful information but are largely irrelevant, if not entirely fictional.

Originally posted in 2009.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

Flashback Friday.

Behold, the taken-for-granted, unexceptional:

[Image: a shopping cart. Cropped version of image by XoMEox, Flickr Creative Commons]

Until last week I had never truly thought about shopping carts. I mean, I occasionally notice one stranded in an unexpected place, and as a kid I loved the occasional chance I had to push one a bit and then jump on and race down an aisle. But I read Cheap: The High Cost of Discount Culture by Ellen Ruppel Shell, and it turns out that the story of the shopping cart is fascinating!

Way back in the day, stores weren’t like they are today. You went in and there was a long counter and you had the clerk show you the wares. If you’ve read some Jane Austen or Laura Ingalls Wilder, you’ve undoubtedly come across a scene where a clerk is showing someone bolts of cloth. That’s how things worked: almost everything was behind the counter; you told the clerk what you were interested in and they showed you your options. You haggled over the price, decided on a nice gingham, the clerk wrapped it for you, and off you went. Most retail outlets worked more or less along these lines (think of a butcher, for instance).

But if you were a shop owner interested in keeping prices down, this situation might be less than ideal. It required a lot of clerks, and experienced ones at that: clerks who knew all the goods and could be trusted to set an acceptably profitable price for them.

Eventually retailers, including F.W. Woolworth, tried putting more products out on display in the store so customers could help themselves. Some customers liked the ability to pick items off the shelves directly, but more importantly, you didn’t need as many clerks, and certainly not such highly paid ones, if their job was mostly reduced to ringing up the purchases at the register.

Of course, this presents a new problem: how are customers going to carry all their purchases around the store while they make their selections? Well, a basket they could carry over an arm would work. But these baskets had a downside: they didn’t hold much and they quickly got heavy.

As Shell notes, in 1937 a man from my home state of Oklahoma, Sylvan Goldman, came up with a solution. He owned the Humpty-Dumpty grocery store chain (I still remember Humpty-Dumpty!). He and a mechanic he hired came up with a cart on which two shopping baskets could be suspended. And thus the shopping cart — or, as Goldman named it, the “folding basket carrier” — was born. As Goldman suspected, people bought more when they didn’t have to carry a heavy basket on their arm. The folding basket carrier was advertised as a solution to the burden of shopping:

[Advertisement for Goldman’s folding basket carrier]

The only problem was… people didn’t like the new contraptions. As Goldman recalled in a 1977 interview:

I went into our largest store, there wasn’t a soul using a basket carrier, and we had an attractive girl by the entrance that had a basket carrier and two baskets in it, one on the top and one on the bottom, and asked them to please take this cart to do your shopping with. And the housewives, most of them decided, “No more carts for me. I have been pushing enough baby carriages. I don’t want to push anymore.” And the men would say, “You mean with my big strong arms I can’t carry a darn little basket like that?” And he wouldn’t touch it. It was a complete flop.

Goldman eventually had to hire attractive models to walk around the store pushing the carts to make shopping carts seem like an acceptable or even fashionable item to use.

Over time the basic design was changed to have a single basket, with a flat shelf on the bottom for large items. The carts could also then “nest” inside each other (instead of being folded up individually), reducing the amount of space they required for storage.

The Baby Boom ushered in the final major design change, a seat for kids.

Notice how small these early carts were compared to what we’re used to today. I remember as a kid going to the local grocery store, and the carts were quite small; eventually a big warehouse-type grocery store came to the nearest city and its carts seemed gigantic in comparison. Because obviously, if people buy more when they have a cart instead of an arm-carried basket, they’ll buy even more when they have a bigger cart — not just because there’s more room, but because it seems like less stuff if it’s in a bigger cart. Restaurants discovered the same principle — people will want bigger portions if you give them bigger plates because it visually looks like less food and so they don’t feel like they’re over-eating.

Without enormous carts, Big Box discounters and wholesale club stores couldn’t exist. You can’t carry a box of 50 packages of Ramen noodles, 36 rolls of toilet paper, a box of 3 gallons of milk, enough soup for the entire winter, and a DVD player you just found on sale around the store without a huge cart.

So there you have it: labor de-skilling + marketing – stigma of feminine association + Baby Boom + profits based on increased purchasing of ever-cheaper stuff = the modern shopping cart!

I love it when I learn totally new stuff.

Originally posted in 2009.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

Flashback Friday.

Having a criminal record negatively affects the likelihood of being considered for a job. Devah Pager conducted a matched-pair experiment in which she had male testers apply for the same entry-level jobs advertised in Milwaukee newspapers. She gave the testers fake credentials that made them equivalent in terms of education, job experience, and so on. Half were Black and half White.

One tester from each pair was instructed to indicate that they had a past non-violent drug possession offense. Pager then collected data on how many of the applicants were called back for an interview after submitting their fake applications.
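For readers curious about the mechanics, here is a minimal sketch of how callback rates in a matched-pair audit like this are tabulated. The counts below are invented purely for illustration (they are not Pager’s data); the group labels are just the two traits her design varied.

```python
# Hypothetical tallies for a matched-pair audit study: testers with
# equivalent fake credentials apply to the same set of jobs, varying
# only race and whether a criminal record is disclosed.
# These counts are invented for illustration; they are NOT Pager's data.

applications_per_group = 150

callbacks = {
    ("White", "no record"): 51,
    ("White", "record"):    26,
    ("Black", "no record"): 21,
    ("Black", "record"):     8,
}

for (race, record), n in callbacks.items():
    rate = n / applications_per_group
    print(f"{race:5} | {record:9} | callback rate: {rate:.0%}")
```

Pager’s actual results are summarized next.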

The results indicate that getting a job with a criminal record is difficult. Having even a non-violent drug offense had a significant impact on rates of callbacks:

[Chart: percentage of applicants called back for an interview, by race and criminal record (Pager)]

What was surprising was that race actually turned out to be more significant than a criminal background. Notice that employers were more likely to call Whites with a criminal record (17% were offered an interview) than Blacks without a criminal record (14%). And while having a criminal background hurt all applicants’ chances of getting an interview, African Americans with a non-violent offense faced particularly dismal employment prospects. Imagine if the fake criminal offense had been for a property or violent crime.

In addition, according to Pager, employers seemed to expect that Black applicants might have a criminal record:

When people think of Black men they think of a criminal. It affects the way Black men are treated in the labor market. In fact, Black testers in our study were likely to be asked up front if they have a criminal record, while whites were rarely asked…

African American men face a double barrier: higher rates of incarceration and racial discrimination.

Originally posted in 2009.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.