
Squee sent in this video on the complexities of the placebo effect. We most often hear about the placebo effect in terms of medicine (the famed “sugar pill” that makes people feel better despite having no known effect on a condition), but as the video points out, we use placebos in other aspects of social life as well, such as buttons at intersections that don’t affect the timing of the “walk” signal but make pedestrians feel better about their wait anyway. And since the placebo effect is based in part on cultural assumptions about what should make us feel better (e.g., an expensive drug must be better than a discounted one, right?), it is not surprising that the effectiveness of specific placebos varies cross-culturally.

Fun!

In a comments thread, shorelines linked to a fascinating Scientific American article about adolescence by psychologist Robert Epstein. In it, he points to the invention of the very idea of adolescence and its non-universality. In a sample of 186 pre-industrial societies, for example, only 60% had a word for this life stage, and most reported few or no problems with antisocial teen behavior. These findings, however, contrast strongly with new research suggesting that adolescent brains are quite different from adult brains.

How do we make sense of this?

Epstein suggests that differences in brain structure may be the result of social realities, not their cause. He writes:

I have not been able to find even a single study that establishes a causal relation between the properties of the brain being examined and the problems we see in teens… [Meanwhile, c]onsiderable research shows that a person’s emotions and behavior continuously change brain anatomy and physiology… So if teens are in turmoil, we will necessarily find some corresponding chemical, electrical or anatomical properties in the brain. But did the brain cause the turmoil, or did the turmoil alter the brain? Or did some other factors—such as the way our culture treats its teens—cause both the turmoil and the corresponding brain properties?

By “the way our culture treats its teens,” Epstein is referring to the possibility that we infantilize and criminalize them. He includes a figure illustrating how we’ve increasingly targeted teens with laws:

Teens are subject to, Epstein explains, “…more than 10 times as many restrictions as are mainstream adults, twice as many restrictions as active-duty U.S. Marines, and even twice as many restrictions as incarcerated felons.”

Believing them to be different from adults, we then segregate them:

Today, with teens trapped in the frivolous world of peer culture, they learn virtually everything they know from one another rather than from the people they are about to become. Isolated from adults and wrongly treated like children, it is no wonder that some teens behave, by adult standards, recklessly or irresponsibly.

Epstein has no more data showing that the way we treat teens, and the way they learn to behave, changes their brain anatomy and physiology than he does showing the reverse. But the former certainly has substantial neurological precedent. Meanwhile, the latter is comforting to a society awash in out-of-control adolescents: “What is there to do? It’s only natural.” Right?

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

In and around the apartment complex where we live in Nanning, China, there are no fewer than five coffee shops. They are part of what makes our neighborhood feel so much, well, like a neighborhood. Several of them have free wi-fi and they all serve finely crafted, good-quality kafei (coffee).

But my wife and I have learned the hard way that if we want coffee in the morning, we either have to go to McDonald’s or make it ourselves. While your average US Starbucks employee arrives at work before the sun peeks over the horizon, baristas in China report to work around 10 am. And while some Starbucks locations and a rare few other coffee shops in the US stay open until ten or even midnight (at the latest), their Chinese counterparts stay open until two in the morning every night.

Needless to say, we found these business hours confounding and poked around to find out why anyone would want to drink such strong coffee (and, do not doubt, this coffee is stout!) so late at night.

As it turns out, China’s coffee history dates back to the early 19th century, but in all those years, coffee never “caught on.” And it is not really a mystery why. China’s tea culture has held a centuries-long monopoly on what the country drinks. Coffee? Well, it’s OK, if you like that sort of thing.

But if the Chinese have resisted coffee for 200 years, why are coffee shops now finding enough success that there is room for five in one small neighborhood? The answer is in the picture of my wife below. Chinese cafes are dimly lit, quiet, and “romantic” (or at least that is the goal of the decor) rendezvous points. A new high school couple might spend their xiuxi (afternoon rest, the Chinese version of a siesta) flirting with each other while sipping lattes. After a night out on the town, young couples flood the cafes, taking lots of pictures, drinking beer, and maybe having a couple of iced macchiatos.

The marketing scheme is actually quite impressive.  If you can’t win them over with quality, lovingly brewed, pristinely presented coffee, make the coffee shop a romantic oasis.  Draw them in with the promise of passion and mystery and win them over with brilliantly executed coffee.

The pictures are just a bonus.

Evan Schneider is a graduate of Princeton Theological Seminary. He teaches English at an education college in Nanning, China.  While in China he enjoys learning to cook Chinese food and discovering the differences between the way that Chinese and American people think about food.  He blogs about it at Cooking Chinese.

What are norms? How important is it that we follow them? And what happens when we break one?

Nathan Palmer, a lecturer at Georgia Southern University and founder of the blog Sociology Source, recruited his entire class of 262 students to go into the world and do nothing (an idea he borrowed from Karen Bettez Halnon).  It was sort of like a flash mob in which absolutely nothing happens.

Palmer’s aim was to reveal a norm (in this case, that we all must always be doing something), expose his students to the feelings one has when breaking a norm (even a consequence-free one like this), and show them the range of reactions that observers have to norm breaking. And he recorded the whole thing for us:

Read Palmer’s entire write-up of the experiment at Sociology Source.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

In an earlier post, I discussed growing trends of body modification as illustrative of the new cyborg body. Although it is debatable whether these trends are in fact “new” (after all, various indigenous cultures were practicing body modification long before European colonists began taking note of it in their travel diaries), I would like to continue this conversation by looking at one subculture of body modification: tattooing.

As an avid “tattoo collector” myself, I have spent the past few years attending tattoo conventions, hanging out with tattooers, and getting heavily tattooed, all while working on my research regarding the popularization of tattooing. What I notice are changing norms regarding appropriate use of the body as canvas. I would like to draw your attention to one particular trend that is growing in the tattoo subculture: facial tattoos.

What was once the province of convicted felons has become an increasingly normative way of expressing one’s commitment to the subculture. (For a case in point, simply Google “facial tattoos” and see what pops up.) What I notice from my interviews and discussions with tattooers and clients alike is a sharp divide between those who see the face as a legitimate space for artistic display and those who see the face as “off limits.” Traditionally, tattooers were wary of getting tattooed on “public skin” (e.g., face, hands, and neck), as employment in the industry was unpredictable and one never knew if she would need to find another job amongst the masses. Having tattoos on public skin was almost certain to prevent employment. But things may be changing.

As the tattoo industry has become something of a pop culture phenomenon, and many consumers have become visibly tattooed (full sleeves, bodysuits, and the like), tattooers have begun to see the face as a legitimate space for getting tattooed. Many of the artists I have spoken with now have prominent facial tattoos, and many of those who don’t plan on getting them (usually something small beside one or both eyes). For many, it is now the only way to differentiate themselves from tattoo collectors or other body modification enthusiasts who now sport full body suits, stretched earlobes, and other prominent modifications. Where having full sleeves once denoted one’s status as a tattoo artist – a professional in a tightly-knit and guarded community of craftsmen – now facial tattoos serve to display one’s commitment to the profession, lifestyle, and artform. As such, facial tattoos have become a new form of (sub-)cultural capital, as those who were on the “inside” of the subculture now find themselves defending their turf from an onslaught of newcomers wanting to jump on the bandwagon.

However, even among tattooers, there is resistance to this growing trend. Many of the traditional tattooers (those who were trained long ago or received a formal apprenticeship) that I have spent time with spoke strongly against facial tattoos. In fact, most traditional tattooers refuse to tattoo clients on public skin entirely. It is simply not a part of their habitus (to borrow Bourdieu’s terminology). This was certainly the case for the likes of Bert Grimm, Charlie Barrs, and Amund Dietzel, traditional tattooers of the 40s, 50s, and 60s, as well as many other artists who were trained prior to the Tattoo Renaissance of the 70s. The young man who walks into a tattoo shop and asks for his lover’s name emblazoned across his neck now has to find a “friend” in the industry who is willing to do it. Either that or he must risk receiving a poorly rendered tattoo from a “scratcher” (someone untrained and unaffiliated with a tattoo shop, who purchased their equipment online) who will agree to do the piece at a dramatically reduced rate at a tattoo party or out of somebody’s basement.

In conclusion, as tattooing has become more popular, tattooers and those whose lives revolve around the art of tattooing must create new forms of distinction to differentiate themselves from the masses. This goes back to Bourdieu’s notion of the social field, and how forms of distinction change after the entrance of new social actors with very different forms of capital and very different habitus. Although many Nu-Skool tattooers (those who recently joined the tattooing profession, often with art school training) see no problem with tattooing the face, hands, and neck, many traditional tattooers still see it as a questionable practice and refuse to do it themselves. However, with the pressure resulting from the increasing popularity of tattooing and the growing number of individuals with prominently tattooed faces, we may see more traditional tattooers willing to tattoo the face.

David Paul Strohecker is getting his PhD in Sociology at the University of Maryland. He studies cultural sociology, theory, and intersectionality. He is currently working on a larger project about the cultural history of the zombie in film.  This post originally appeared at Cyberology.

For more on Bourdieu, see our posts on The Evangelical Habitus, Dumb vs. Smart Books, and The Hipster and the Authenticity of Taste.

Last night I was cold. So cold, in fact, that I had to pull out not one, but two, of my Pendleton blankets to add some extra warmth to my bed. As I shook them out and laid them on my bed, I thought about how special these blankets are to me–one was a graduation gift, the other a thank-you gift for serving on a panel about the “Future of Indian Education.” In many Native communities, Pendleton blankets are associated with important events, and have been for well over a century. They are given as gifts at graduations, at powwow give-aways, as thank-you gifts, in commemoration of births and deaths, you name it. In addition, I’ve always associated the patterns with Native pride — a way for Natives to showcase their heritage in their home decor, coats, purses, etc. There’s something just distinctly Native about Pendleton to me.

Stanford Native Graduation from a couple of years ago:

But recently, Pendleton prints and fabrics have started popping up everywhere. It started with Opening Ceremony’s Pendleton line in 2010, and now Urban Outfitters has started carrying a Pendleton line, celebrities are wearing Pendleton coats, and Native-themed home decor is apparently all the rage. Now Pendleton has announced its newest collaboration, The Portland Collection, which fashion blogs are proclaiming will be the big thing for 2011.

So what’s the problem? I openly admit that a lot of these designs are adorable, and I would fully sport them (that bag! I love!), if I had a spare $1000 or so. I can’t cry straight up cultural appropriation, because…well, it’s complicated.

Pendleton has been supplying Natives with blankets and robes featuring Indian designs since the late 1800s, as the “history” section of its website outlines:

A study of the color and design preferences of local and Southwest Native Americans resulted in vivid colors and intricate patterns. Trade expanded from the Nez Perce nation near Pendleton to the Navajo, Hopi and Zuni nations. These Pendleton blankets were used as basic wearing apparel and as a standard of value for trading and credit among Native Americans. The blankets also became prized for ceremonial use.

It’s almost a symbiotic relationship — they saw a market in Native communities, and Native communities stepped up and bought, traded, and sold the blankets, incorporating them into “traditional” cultural activities. Pendleton has also maintained close ties with Native communities and causes, making commemorative blankets for organizations like the National Museum of the American Indian and the National Indian Education Association. They work with Native artists to design the special edition blankets, and even donate some of the proceeds to the causes.

(NIEA 40th anniversary blanket)

But then, on the other hand, they go off and do things like design a $5000 blanket with White Buffalo hair, which many tribes consider extremely sacred and definitely off-limits to commercial sale.

I do appreciate Pendleton’s relationship with Native communities. I love my blankets, and love even more what they represent.

However, seeing hipsters march down the street in Pendleton clothes, seeing these bloggers ooh and ahh over how “cute” these designs are, and seeing non-Native models all wrapped up in Pendleton blankets makes me upset. It’s a complicated feeling, because I feel ownership over these designs as a Native person, but on a rational level I realize that they aren’t necessarily ours to claim. To me, it just feels like one more thing non-Natives can take from us — like our land, our moccasins, our headdresses, our beading, our religions, our names, our cultures weren’t enough? you gotta go and take Pendleton designs too?

Then there’s the whole economic stratification issue, too: these designs are expensive. The new Portland collection ranges from $48 for a tie to over $700 for a coat, and the Opening Ceremony collection was equally, if not more, costly. It almost feels like rubbing salt in the wound, when poverty is rampant in many Native communities, to say “oh, we designed this collection based on your culture, but you can’t even afford it!”

So I don’t know. Are all of these designs cultural appropriation? Should I ignore the twinge in my stomach every time I see a Pendleton pattern in the Urban Outfitters window? Should I embrace it as the mainstream fashion scene finally catching up with what we Natives have known since the 1800s?

Personally, the bottom line is that I would rather associate Pendleton with Native pride and commemorating important events…
(our panel last year)

…than with hipsters, high fashion, and flash-in-the-pan trends. But I’m obviously conflicted. What do you think? Are these designs and trends ok, or do I have a right to be upset?

(Thanks to Precious for getting me thinking about this!)
Adrienne K. is a member of the Cherokee Nation of Oklahoma and a graduate student in Boston, where she studies access to higher education for Native students. In her free time, she blogs about cultural appropriation and use of Indigenous cultures, traditions, languages, and images in popular culture, advertising, and everyday life at Native Appropriations.

If you are alive these days, and not already part of the undead masses yourself, you have probably noticed a staggering increase in zombie references in film, television, pop culture, videogames, and on the internet.

For instance, the big screen has hosted a plethora of zombie films, e.g., 28 Days Later (2002), Shaun of the Dead (2004), and I Am Legend (2007), while on television we have seen the recent success of AMC’s The Walking Dead. And if you are on a college campus, you have probably seen undergraduates playing “Humans vs. Zombies,” a game of tag in which “human” players must defend against the horde of “zombie” players by “stunning” them with Nerf weapons and tube socks. In videogames, we have seen the success of the Resident Evil franchise, Left 4 Dead, and Dead Rising. Finally, the internet is awash with zombie culture, from viral videos of penitentiary inmates dancing to Michael Jackson’s “Thriller” to post-apocalyptic zombie societies, fansites, and blogs.

But what is the zombie and where does it come from?

What sets the zombie apart from other movie monsters is its place of origin. Whereas Frankenstein, Dracula, and the Wolfman all have ties to the Gothic literary tradition, the zombie stands apart in having a relatively recent (and proximal) origin. Theorists of zombie culture (such as Kyle Bishop or Jamie Russell) attribute the origin of the zombie to Haitian folklore and the hybrid religion of voodoo. But the zombie didn’t make its way into American culture until the 1920s and 30s, when sensationalist travel narratives were popular with Western readers. Specifically, W.B. Seabrook’s book The Magic Island is often credited as the first popular text to describe the Haitian zombie. Additionally, the work of Zora Neale Hurston (specifically her 1937 book Tell My Horse) explores the folklore surrounding the zombie in Haitian mythology.

(Still from I Walked with a Zombie, 1943)

With the development of the motion picture, the zombie became a staple of horror and a popular movie monster. The zombies of White Zombie (1932), Revolt of the Zombies (1936), King of the Zombies (1941), and I Walked with a Zombie (1943), however, were not the cannibalistic creatures we now know. These zombies were people put under a spell, the spell of voodoo and mystical tradition. In these films, the true terror is not being killed by zombies, but becoming a zombie oneself.

Bela Lugosi as ‘Murder’ Legendre, the mad scientist, with his zombie slave:


What all these films have in common is their depiction of voodoo, and Haitian culture more generally, as dangerous, menacing, and superstitious. Those who study colonial history note that the messages contained in these films present stereotyped versions of Haitian culture aimed largely at satisfying a predominantly white audience. Many of these films also feature an all-white cast, with several members in blackface serving as comic relief between the more “serious” scenes.

It’s interesting to see how the zombie has morphed into the cannibalistic creature we now know. While the original zombie is a powerful metaphor for fears of the non-white Other and of reverse colonization, the contemporary zombie largely reflects fears of the loss of individuality, the excesses of consumer capitalism, environmental degradation, the excesses of science and technology, and global terrorism (especially in renditions of the zombie after 9/11).

For instance, George A. Romero’s famous Night of the Living Dead (1968), the first film to feature the flesh-eating zombie, is often read as a not-so-subtle allegory of the Civil Rights Era and the militant violence perpetrated by Southern states against Black protestors, as well as a critique of the Vietnam War. Romero himself has stated that he wanted to draw attention to the war through the images of violence contained in the film.

Cannibal zombies in Night of the Living Dead (1968):

Similarly, the Italian zombie horror film Let Sleeping Corpses Lie (1974) reflects fears of environmental degradation and pollution. In this film, the zombie epidemic is caused by an experimental pest-control machine, which sends radio waves into the ground. Although it solves the local pest problem for farmers, it also reanimates the dead in a nearby cemetery.

Zombie consumers in Romero’s second zombie flick Dawn of the Dead (1978):

Later zombie films use the monster to symbolize the excesses of capitalism and militarism. In 28 Weeks Later (2007), for example, we see the decay of social structures across the globe, as the institutions that are supposed to protect us inevitably fail to do their jobs. In this scene, the protagonists attempt to escape the city just before the military firebombs it:

As we can see, the zombie has a unique cultural history and serves as a powerful metaphor for social anxieties. This movie monster might have come out of the Caribbean, but it became a powerful representation of modern fears when it met the silver screen. Perhaps the ongoing failure of global social structures (global terrorism, environmental catastrophes, and the current economic downturn) has prompted the most recent “Zombie Renaissance.” Or maybe we are just gluttons for the “everyman” tales contained in each rendition of the zombie apocalypse, a point made by SocProf several months back. I do not know what the future holds, but one thing is certain: the zombie will continue to haunt us from beyond the grave.

David Paul Strohecker is getting his PhD in Sociology at the University of Maryland. He studies cultural sociology, theory, and intersectionality. He is currently working on a larger project about the cultural history of the zombie in film.


This 1942 ad for Lifebuoy soap is a great example of shifts in collective cultural awareness of homosexuality. From a contemporary U.S. perspective, where most of us have heard homophobic jokes about not dropping the soap in the shower, two men showering together (even or especially in a military context) and using language like “hard” and “get yourself in a lather” reads undeniably as a humorous reference to gay men.

I think, however, that this was not at all the intention in 1942, when the possibility of men’s sexual attraction to other men wasn’t so prominent a cultural trope. It simply wasn’t on people’s minds the way it is today. Accordingly, the ad seems to be a simple illustrated recommendation, complete with a nice heterosexual prize at the end.

From Vintage Ads.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.