When it comes to rule-breakers and rule enforcers, which side you are on seems to depend on the rule-breaker and the rule. National Review had a predictable response to the video of a school officer throwing a seated girl to the floor. [Editor’s note: Video added to original. Watch with caution; disturbing imagery]

Most of the response when the video went viral was revulsion. But not at National Review. David French said it clearly:

I keep coming to the same conclusion: This is what happens when a person resists a lawful order from a police officer to move.

The arrested student at Spring Valley High School should have left her seat when her teacher demanded that she leave. She should have left when the administrator made the same demand. She should have left when Fields made his first, polite requests. She had no right to stay. She had no right to end classroom instruction with her defiance. Fields was right to move her, and he did so without hurting her. The fact that the incident didn’t look good on camera doesn’t make his actions wrong.

This has been the general response on the right to nearly all the recently publicized incidents of police use of force. If law enforcement tells you to do something and you don’t do it, it’s OK for the officer to use force, and if you get hurt or killed, it’s your fault for not complying, even if you haven’t committed an offense.

That’s the general response. There are exceptions, notably Cliven Bundy. In case you’d forgotten, Bundy is the Nevada cattle rancher who was basically stealing – using federal lands for grazing his cattle and refusing to pay the fees. He’d been stiffing the United States this way for many years. When the Federales finally moved in and rounded up his cattle, a group of his well-armed supporters challenged the feds. Rather than do what law enforcers in other publicized accounts do when challenged by someone with a gun – shoot to kill – the Federal rangers negotiated.

Bundy was clearly breaking the law. Legally, as even his supporters acknowledged, he didn’t have a leg to stand on. So the view from the right must have been that he should do what law enforcement said. But no.

Here is National Review’s Kevin Williamson:

This is best understood not as a legal proceeding but as an act of civil disobedience… As a legal question Mr. Bundy is legless. But that is largely beside the point.

What happened to “This is what happens when a person resists a lawful order”? The law is now “beside the point.” To Williamson, Bundy is a “dissident,” one in the tradition of Gandhi, Thoreau, and fugitive slaves.

Not all dissidents are content to submit to what we, in the Age of Obama, still insist on quaintly calling “the rule of law.”

Every fugitive slave, and every one of the sainted men and women who harbored and enabled them, was a law-breaker, and who can blame them if none was content to submit to what passed for justice among the slavers?

(The equation with fugitive slaves became something of an embarrassment later when Bundy opined that those slaves were better off as slaves than are Black people today who get government subsidies. Needless to say, Bundy did not notice that the very thing he was demanding was a government handout – free grazing on government lands.)

The high school girl refused the teacher’s request that she give up her cell phone and then defied an order from the teacher and an administrator to leave the classroom.  Cliven Bundy’s supporters “threatened government employees and officials, pointed firearms at law enforcement officers, harassed the press, called in bomb scares to local businesses, set up roadblocks on public roads, and formed lists (complete with photos and home addresses) of their perceived enemies” (Forbes).

A Black schoolgirl thrown to the floor by a weightlifting cop twice her size — cop right, rule-breaker wrong. A rural White man with White male supporters threatening Federal law enforcers — cops wrong, rule-breakers right.

Originally posted at Montclair SocioBlog.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

Sometimes the sexy goes too far. These are some of those times.

Sexy pizza rat (Yandy):


Sexy Cecil the Lion (Yandy):


Sexy Donald Trump (Yandy):


Sexy Rosie the Riveter (Party City):


Sexy Frankenstein (Yandy):


Sexy infant (Yandy):

Sexy Charlie Brown (Yandy):


For more Sexy What!?, see our past posts featuring Sexy Chinese Take-Out, Sexy Yoda, and Sexy Chuckie.

Lisa Wade is a professor at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. Find her on Twitter, Facebook, and Instagram.

Flashback Friday.


The image above is a photograph of a snowflake taken in the late 1800s by Wilson Bentley. Bentley, a 19-year-old farmer in Vermont, was the first person to ever photograph snowflakes. From the Guardian:

Bentley’s obsession with snow crystals began when he received a microscope for his 15th birthday. He became spellbound by their beauty, complexity and endless variety.

“Under the microscope, I found that snowflakes were miracles of beauty; and it seemed a shame that this beauty should not be seen and appreciated by others. Every crystal was a masterpiece of design and no one design was ever repeated. When a snowflake melted, that design was forever lost. Just that much beauty was gone, without leaving any record behind,” he said.

Bentley started trying to draw the flakes but the snow melted before he could finish. His parents eventually bought him a camera and he spent two years trying to capture images of the tiny, fleeting crystals.

He caught falling snowflakes by standing in the doorway with a wooden tray as snowstorms passed over. The tray was painted black so he could see the crystals and transfer them delicately onto a glass slide.

To study the snow crystals, Bentley rigged his bellows camera up to the microscope but found he could not reach the controls to bring them into focus. He overcame the problem through the imaginative use of wheels and cord.

Bentley took his first successful photomicrograph of a snow crystal at the age of 19 and went on to capture more than 5,000 more images.

What struck me about this story, other than the pretty pictures and neat historical trivia, was the fact that nearly every schoolchild in the Western world knows what a snowflake looks like under a microscope, even as their own experience of snowflakes is mostly of cold, fuzzy, frozen blobs, if they have any regular experience with snow at all. They know because we teach them.

The idea of the meme is one way to discuss our ability to transfer elusive knowledge like this. A meme is a unit of knowledge or a type of behavior that’s passed on from generation to generation culturally. The gene is its evolutionary cousin, passing along knowledge and behavior genetically. In the US, this particular knowledge meme is found in books or scientific discussions, but it has also become a common arts and crafts project: many of us learn about snowflakes when we are shown how to make them from construction paper:

It’s quite amazing to consider how every human generation since Bentley understands the snowflake just a little bit differently than anyone before him. Because of the advantage that human culture gives each new generation, nearly every child learns to appreciate their beauty.


See a slide show of his photographs at The Telegraph. This post originally appeared in 2010.

Lisa Wade is a professor at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. Find her on Twitter, Facebook, and Instagram.

Who believes that the climate is changing? Researchers at Yale’s Project on Climate Change Communication asked 13,000 people and found some pretty interesting stuff. First, there was a great deal of disagreement; they identified six types:

  • The Alarmed (18%) – believe climate change is happening, have already changed their behavior, and are ready to get out there and try to save the world
  • The Concerned (33%) – believe it’s happening, but think it’s far off or isn’t going to affect them personally
  • The Cautious (19%) – aren’t sure if it’s happening or not and are also unsure whether it’s human caused
  • The Disengaged (12%) –  have heard the phrase “climate change,” but couldn’t tell you the first thing about it
  • The Doubtful (11%) – are skeptical that it’s happening and, if it is, they don’t think it’s a problem and don’t think it’s human caused
  • The Dismissive (7%) – do not believe in it, think it’s a hoax

As you might imagine, attitudes about climate change vary significantly by state and county. You can see all the data at their interactive map. Here are some of the findings I thought were interesting.

More Americans think that climate change is happening (left) than think it’s human caused (right); bluer = more skeptical, redder = more believing:


Even among people who say that they personally believe in climate change (left, same as above), there are many who think that there is no scientific consensus (right), suggesting that the campaign to misrepresent scientific opinion by covering “both sides” was successful:


People are somewhat worried about climate change (left), but very, very few think that it’s going to harm them personally (right):


Even though people are lukewarm on whether it’s happening, whether it’s human-caused, and whether it’s going to do any harm, there’s a lot of support for doing something about it. Support for regulating CO2 (left) and support for funding research on renewable energy (right):


Take a closer look yourself and explore more questions at the map or read more at the Scholars Strategy Network. And thanks to the people at Yale for funding and doing this important work.

Lisa Wade is a professor at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. Find her on Twitter, Facebook, and Instagram.

Prior to the 1850s, writes cultural studies scholar Matthew Brower, men in America didn’t hunt. More specifically, they didn’t hunt for leisure. There was a hunting industry that employed professionals who hunted as a full-time occupation, and there was a large market for wild animal products, but hunting for fun wasn’t a common pastime.

This changed in the second half of the 1800s. Americans were increasingly living in cities and being “citified.” Commentators worried that urban life was making men effeminate, effete, overly civilized, domesticated if you will. Cities were a threat to manliness, and nature the salve.

Hunting trophies, taxidermied remains of wild animals, served as symbolic proof of one’s “hardiness.” Unlike the animal parts bought at market — whether for food or furs, as feathers on hats, or the then-popular elk tooth watch chain — animals a man killed himself reflected on his skill and character.

As Theodore Roosevelt once put it:

Nothing adds more to a hall or a room than fine antlers when their owner has been shot by the hunter-displayer, but always there is an element of the absurd in a room furnished with trophies of the chase that the displayer has acquired by purchase.

New, elite recreational hunters castigated both lesser men, who purchased animal parts for display, and women who bought them purely for fashion.

This was the origin of the idea that hunting is a contest, as opposed to an occupation or necessity. To paraphrase Brower, a trophy can’t be bought; it must be earned. Hence the notion of “sportsmanship” as applied to the hunt. If a kill is going to indicate skill, then the hunted must have a “sporting chance.” So recreational hunters developed an etiquette for sportsmanlike hunting, spread through new hunting magazines and periodicals.

Not only did this allow men to claim manly cred, it allowed wealthy men to claim class cred. Brower writes:

Both subsistence and market hunters, the majority of hunters, were placed outside the purview of the sportsman’s code. Those who hunted out of necessity or for profit never could obtain the aesthetic detachment necessary to be considered sportsmen.

In fact, wealthy recreational hunters claimed that only they were “real hunters” and even organized against people who hunted for food and money. For example,

[Roosevelt himself] blamed the decline of game on market hunters, who, he argued, had “no excuse of any kind” for the wanton slaughter of animals.

Trophy hunters successfully enacted statutes limiting other types of hunting, so as to preserve game for themselves.

The rarer and larger the animal, the more exquisite the specimen, and the more a man has killed, the better the animals speak to his manliness and his elite economic and social class. This is perhaps the attraction of international trophy hunting today: the seeking of more exotic and elusive game to bring home and display. And it is perhaps why some people pay $50,000 to travel across the world, kill a lion, cut off its head, then post it on Facebook.

Photo from Wikimedia Commons.

Lisa Wade is a professor at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. Find her on Twitter, Facebook, and Instagram.

Flashback Friday.

The AP has an interesting website about wildfires from 2002 to 2006. Each year, most wildfires occurred west of the Continental Divide:

Many of these areas are forested. Others are desert or shortgrass prairie:

There are a lot of reasons for wildfires – climate and ecology, periodic droughts, humans. The U.S. Fish and Wildlife Service reports that in the Havasu National Wildlife Refuge, the “vast majority” of wildfires are due to human activity. Many scientists expect climate change to increase wildfires.

Many wildfires affect land managed by the Bureau of Land Management. For most of the 1900s, the BLM had a policy of total fire suppression to protect valuable timber and private property.

Occasional burns were part of forest ecology. Fires came through, burning forest litter relatively quickly, then moving on or dying out. Healthy taller trees were generally unaffected; their branches were often out of the reach of flames and bark provided protection. Usually the fire moved on before trees had ignited. And some types of seeds required exposure to a fire to sprout.

Complete fire suppression allowed leaves, pine needles, brush, fallen branches, etc., to build up. Wildfires then became more intense and destructive: they were hotter, flames reached higher, and thicker layers of forest litter meant the fire lingered longer.

As a result, an uncontrolled wildfire was often more destructive. Trees were more likely to burn or to smolder and reignite a fire several days later. Hotter fires with higher flames are more dangerous to fight, and can also more easily jump naturally-occurring or artificial firebreaks. They may burn a larger area than they would otherwise, and thus do more of the damage that total fire suppression policies were supposed to prevent.

In the last few decades the BLM has recognized the importance of occasional fires in forest ecology. Fires are no longer seen as inherently bad. In some areas “controlled burns” are set to burn up some of the dry underbrush and mimic the effects of naturally-occurring fires.

But it’s not easy to undo decades of fire suppression. A controlled burn sometimes turns out to be hard to control, especially with such a buildup of forest litter. Property owners often oppose controlled burns because they fear the possibility of one getting out of hand. So the policy of fire suppression has in many ways backed forest managers into a corner: it led to changes in forests that make it difficult to change course now, even though doing so might reduce the destructive effects of wildfires when they do occur.

Given this, I’m always interested when wildfires are described as “natural disasters.” What makes something a natural disaster? The term implies a destructive situation that is not human-caused but rather emerges from “the environment.” As the case of wildfires shows, the situation is often more complex than this, because what appear to be “natural” processes are often affected by humans… and because we are, of course, part of the environment, despite the tendency to think of human societies and “nature” as separate entities.

Originally posted in 2010.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

Flashback Friday.

Back when I was in high school and college, I learned that one of the major things that separated humans from other species was culture. The ability to develop distinct ways of living that include an understanding of symbols, language, and customs unique to the group was a specifically human trait.

And, ok, so it turned out that other species had more complex communication systems than we thought they did, but still, other animals were assumed to behave according to instinct, not community-specific cultures.

But as with so many things humans have been convinced we alone possess, it’s turning out that other species have cultures, too. One of the clearest examples is the division of orcas into two groups with distinct customs and eating habits; one eats mammals while the other is pescetarian, eating only fish. Though the two groups regularly come in contact with each other in the wild, they do not choose to intermingle or mate with one another. Here’s a video:


Aside from the obvious implications for our understanding of culture, this brings up an issue in terms of conservation. Take the case of orcas. Some are suggesting that they should be on the endangered species list because the population has declined. What do we do if it turns out at some point that, while the overall orca population is not fully endangered, one of the distinct orca cultural groups is? Is it enough that killer whales still exist, or do we need to think of the cultures separately and try to preserve sufficient numbers of each? In addition to being culturally different, they are functionally non-interchangeable: each group has a different effect on food chains and ecosystems.

Should conservation efforts address not just keeping the overall population alive and functioning, but ensure that the range of cultural diversity within a species is protected? If this situation occurred, should we declare one orca culture as endangered but not the other? Are both ecological niches important?

I love these questions. If we recognize that creatures can have cultures, it challenges our sense of self, but also brings significantly more complexity to the idea of wildlife preservation.

Originally posted in 2010.

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.

Flashback Friday.

Beautiful:




In the classic book, Purity and Danger (1966), Mary Douglas points to the social construction of dirt. She writes:

There is no such thing as absolute dirt: it exists in the eye of the beholder.

If dirt and dirtiness are socially constructed, what do the things we identify as dirt, filth, rubbish, and refuse have in common?

Douglas suggests that dirt is really a matter of disorganization. Literally, a thing becomes dirt or garbage when it is out of place. “Dirt,” she writes, “offends against order.”

Eliminating it is not a negative movement, but a positive effort to organise the environment.

I chose the images above to try to illustrate this idea. Hair in the drain, like dirt on our hands, is out of place. It doesn’t belong there. In both cases, our reaction is disgust. Hair on the head, in contrast, is beautiful and becoming, while dirt outside is life-giving soil and part of the beauty of nature.

Images royalty free from Getty. Originally posted in 2009.

Lisa Wade is a professor at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. Find her on Twitter, Facebook, and Instagram.