
The Washington Post has provided an image from the New England Journal of Medicine that illustrates changing causes of death. Comparing the top 10 causes of death in 1900 and 2010 (using data from the Centers for Disease Control and Prevention), we see first that mortality rates have dropped significantly, with deaths from the top 10 causes combined falling from about 1,100 per 100,000 to about 600 per 100,000:

And not surprisingly, what we die from has changed, with infectious diseases decreasing and being replaced by so-called lifestyle diseases. Tuberculosis, a scourge in 1900, is no longer a major concern for most people in the U.S. Pneumonia and the flu are still around, but much less deadly than they used to be. On the other hand, heart disease has increased quite a bit, though not nearly as much as cancer.

The NEJM has an interactive graph that lets you look at overall death rates for every decade since 1900, as well as isolate one or more causes. For instance, here’s a graph of mortality rates for pneumonia and influenza, showing the general decline over time but also the major spike in deaths caused by the 1918 influenza epidemic:

The graphs accompany an article looking at the causes of death described in the pages of NEJM since its founding in 1812; the overview highlights the social context of the medical profession. In 1812, doctors had to consider the implications of a near-miss by a cannonball, teething could apparently kill you, and doctors were concerned with a range of fevers, from bilious to putrid. By 1912, the medical community was explaining disease in terms of microbes, the population had gotten healthier, and an editorial looked forward to a glorious future:

Perhaps in 1993, when all the preventable diseases have been eradicated, when the nature and cure of cancer have been discovered, and when eugenics has superseded evolution in the elimination of the unfit, our successors will look back at these pages with an even greater measure of superiority.

As the article explains, the field of medicine is inextricably connected to larger social processes, which both influence medical practice and can be reinforced by definitions of health and disease:

Disease definitions structure the practice of health care, its reimbursement systems, and our debates about health policies and priorities. These political and economic stakes explain the fierce debates that erupt over the definition of such conditions as chronic fatigue syndrome and Gulf War syndrome. Disease is a deeply social process. Its distribution lays bare society’s structures of wealth and power, and the responses it elicits illuminate strongly held values.

Earlier this month NPR profiled Alex Hernandez, a member of a third gender recognized in Mexico.  This prompted me to re-post our discussion of muxes from 2008.  Images of Hernandez, taken by photographer Neil Rivas, are added at the end.

A New York Times article this week briefly profiles muxes, a third “gender” widely accepted in Oaxaca, Mexico.  According to the article, this part of Mexico has retained many of its pre-colonial traditions, one of which is flexibility around gender and sexual orientation.  From the article:

There, in the indigenous communities around the town of Juchitán, the world is not divided simply into gay and straight. The local Zapotec people have made room for a third category, which they call “muxes” (pronounced MOO-shays) — men who consider themselves women and live in a socially sanctioned netherworld between the two genders.

“Muxe” is a Zapotec word derived from the Spanish “mujer,” or woman; it is reserved for males who, from boyhood, have felt themselves drawn to living as a woman, anticipating roles set out for them by the community.

Not all muxes express their identities the same way. Some dress as women and take hormones to change their bodies. Others favor male clothes. What they share is that the community accepts them; many in it believe that muxes have special intellectual and artistic gifts.

Robin B. pointed us to a slide show at NPR.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

In the Sociology of Gender textbook, I spend a chapter discussing the idea of institutions.  I define the term as “persistent patterns of social interaction aimed at meeting the needs of a society that can’t easily be met by individuals alone.”  These needs include educating the next generation, providing health care, ensuring safety, and enabling efficient transportation.  These things are done better and more efficiently if we all chip in and put together a system.

What is interesting about institutions from a sociological perspective is that, once they’re in place, it is essentially impossible to opt out.  You can choose not to buy a car, for example, but the government is still going to spend your tax dollars on highway infrastructure.  You can amass as much medical knowledge and experience as you like, but you’ll still be a criminal if you practice medicine without a license.  You can believe the government is corrupt and stay home on voting day, but Congress is still going to pass legislation to which you will be held accountable.

You get the picture.

In any case, I thought of this when I came across the striking photography of Eric Valli.  Valli seems to specialize in capturing the lives of people living very close to the earth.  In one series, he follows a group of individuals who have decided to live “off the grid.”  That is, they’ve “unplugged” from the social institutions that sustain us.

The photographs reveal people who are committed to being off the grid. It’s no joke.  And, yet, as I scrolled through them, I couldn’t help but notice how many trappings of the rest of the world were part and parcel of their lives (canoes, coats, oil lamps, cooking and eating utensils, halters, firearms, hot sauce, etc.).

I’m not questioning, at all, whether or not these people are off the grid. They certainly appear to be.  But it is interesting to notice how much of the grid is still a part of their lives.


Cross-posted at Montclair SocioBlog.

If a person thinks that the media are infiltrating his mind and controlling his thoughts and behavior, we consider him a nutjob, and we recommend professional help and serious meds.  But if a person thinks that the media are infiltrating other people’s minds and affecting their behavior, we call him or her an astute social observer, one eminently qualified to give speeches or write op-eds.

The previous post dwelt on economist Isabel Sawhill’s Washington Post op-ed channeling Dan Quayle, particularly Quayle’s speech asserting that a TV sitcom exerted a strong influence on people’s decisions — not just decisions like Pepsi vs. Coke, but decisions like whether to have a baby.

That was Quayle, this is now.  Still, our current vice-president can sometimes resemble his counterpart of two decades ago.  Just last month, Joe Biden echoed the Quayle idea on the power of sitcoms.  On “Meet the Press,” in response to David Gregory’s question about gay marriage, Biden said that “this is evolving” and added:

And by the way, my measure, David, and I take a look at when things really begin to change, is when the social culture changes.  I think “Will and Grace” probably did more to educate the American public than almost anything anybody’s ever done so far.

“Will and Grace” ran for eight seasons, 1998-2006.  Its strongest years were 2001-2005, when it was the top-rated show among the 18-49 crowd. Biden could point to General Social Survey (GSS) data on the gay marriage question.  In 1988, ten years before “Will and Grace,” the GSS asked about gay marriage.  Only 12% supported it; 73% opposed it.  The question was asked again in 2004, six years into the W+G era.  Support had more than doubled, and it continued to rise in subsequent years.

We don’t know just when in that 18-year period, 1988-2004, things “really began to change.”  Fortunately, the GSS asked more regularly about respondents’ views on sexual relations between same-sex partners.  Here too, tolerance grows during the “Will and Grace” period (gray on the graph):

The graph is misleading, though. To see the error, all we need do is extend our sampling back a few years.  Here is the same graph starting in 1973:

The GSS shows attitudes about homosexuality starting to change in 1990.  By the time of the first episode of “Will and Grace,” the proportion seeing nothing wrong with homosexuality had already doubled.  Like Quayle’s “Murphy Brown” effect, the “Will and Grace” effect is hard to see.

The flaw in the Quayle-Biden method is not in mistaking TV for reality.  It’s in assuming that the public’s awareness is simultaneous with their own.

Why do our vice-presidents (and many other people) give so much credit (or blame) to a popular TV show for a change in public opinion?  The error is partly a matter of simplistic post hoc logic.   “Will and Grace” gave us TV’s first gay principal character; homosexuality became more acceptable.  Murphy Brown was TV’s first happily unwed mother, and in the following years, single motherhood increased.   Besides, we know that these shows are watched by millions of people each week.  So it must be the show that is causing the change.

Our vice-presidents (and many other people) may also have been projecting their own experiences onto the general public.  Maybe Murphy Brown was the first or only unwed mother that Dan Quayle really knew – or at least she was the one he knew best.  It’s possible that Joe Biden wasn’t familiar with any gay men, not in the way we feel we know TV characters.  A straight guy might have some gay acquaintances or co-workers, but it’s the fictional Will Truman whose private life he could see, if only for a half hour every week.

Does TV matter?  When we think about our own decisions, we are much more likely to focus on our experiences and on the pulls and pushes of family, work, and friends.  We generally don’t attribute much causal weight to the sitcoms we watch.  Why then are we so quick to see these shows as having a profound influence on other people’s behavior, especially behavior we don’t like?   Maybe because it’s such an easy game to play.  Is there more unwed motherhood?  Must be “Murphy Brown.”  Did obesity increase in the 1990s?  “Roseanne.”  Are twentysomethings and older delaying marriage?  “Seinfeld” and “Friends.” And of course “The Simpsons,” or at least Bart and Homer, who can be held responsible for a variety of social ills.

Cross-posted at Montclair SocioBlog.

I’m not sure what effect prime-time sitcoms have on the general public.  Very little, I suspect, but I don’t know the literature on the topic.  Still, it’s surprising how many people with a similar lack of knowledge assume that the effect is large and usually for the worse.

Isabel Sawhill is a serious researcher at Brookings; her areas are poverty and inequality.  Now, in a Washington Post article, she says that Dan Quayle was right about Murphy Brown.

Some quick history for those who were out of the room — or hadn’t yet entered the room: In 1992, Dan Quayle was vice-president under Bush I.  Murphy Brown was the title character on a popular sitcom then in its fourth season — a divorced TV news anchor played by Candice Bergen.  On the show, she got pregnant.  When the father, her ex, refused to remarry her, she decided to have the baby and raise it on her own.

Dan Quayle, in his second most famous moment,* gave a campaign speech about family values that included this:

Bearing babies irresponsibly is simply wrong… Failing to support children one has fathered is wrong… It doesn’t help matters when prime-time TV has Murphy Brown, a character who supposedly epitomizes today’s intelligent, highly paid professional woman, mocking the importance of fathers by bearing a child alone and calling it just another lifestyle choice.

Sawhill, citing her own research and that of others, argues that Quayle was right about families:  children raised by married parents are better off in many ways — health, education, income, and other measures of well-being — than are children raised by unmarried parents, whether single or together.**

But Sawhill also says that Quayle was right about the more famous part of the statement – that “Murphy Brown” was partly to blame for the rise in unwed parenthood.

Dan Quayle was right. Unless the media, parents and other influential leaders celebrate marriage as the best environment for raising children, the new trend — bringing up baby alone — may be irreversible.

Sawhill, following Quayle, gives pride of place to the media.  But unfortunately, she cites no evidence on the effects of sitcoms or the media in general on unwed parenthood.  I did, however, find a graph of trends in unwed motherhood. It shows the percent of all babies that were born to unmarried mothers.  I have added a vertical line to indicate the Murphy Brown moment.

The “Murphy Brown” effect is, at the very least, hard to detect. The rise is general across all racial groups, including those who were probably not watching a sitcom whose characters were all white and well-off.  Also, the trend begins well before “Murphy Brown” ever saw the light of prime time.  So 1992, with Murphy Brown’s fateful decision, was no more a turning point than was 1986, for example, a year when the two top TV shows were “The Cosby Show” and “Family Ties,” sitcoms with a very low rate of single parenthood and, at least for “Cosby,” a more inclusive demographic.

————————

  * Quayle’s most remembered moment: when a schoolboy wrote “potato” on the blackboard, Quayle “corrected” him by getting him to add a final “e” – “potatoe.”  “There you go,” said the vice-president of the United States approvingly. (A 15-second video is here.)

** These results are not surprising.  Compared with other wealthy countries, the US does less to support poor children and families or to ease the deleterious effects on children who have been so foolhardy as to choose poor, unmarried parents.

This is the second of two posts about cruel practices in horse industries. The first was about horse racing.

Yesterday we covered the abuse of horses in horse racing; in this post we discuss a recent video released by the Humane Society. The video highlights an instance of a larger issue, which is how arbitrary human tastes can create incentives for cruelty.

The concern revolves around the Tennessee Walking Horse (TWH), a breed developed in the U.S. in the late 1800s and selected for smooth gaits, including its distinctive “running walk,” that are unusual in most other breeds. Over time, a more exaggerated version became popular among show judges and spectators at TWH shows; called the “big lick,” it requires horses to shift their weight to their back legs and pick their front legs high off the ground. Fans enjoyed the flashy, unusual movements, and horses that performed the gait began taking home more prizes. This created a powerful incentive to get horses to exhibit the unnaturally exaggerated gait.

How do you get this gait? Some horses can be taught it through careful training. But to speed up the process, or for horses that aren’t learning, trainers developed a range of techniques. The first two are still allowed, under varying circumstances, during training and in the show ring:

  • Using padding and weighted shoes to change how the horse stands and moves its feet (akin to how high heels shift a person’s weight and stance).
  • Placing chains around the tops of their hooves to encourage them to pick their feet up more highly than they would otherwise (presumably they’re irritating).
However, some trainers use prohibited versions of these two devices, with pads and chains that exceed the allowable height and weight limits.
The next three techniques are illegal, but many insiders argue that they are still common.  I warn you now: much of the rest of this post will be very upsetting for many readers.
  • Placing screws or nails in different parts of the front hooves or soles to cause discomfort.  While horses’ hooves are hard, the soles are quite sensitive.  The screws or nails make it painful for the horse to put its front legs down, so it shifts its weight back, helping to attain the gait.
  • Intentionally cutting the horse’s front hooves so short that the sensitive sole hits the ground directly, which is extremely painful (think of what happens if your fingernail gets cut or broken off too short).
  • Coating a horse’s hooves and lower legs with caustic substances, then wrapping them in plastic wrap, for as long as several days, until they’re very sore — a process called, aptly, “soring.” This causes the horse to shift weight to its back legs in an effort to reduce the pain from the front feet. This is often used in conjunction with chains, which irritate and rub up against the raw skin.

Many inspectors argue that these practices, once widely accepted in the industry, are still common today. Recently the Humane Society released undercover footage of training practices at Whitter Stables, a facility in Collierville, TN, that has been at the center of a federal investigation. It is a very distressing video that includes many of the practices listed above, as well as horses being whipped when they have difficulty standing:

In 1970, Congress passed the Horse Protection Act, which outlawed the exhibition of sored horses. So trainers have developed techniques to hide the evidence; they paint horses’ hooves and legs to cover signs of soring or use boots to conceal tacks embedded in their hooves.

Trainers also beat the horses so that they learn not to show any sign of pain when inspected before a show.  They do this by simulating an inspection and then punishing the horse if it shows any sign of distress (e.g., punching or hitting it in the face or administering an electric shock).  Eventually horses learn that if they flinch, they get hurt twice; hiding signs of pain prevents the infliction of more suffering.

Trainers may also use a fast-acting but short-lived numbing agent to reduce the pain just long enough to pass inspections. Other trainers and owners simply leave a show if word gets out that USDA inspectors are present.

The Tennessee Walking Horse Breeders’ and Exhibitors’ Association argues that these practices are not widespread. However, in 2006 the last class in the World Grand Champion competition at the Tennessee Walking Horse Celebration (the TWH show equivalent of the Kentucky Derby, in terms of importance) was canceled because, of the 10 entered horses, 5 did not pass the inspection and another was removed by the owner without being inspected. In 2009, the USDA issued over 400 violations at the Celebration.

A USDA report states that the agency had the resources to send its own veterinarians to only 6% of official TWH shows in 2007; the other 94% were inspected by individuals hired, trained, and licensed by the organizations sponsoring the shows, a system the USDA found to be plagued by conflicts of interest. The report also noted that hostility toward USDA inspectors is so high that they routinely bring police or armed security with them to shows.

Jackie McConnell, the trainer in the video, has been indicted on federal charges. But without sustained attention and commitment to punishing violators, the problem will continue due to the pressure to produce horses that satisfy the tastes that have become entrenched in the industry. As one industry insider explained to Horse Illustrated magazine in 2004,

As long as the big lick wins at shows, the trainer must produce it to stay in business….The day a trainer stops producing big lick horses is the day all the horses in his or her barn are removed and taken to another trainer.  The pressure is enormous.

—————————

Gwen Sharp is an associate professor of sociology at Nevada State College; Lisa Wade, PhD, is an Associate Professor at Tulane University. You can follow Gwen on Twitter at @gwensharpnv and Lisa on Twitter and Facebook.

Sitting through Disney’s Tangled again, I saw new layers of gender in there. They’ve moved beyond the old-fashioned problem of passive princesses and active princes, so Rapunzel has plenty of action sequences. And it’s not all about falling in love (at least at first). Fine.

But how about sexual dimorphism? In bathroom icons the tendency to differentiate male and female bodies is obvious. In anthropomorphized animal stories it’s a convenient fiction. But in social science it’s a hazardous concept that reduces social processes to an imagined biological essence.

In Tangled, the hero and heroine are apparently the more human characters, whose love story unfolds amidst a cast of exaggerated cartoons, including many giant ghoulish men (the billed cast includes the voices of 12 men and three women).

Making the main characters look more normally human (normal in the statistical sense) is a nice way of encouraging children to imagine themselves surrounded by a magical wonderland, a device with a long tradition in children’s literature, from Alice in Wonderland to Where the Wild Things Are.

That’s what I was thinking. But then they went in for the lovey-dovey closeup toward the end, and I had to pause the video:

Their overall relative size is pretty normal, with him a few inches taller. But look at their eyes: hers are at least twice as big. And look at their hands and arms: his are more than twice as wide. Look closer at their hands:

Now she is a tiny child and he is a gentle giant. In fact, his wrist appears to be almost as wide as her waist (although it is a little closer to the viewer).

In short, what looks like normal humanity – anchoring fantasy in a cocoon of reality – contains its own fantastical exaggeration.

The patriarchal norm of bigger, stronger men paired up with smaller, weaker women is a staple of royalty myth-making – which is its own modern fantasy-within-reality creation. (Diana was actually taller than Charles, at least when she wore heels.)

In this, Tangled is subtler than the old Disney, but it seems no less powerful.

Philip N. Cohen is a professor of sociology at the University of Maryland, College Park, and writes the blog Family Inequality. You can follow him on Twitter or Facebook.

James Mollison, the photographer who brought us Where Children Sleep, has a fantastic series called The Disciples in which he captures die-hard music fans (he calls them “tribes”).  The results are a great example of the power of subculture.

Mollison photographed fans of Madonna, Iron Maiden, Kiss, Dolly Parton, 50 Cent, The Casualties, and many more. You should go check them all out.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.