health

Photo by Jan Siefert via flickr

Some people experience discrimination throughout their lives; for others, simply living long enough is what leads to it. According to research from Clemson University sociologist Ye Luo and her team, reported in The New York Times’ New Old Age blog, nearly two-thirds of those over age 53 report having been discriminated against—and the leading cause they report isn’t gender, race, or disability. It’s age.

Now, on its own, this statistic isn’t terribly surprising—many studies have turned up high levels of ageism. But Luo told the Times she was shocked that, over the two-year period of the study, everyday discrimination was associated with higher levels of depression and worse self-reported health. The association held even after the researchers controlled for general stress resulting from financial problems, illness, and traumatic events. As the Times reports:

Interestingly, the discrimination effect was stronger for everyday slights and suspicions (including whether people felt harassed or threatened, or whether they felt others were afraid of them) than for more dramatic events like being denied a job or promotion or being unfairly detained or questioned by the police. “Awful things happen and it’s a big shock, but people have ways to resist that damage,” Dr. Luo said. “With maturity, people learn coping skills.” Everyday discrimination works differently, apparently. “It may be more difficult to avoid or adapt to,” Dr. Luo suggested. “It takes a toll you may not even realize.”

Although trends may shift as more data come into focus, it’s already clear that ageism in everyday social interactions takes a real toll on the well-being of older adults.

Photo from Seattle Municipal Archives via flickr

Talk about a chronic condition! According to new research in the European Journal of Public Health, women’s higher rates of poor health aren’t just the result of reporting bias, but of actually higher rates of chronic health problems. MSNBC.com’s “Vitals” section (via MyHealthNews Daily) covers the research, which drew on interviews and medical records from more than 29,000 Spaniards, and reports:

…when the researchers matched up the number of chronic conditions each person had with his or her health rating, the gender difference disappeared. Having a higher number of chronic conditions correlated with poorer self-rated health to the same degree in both genders.

For men and women with the same conditions, or the same number of conditions, women were no more likely to claim poorer health.

To put these numbers into context, reporter Sarah C.P. Williams sought out British sociologist Ellen Annandale, who studies the connections between gender and health. Dr. Annandale acknowledged the long-standing notion that women simply communicate better and more often with their doctors, but don’t actually experience worse health outcomes than men—and said this new research upends that idea and offers clues to better medical treatment for people of all genders:

“Gender influences the way that people are treated and diagnosed in health systems,” Annandale said. “It influences the kind of health conditions that men and women suffer from, the way people relate to their own bodies, and what kind of access to health care they have.”

Understanding gender differences in health can help scientists and doctors find ways to better treat patients, she said.

“Women generally live longer than men, but in many countries that gap in life expectancy has been decreasing over time. One of the reasons for that is thought to be that men’s health is improving, but women’s is not.”

In an interview discussing whether teen sleepovers can actually prevent teen pregnancy, CNN’s Ali Velshi says flatly, “This is a little bit counter-intuitive.” But as his interviewee, UMass sociologist Amy Schalet (who wrote on this subject in “Sex, Love, and Autonomy in the Teenage Sleepover” in the Summer 2010 issue of Contexts), explains, “Let me clarify: it’s not a situation where everything goes… It’s definitely older teenage couples who have established relationships and whose parents have talked about contraception.” Which is to say, as Velshi puts it, sex and sex education take “a holistic approach” in countries like the Netherlands, where parents are more permissive—or, as Schalet says, “more connected with their kids”—about allowing boyfriends and girlfriends to sleep over.

Schalet’s research, explored more deeply in her new University of Chicago Press book Not Under My Roof, compares American parenting practices surrounding teen sex with those of parents in other countries. Drawing on in-depth interviews with parents and teens and a host of other data, she finds:

The takeaway for American parents… isn’t necessarily “You must permit sleepovers.” Many parents are going to say, “Not under my roof!” That’s why it’s the title of my book. The takeaway is that you can have more open conversations—you should probably have more open conversations—about what’s a good relationship, sex and contraception should go together, what does it mean to be “ready,” how to get rid of some of these damaging stereotypes (gender stereotypes). Those are all things that are going to help promote teenage health and better relationships between parents and kids.

Schalet is clear that parental approaches are nowhere near the only factor in the stark differences in teen pregnancy rates between the U.S. and the Netherlands, but says they are particularly important. “Kids are having sex, clearly,” Velshi says. And that’s precisely the point, Schalet believes, whether or not parents think their kids should be able to have sex in their own homes: “I think what you emphasize is that, above all, the conversation is important, and the conversation itself does not make kids have sex.” Ideally, she points out, that conversation will take place at home with parents, but a holistic talk about sexuality, relationships, and health can also take place in schools, with clergy, and in many other settings.

Mission accomplished! $20 worth of jalapeño Cheetos

The phrase “you are what you eat” may refer to more than your physical make-up. In fact, the food in your fridge might say just as much about your social class as about your health. Newsweek reports:

According to data released last week by the U.S. Department of Agriculture, 17 percent of Americans—more than 50 million people—live in households that are “food insecure,” a term that means a family sometimes runs out of money to buy food, or it sometimes runs out of food before it can get more money. Food insecurity is especially high in households headed by a single mother. It is most severe in the South, and in big cities. In New York City, 1.4 million people are food insecure, and 257,000 of them live near me, in Brooklyn. Food insecurity is linked, of course, to other economic measures like housing and employment, so it surprised no one that the biggest surge in food insecurity since the agency established the measure in 1995 occurred between 2007 and 2008, at the start of the economic downturn.

Growing inequality between the rich and the poor in the United States is reflected at the dinner table as well:

Among the lowest quintile of American families, mean household income has held relatively steady between $10,000 and $13,000 for the past two decades (in inflation-adjusted dollars); among the highest, income has jumped 20 percent to $170,800 over the same period, according to census data. What this means, in practical terms, is that the richest Americans can afford to buy berries out of season at Whole Foods—the upscale grocery chain that recently reported a 58 percent increase in its quarterly profits—while the food insecure often eat what they can: highly caloric, mass-produced foods like pizza and packaged cakes that fill them up quickly.

Using language evocative of sociologist Pierre Bourdieu, epidemiologist Adam Drewnowski explains:

Lower-income families don’t subsist on junk food and fast food because they lack nutritional education, as some have argued. And though many poor neighborhoods are, indeed, food deserts—meaning that the people who live there don’t have access to a well-stocked supermarket—many are not. Lower-income families choose sugary, fat, and processed foods because they’re cheaper—and because they taste good. In a paper published last spring, Drewnowski showed how the prices of specific foods changed between 2004 and 2008 based on data from Seattle-area supermarkets. While food prices overall rose about 25 percent, the most nutritious foods (red peppers, raw oysters, spinach, mustard greens, romaine lettuce) rose 29 percent, while the least nutritious foods (white sugar, hard candy, jelly beans, and cola) rose just 16 percent.

“In America,” Drewnowski wrote in an e-mail, “food has become the premier marker of social distinctions, that is to say—social class. It used to be clothing and fashion, but no longer, now that ‘luxury’ has become affordable and available to all.”

Concern about rising obesity, especially in low-income communities, has led to some controversial policy proposals.

In recent weeks the news in New York City has been full of a controversial proposal to ban food-stamp recipients from using their government money to buy soda. Local public-health officials insist they need to be more proactive about slowing obesity; a recent study found that 40 percent of the children in New York City’s kindergarten through eighth-grade classrooms were either overweight or obese. (Nationwide, 36 percent of 6- to 11-year-olds are overweight or obese.)

But French sociologist Claude Fischler suggests that there might be a better way to address both food insecurity and obesity: Americans should be more French about food.

Americans take an approach to food and eating that is unlike any other people in history. For one thing, we regard food primarily as (good or bad) nutrition. When asked “What is eating well?” Americans generally answer in the language of daily allowances: they talk about calories and carbs, fats, and sugars. They don’t see eating as a social activity, and they don’t see food—as it has been seen for millennia—as a shared resource, like a loaf of bread passed around the table. When asked “What is eating well?” the French inevitably answer in terms of “conviviality”: togetherness, intimacy, and good tastes unfolding in a predictable way.

Even more idiosyncratic than our obsession with nutrition, says Fischler, is that Americans see food choice as a matter of personal freedom, an inalienable right. Americans want to eat what they want: morels or Big Macs. They want to eat where they want, in the car or alfresco. And they want to eat when they want. With the exception of Thanksgiving, when most of us dine off the same turkey menu, we are food libertarians. In surveys, Fischler has found no single time of day (or night) when Americans predictably sit together and eat. By contrast, 54 percent of the French dine at 12:30 each day. Only 9.5 percent of the French are obese.

Others suggest addressing systemic barriers to food accessibility and delivery. According to author and foodie icon Michael Pollan:

“Essentially,” he says, “we have a system where wealthy farmers feed the poor crap and poor farmers feed the wealthy high-quality food.” He points to Walmart’s recent announcement of a program that will put more locally grown food on its shelves as an indication that big retailers are looking to sell fresh produce in a scalable way. These fruits and vegetables might not be organic, but the goal, says Pollan, is not to be absolutist in one’s food ideology. “I argue for being conscious,” he says, “but perfectionism is an enemy of progress.”

Community activists agree:

Food co-ops and community-garden associations are doing better urban outreach. Municipalities are establishing bus routes between poor neighborhoods and those where well-stocked supermarkets exist.

Joel Berg, executive director of the New York City Coalition Against Hunger, says these programs are good, but they need to go much, much further. He believes, like Fischler, that the answer lies in seeing food more as a shared resource, like water, than as a consumer product, like shoes. “It’s a nuanced conversation, but I think ‘local’ or ‘organic’ as the shorthand for all things good is way too simplistic,” says Berg. “I think we need a broader conversation about scale, working conditions, and environmental impact. It’s a little too much of people buying easy virtue.”

Berg believes that part of the answer lies in working with Big Food. The food industry hasn’t been entirely bad: it developed the technology to bring apples to Wisconsin in the middle of winter, after all. It could surely make sustainably produced fruits and vegetables affordable and available. “We need to bring social justice to bigger agriculture as well,” Berg says.

Wild Card Weekend

Recent medical reports on the long-term effects of head injuries have heightened concern about the medical risks of playing football. While the N.F.L. has shown increasing concern for the safety of its players, a solution has not been found. The safety issues came to a head this past Sunday, when a number of players were injured by highlight-reel hits.

Michael Sokolove’s article in the New York Times examines the moral issues surrounding watching a sport in which players place themselves at such high risk. As medical studies continue to strengthen the link between head injuries in football and depression, suicide, and early death, Sokolove asks the timely question:

Is it morally defensible to watch a sport whose level of violence is demonstrably destructive? (The game, after all, must conform to consumer taste.) And where do we draw the line between sport and grotesque spectacle?

To provide insight into the question, Sokolove turns to a series of cultural theorists and philosophers interested in the role of violent pursuits in society.

The writer Joyce Carol Oates has written admiringly of boxing, celebrating, among other aspects, the “incalculable and often self-destructive courage” of those who make their living in the ring. I wondered if she thought America’s football fans should have misgivings about sanctioning a game that, like boxing, leaves some of its participants neurologically impaired.

“There is invariably a good deal of hypocrisy in these judgments,” Ms. Oates responded by e-mail. “Supporting a war or even enabling warfare through passivity is clearly much more reprehensible than watching a football game or other dangerous sports like speed-car racing — but it may be that neither is an unambiguously ‘moral’ action of which one might be proud.”

Other experts argue that dangerous activities may serve a communal goal.

“We learn from dangerous activities,” said W. David Solomon, a philosophy professor at Notre Dame and director of its Center for Ethics and Culture. “In life, there are clearly focused goals, with real threats. The best games mirror that. We don’t need to feel bad about not turning away from a game in which serious injuries occur. There are worse things about me than that I enjoy a game that has violence in it. I don’t celebrate injuries or hope for them to happen. That would be a different issue. That’s moral perversion.”

Fellow philosopher Sean D. Kelly, the chairman of Harvard’s philosophy department, shares Solomon’s emphasis on the potential positive value of sports:

“You can experience a kind of spontaneous joy in watching someone perform an extraordinary athletic feat,” he said when we talked last week. “It’s life-affirming. It can expand our sense of what individuals are capable of.”

He believes that it is fine to watch football as long as the gravest injuries are a “side effect” of the game, rather than essential to whatever is good about the game and worth watching.

Sokolove concludes with the difficult question that football fans, as well as organizers and sponsors of the sport at all levels, must now ask themselves:

But what if that’s not the case? What if the brain injuries are so endemic — so resistant to changes in the rules and improvements in equipment — that the more we learn the more menacing the sport will seem?

love

With the flu season creeping in, we’re all looking for ways to improve our health. At the University of Windsor, sociologists Reza Nakhaie and Robert Arnold have found an answer—love.

Sociology professor Reza Nakhaie and colleague Robert Arnold studied the effect of social capital — relationships with friends, family and community — on health.

Their findings, published recently in the journal Social Science and Medicine, reveal that warm fuzzies can actually do a body good.

The Montreal Gazette elaborated on some of these findings.

Nakhaie and Arnold’s study showed that love is the key aspect of social capital affecting changes in health status.

The researchers’ definition of love included romantic love, familial love and divine love — the sense of loving and being loved by God. The main predictors of love were being married, monthly contact with family, attendance at religious services and being born in Canada.

Their study even found that the positive effects of love were three times stronger than the negative effects of daily smoking.  But,

Nakhaie and Arnold said their study isn’t just a feel-good story; it could have policy implications for the Canadian government. “Policies aimed at family support and family unification, for example, through immigration policies, (and) efforts to minimize the disruptions of divorce, appear important for the health of Canadians,” they wrote.

The researchers were quick to point out that we mustn’t stop worrying about meeting basic needs such as a stable food supply, and that the government shouldn’t cancel its anti-smoking programs. “What we’re really saying is that it’s time that the older sociological tradition of giving more attention to love was brought back to the forefront,” Arnold said.

Two weeks into Breast Cancer Awareness Month, the pink ribbons have been fluttering in full force. A New York Times blog urges a little reflection on the meaning of this now ubiquitous phenomenon:

The pink ribbon has been a spectacular success in terms of bringing recognition and funding to the breast cancer cause. But now there is a growing impatience about what some critics have termed “pink ribbon culture.” Medical sociologist Gayle A. Sulik, author of the new book “Pink Ribbon Blues: How Breast Cancer Culture Undermines Women’s Health” (Oxford University Press), calls it “the rise of pink October.”

“Pink ribbon paraphernalia saturate shopping malls, billboards, magazines, television and other entertainment venues,” she writes on her Web site. “The pervasiveness of the pink ribbon campaign leads many people to believe that the fight against breast cancer is progressing, when in truth it’s barely begun.”

The campaign builds on a long history of breast cancer activism, beginning in the 1970s, and now represents mainstream recognition of the cause.

So how can the pink ribbon be objectionable? Among the first salvos against the pink ribbon was a 2001 article in Harper’s magazine entitled “Welcome to Cancerland,” written by the well-known feminist author Barbara Ehrenreich. Herself a breast cancer patient, Ms. Ehrenreich delivered a scathing attack on the kitsch and sentimentality that she believed pervaded breast cancer activism.

A few additional critiques:

In “Pink Ribbon Blues,” Ms. Sulik offers three main objections to the pink ribbon. First, she worries that pink ribbon campaigns impose a model of optimism and uplift on women with breast cancer, although many such women actually feel cynicism, anger and similar emotions.

And like Ms. Ehrenreich, Ms. Sulik worries that the color pink reinforces stereotypical notions of gender — for example, that recovery from breast cancer necessarily entails having breast reconstruction, wearing makeup and “restoring the feminine body.”

Finally, Ms. Sulik closely examines what she calls the “financial incentives that keep the war on breast cancer profitable.” She reports that the Susan G. Komen Foundation, which annually sponsors over 125 Races for the Cure and more than a dozen three-day, 60-mile walks, has close to 200 corporate partners, including many drug companies. These associations, she warns, are a potential conflict of interest.

Read the rest.

A new study shows higher rates of suicide among middle-aged adults in recent years. CNN reports:

In the last 11 years, as more baby boomers entered midlife, the suicide rates in this age group have increased, according to an analysis in the September-October issue of the journal Public Health Reports.

The assumption was that “middle age was the most stable time of your life because you’re married, you’re settled, you had a job. Suicide rates are stable because their lives are stable,” said Dr. Paula Clayton, the medical director of the American Foundation for Suicide Prevention.

But this assumption may be shifting.

A sociologist explains:

“So many expected to be in better health and expected to be better off than they are,” said Julie Phillips, lead author of the study assessing recent changes in suicide rates. “Surveys suggest they had high expectations. Things haven’t worked out that way in middle age.”

Further,

Baby boomers (defined in the study as born between 1945 and 1964) are in a peculiar predicament.

“Historically, the elderly have had the highest rates of suicide,” said Phillips, a professor of sociology at Rutgers University. “What is so striking about these figures is that starting in 2005, suicide rates among the middle aged [45-64 years of age] are the highest of all age groups.”

The 45-54 age group had the highest suicide rate in 2006 and 2007, with 17.2 per 100,000. Meanwhile, suicide rates in adolescents and the elderly have begun to decline, she said.

“What’s notable here is that the recent trend among boomers is opposite to what we see among other cohorts and that it’s a reversal of a decades-long trend among the middle-aged,” said Phillips, who along with Ellen Idler, a sociologist at Emory University, and two other authors used data from the National Vital Statistics System.

Several theories have been proposed to explain this trend, including higher suicide rates among boomers during adolescence.

Baby boomers had higher rates of depression during their adolescence. One theory is that as they aged, this disposition followed them through the course of their lives.

“The age group as teenagers, it was identified they had higher rates of depression than people born 10 or 20 years earlier — it’s called a cohort effect,” said Clayton, of the American Foundation for Suicide Prevention, who read the study.

Others cite health concerns:

Some say health problems could be a factor in increased suicide rates among baby boomers.

Boomers have their share of medical problems such as high blood pressure, diabetes and complications of obesity.

“There’s a rise of chronic health conditions among the middle-aged,” Phillips said. “In the time period from 1996 to 2006, we see fairly dramatic increases in chronic health conditions and in out-of-pocket expenditures.”

Some speculate that the increase in baby boomer suicides could be attributed to stress, the number of Vietnam veterans in the age group or drug use, which was higher in that generation. Boomers are also the “sandwich generation,” pressed between needs of their children and their aging parents who are living longer, but have health problems like Alzheimer’s or dementia.

Finally, economic woes may be to blame.

All this is unfolding in a lagging economy, meaning boomers could be affected by the “period effect.”

“One hypothesis is that the economic pressure during this period might be a driving force, with the recession in the early 2000s — loss of jobs, instability, increases in bankruptcy rates among middle age,” Phillips said.

Unemployment correlates with increased rates of suicide. People who are unmarried and have less education are also more at risk.

Here in the U.S., we are obsessed with weight. It’s hard to go even one day without seeing an advertisement for the latest diet or a news story about a celebrity who shed some pounds or put on a few too many. While this obsession is due in part to our focus on physical appearance, many of us link obesity with poor health outcomes, including death. However, a recent social epidemiological study highlighted in Miller-McCune examined the factors that lead to early death, and obesity did not make the list. Instead, those eager to prevent an early death should avoid cigarettes, sedentary lifestyles, and even living in poverty.

This does not mean the lead author of the aforementioned study, Paula Lantz, is proposing we all relax and pig out. The University of Michigan social epidemiologist fully recognizes obesity as a national health problem. But her research suggests our current focus on weight is a bit (ahem) narrow and at least somewhat misleading.

Instead, we should look to what causes and exacerbates obesity, such as sugary sodas and our reliance on cars. And, while personal choices factor in, social class also plays a role.

It’s hard to take personal responsibility if you don’t have the money to join a gym and you have no access to healthy food in your immediate neighborhood. The place where you can get the most calories for the least money is McDonald’s. Their food is dirt cheap on a per-calorie basis.

In other words, being poor is hazardous to your health.

Stress processes probably play a role. Chronic stress is not good for immune function. [Difficulties with] housing, transportation, income security — all those factors can produce stress.  Do you have friends and family — people who can actually help you get to the doctor? Is your community organized in such a way that it provides the resources you need?

So, while a focus on obesity is important, we should also attend to less prominent culprits like poverty. And, in the meantime, exercise!

No one wants to be sad. This can generally be agreed on. However, as it becomes more and more common for anti-depressants and anti-anxiety medications to be prescribed, the question becomes: what is a socially acceptable level of sadness for a well-functioning member of society to experience? There remains a blurry but important line between what is considered ‘normal’ grieving and what is classified as a mental disorder or depression. NPR’s Alix Spiegel recently explored a shift in this line due to changes in the criteria used by the American Psychiatric Association to diagnose depression.

Traditionally, the manual has steered doctors away from diagnosing major depression in people who have just lost a loved one in what’s called “bereavement exclusion.” The idea was that feelings of intense pain were normal, so they shouldn’t be labeled as a mental disorder.
But the new DSM changes this. Buried in the pages is a small but potentially potent alteration that has implications not only for people like Theresa, but ultimately for the way that we think about and understand the emotion of pain.
The DSM committee removed the bereavement exclusion — a small, almost footnote at the bottom of the section that describes the symptoms of major depression — from the manual.

Dr. Kenneth Kendler, a member of the committee behind the change, explains that grief and depression share the same symptoms – lack of sleep, loss of appetite, loss of energy. The key distinction between grief and depression is the amount of time the person experiences the symptoms.

In fact, in the new manual, if symptoms like these persist for more than two weeks, the bereaved person will be considered to have a mental disorder: major depression. And treatment, either therapy or medication, is recommended.

While Kendler believes that this change will affect only a small number of people, and for the better, Holly Prigerson, a researcher at Harvard University, believes otherwise.

“What we found,” Prigerson says, “is that when you follow people — for example, between zero and six months post-loss — their depression symptom levels actually increase over time and peak at about six months post-loss.”
Because grief and depression look so much alike, Prigerson says, she worries that people who are suffering from normal grief will be told that they are sick when they are not, and encouraged to treat their symptoms when they don’t need to.
That is potentially a problem, Prigerson says, because we don’t know whether the pain of normal grief actually helps people to process their loss.

Other experts expand on Prigerson’s argument, voicing concern that society is continuing down a path toward an over-diagnosed and over-medicated population, where to be sad is to be sick.

Dr. Allen Frances, the famous psychiatrist and a former editor of the DSM, says that more and more, psychiatry is medicalizing our experiences. That is, it is turning emotions that are perfectly normal into something pathological.
“Over the course of time, we’ve become looser in applying the term ‘mental disorder’ to the expectable aches and pains and sufferings of everyday life,” Frances says. “And always, we think about a medication treatment for each and every problem.”
From Frances’ perspective, if you can’t feel intense emotional pain in the wake of the death of your child without it being categorized as a mental disorder, then when in the course of human experience are you allowed to feel intense emotional pain for more than two weeks?