The recent ending of several long-running daytime soap operas has social scientists discussing the reasons for this TV genre’s decline and its legacy. According to the Christian Science Monitor:

Soap operas, that staple of the daytime television schedule, have taken it on the chin lately. Two titans of the genre – “Guiding Light” and “As the World Turns” – ended impressive runs in the past year. “World,” which went dark Sept. 17, wrapped 54 years of fictional history for the folks of Oakdale, Ill. And “Light,” which began as a radio show in the 1930s, spanned nearly three-quarters of a century by the time it was dropped a year ago. These departures leave only six daytime “soaps” on the three broadcast TV networks (ABC, NBC, CBS), down from nearly two dozen at the height of demand for the daily serials.

One factor could be the move of more women into work outside the home.

The daily, quick serialized story, born and sponsored on radio by soap companies primarily to sell laundry products to housewives at home during the day, has evolved in lock step with the changing lives of that target female audience, says sociologist Lee Harrington from Miami University. “Serialized storytelling has been around for thousands of years but this particular, endless world of people, who could almost be your real neighbors they feel so temporal and all present, is disappearing,” she says, as women have moved into the workplace and out of the home during the day.

Adds a professor in communication studies:

These prime-time shows have incorporated the focus on character and emotion that endeared the soap operas to women, says Villanova University’s Susan Mackey-Kallis. But, she adds, just as women’s interests have expanded beyond the home to incorporate careers and public lives, “their taste in entertainment has expanded to include more interweaving of character with traditional plot-driven stories.”

But other experts are quick to acknowledge the debt owed to daytime soaps by other forms of television entertainment.

The handwriting began appearing on the wall as prime-time storytellers began to adapt the techniques of the daily soap to weekly evening dramas, which were predominantly episodic and plot-driven, says media expert Robert Thompson, founder of the Bleier Center for Television and Popular Culture at Syracuse University in Syracuse, N.Y. Seminal shows from “Hill Street Blues” through “The Sopranos” owe a debt to the character-heavy, serialized storytelling techniques of the soap opera genre, he adds.

“The daytime soaps really gave birth to the great narrative elements we now see in the highly developed prime-time dramas,” he points out.

Two weeks into Breast Cancer Awareness Month, the pink ribbons have been fluttering in full force. A New York Times blog urges a little reflection on the meaning of this now ubiquitous phenomenon:

The pink ribbon has been a spectacular success in terms of bringing recognition and funding to the breast cancer cause. But now there is a growing impatience about what some critics have termed “pink ribbon culture.” Medical sociologist Gayle A. Sulik, author of the new book “Pink Ribbon Blues: How Breast Cancer Culture Undermines Women’s Health” (Oxford University Press), calls it “the rise of pink October.”

“Pink ribbon paraphernalia saturate shopping malls, billboards, magazines, television and other entertainment venues,” she writes on her Web site. “The pervasiveness of the pink ribbon campaign leads many people to believe that the fight against breast cancer is progressing, when in truth it’s barely begun.”

The campaign builds on a long history of breast cancer activism, beginning in the 1970s, and now represents mainstream recognition of the cause.

So how can the pink ribbon be objectionable? Among the first salvos against the pink ribbon was a 2001 article in Harper’s magazine entitled “Welcome to Cancerland,” written by the well-known feminist author Barbara Ehrenreich. Herself a breast cancer patient, Ms. Ehrenreich delivered a scathing attack on the kitsch and sentimentality that she believed pervaded breast cancer activism.

A few additional critiques:

In “Pink Ribbon Blues,” Ms. Sulik offers three main objections to the pink ribbon. First, she worries that pink ribbon campaigns impose a model of optimism and uplift on women with breast cancer, although many such women actually feel cynicism, anger and similar emotions.

And like Ms. Ehrenreich, Ms. Sulik worries that the color pink reinforces stereotypical notions of gender — for example, that recovery from breast cancer necessarily entails having breast reconstruction, wearing makeup and “restoring the feminine body.”

Finally, Ms. Sulik closely examines what she calls the “financial incentives that keep the war on breast cancer profitable.” She reports that the Susan G. Komen Foundation, which sponsors over 125 annual Races for the Cure and more than a dozen three-day, 60-mile walks, has close to 200 corporate partners, including many drug companies. These associations, she warns, are a potential conflict of interest.

Read the rest.

In recent years, membership in civic associations has declined. Despite this trend, some groups like the Sierra Club and the Rotary Club – which have clearly defined, narrow focuses – not only persist but thrive.

Previous research has focused on political and financial factors as variables predicting the effectiveness of civic associations. However, Kenneth T. Andrews and his co-authors (American Journal of Sociology, January 2010) argue that these variables’ effects are modest compared to the “human” factors of leader development and member engagement. This is due to civic associations’ unique need for many leaders at all levels of the organization and their dependence upon the voluntary efforts of their members.

After conducting telephone interviews with 368 Sierra Club Executive Committee chairs and administering 1,624 surveys to committee members, the authors found that the quantity of resources available to a civic association is useless unless those resources are recognized for their value and used appropriately by its leaders and members. An engaged membership is also more likely to encourage strong programmatic activity and foster independent leadership.

These findings suggest that civic associations looking to boost the effectiveness of their programs should focus on developing and nurturing “activist” members, not just on fund-raising or maintaining government lobbyists.

A new study shows higher rates of suicide among middle-aged adults in recent years. CNN reports:

In the last 11 years, as more baby boomers entered midlife, the suicide rates in this age group have increased, according to an analysis in the September-October issue of the journal Public Health Reports.

The assumption was that “middle age was the most stable time of your life because you’re married, you’re settled, you had a job. Suicide rates are stable because their lives are stable,” said Dr. Paula Clayton, the medical director for the American Foundation for Suicide Prevention.

But this assumption may be shifting.

A sociologist explains:

“So many expected to be in better health and expected to be better off than they are,” said Julie Phillips, lead author of the study assessing recent changes in suicide rates. “Surveys suggest they had high expectations. Things haven’t worked out that way in middle age.”

Further,

Baby boomers (defined in the study as born between 1945 and 1964) are in a peculiar predicament.

“Historically, the elderly have had the highest rates of suicide,” said Phillips, a professor of sociology at Rutgers University. “What is so striking about these figures is that starting in 2005, suicide rates among the middle aged [45-64 years of age] are the highest of all age groups.”

The 45-54 age group had the highest suicide rate in 2006 and 2007, with 17.2 per 100,000. Meanwhile, suicide rates in adolescents and the elderly have begun to decline, she said.

“What’s notable here is that the recent trend among boomers is opposite to what we see among other cohorts and that it’s a reversal of a decades-long trend among the middle-aged,” said Phillips, who along with Ellen Idler, a sociologist at Emory University, and two other authors used data from the National Vital Statistics System.

Several theories have been proposed to explain this trend, including higher suicide rates among boomers during adolescence.

Baby boomers had higher rates of depression during their adolescence. One theory is that as they aged, this disposition followed them through the course of their lives.

“The age group as teenagers, it was identified they had higher rates of depression than people born 10 or 20 years earlier — it’s called a cohort effect,” said Clayton, from the American Foundation for Suicide Prevention, who read the study.

Others cite health concerns:

Some say health problems could be a factor in increased suicide rates among baby boomers.

Boomers have their share of medical problems such as high blood pressure, diabetes and complications of obesity.

“There’s a rise of chronic health conditions among the middle aged,” Phillips said. “In the time period from 1996 to 2006, we see fairly dramatic chronic health conditions and an increase in out-of-pocket expenditures.”

Some speculate that the increase in baby boomer suicides could be attributed to stress, the number of Vietnam veterans in the age group or drug use, which was higher in that generation. Boomers are also the “sandwich generation,” pressed between needs of their children and their aging parents who are living longer, but have health problems like Alzheimer’s or dementia.

Finally, economic woes may be to blame.

All this is unfolding in a lagging economy, meaning boomers could be affected by the “period effect.”

“One hypothesis is that the economic pressure during this period might be a driving force, with the recession in the early 2000s — loss of jobs, instability, increases in bankruptcy rates among middle age,” Phillips said.

Unemployment correlates with increased rates of suicide. People who are unmarried and have less education are also more at risk.

A recent story in the Star Tribune explores the recently documented trend of women delaying the birth of their first child or choosing not to have children altogether.

More than ever before, women are deciding to forgo childbearing in favor of other life-fulfilling experiences, a trend that has been steadily on the rise for decades. Census data show that nationally, the share of women 40 to 44 who did not have children jumped 10 percentage points from 1983 to 2006.

As University of Minnesota sociologist Ross Macmillan explains, the childless trend is not limited to the United States.

The number of children born is dropping “like a stone in pretty much every country we can find,” he said, and the United States has seen a 50-year rise in the number of childless women.

There are also a large number of women choosing to delay childbirth. State Demographic Center research analyst Martha McMurry points out that while there has been a decline in births among women in their 20s, the number of women having children in their 30s and 40s is increasing.

This delay is in part attributed to the high cost of having and raising a child, estimated at $250,000 by some studies, as well as the potential negative repercussions in the workplace.

“Actually, while it is true that women can have it all, it is also true that women who have children suffer from some penalties in the workplace,” said University of Minnesota associate professor Ann Meier.

She was referencing Stanford sociologist Shelley Correll’s research, which shows that mothers looking for work are less likely to be hired, are offered lower pay (5 percent less per child), and that the pay gap between mothers and childless women under 35 is actually bigger than the pay gap between women and men.

As the numbers of women choosing not to have children has risen, groups organized around the decision have sprung up.

In the Twin Cities, a one-year-old Childfree by Choice group’s numbers are growing weekly. On Meetup.com, the site through which it is organized, other such groups are cropping up nationwide, with such names as No Kidding and Not a Mom.

For many of these women, children are simply not seen as the key ingredient to living a good life.

Aleja Santos, 44, a medical ethics researcher who started the Twin Cities Childfree by Choice group a year ago (greeting members on the site with “Welcome, fellow non-breeders!”), said she never wanted to have kids. “There were always other things I wanted to do.”


How do businesses affect neighborhood crime rates?  Some people would answer this question by asserting that the increased foot traffic that businesses bring to neighborhoods translates into more eyes to curb crime.  According to others, residents withdraw into their homes to avoid crowds, which could make crimes more likely. 

To test these opposing ideas, Christopher Browning and his Ohio State colleagues examined 1999-2001 rates of homicide, aggravated assault, and robbery in 184 census tracts in Columbus, Ohio. Psych Central News reported on their findings.

Neighborhoods that combine residential and business developments have lower levels of some types of violent crime [homicide and aggravated assault]… The findings were equally true in impoverished areas as in more affluent neighborhoods, possibly offering city planners and politicians a new option in improving crime-afflicted areas, according to the researchers.

But, neighborhood density also plays a role.

In sparsely populated neighborhoods, increases in business-residential density initially lead to more frequent violent crimes.  However, once the building density reached a certain threshold, certain types of violent crime began to decline.

As Christopher Browning put it, “A residential neighborhood needs more than the addition of one or two businesses to see any positive impact on violent crime.”

The researchers are hopeful that bringing businesses into neighborhoods could help cut back on some violent crimes.

Defining a family has legal significance, of course, for matters such as taxes or employee benefits, but the question becomes even more complex when we try to understand how people think about what constitutes a family more generally. Understanding which types of arrangements “count” as a family and which do not reveals a lot about shifting cultural expectations and social norms.

New research by Brian Powell, reported by ABC News, suggests that having children is a key ingredient for many people in defining a family, particularly when asked about unmarried or same-sex couples.

“Children provide this, quote, ‘guarantee’ that move you to family status,” Powell said. “Having children signals something. It signals that there really is a commitment and a sense of responsibility in a family.”

For instance, 39.6 percent in 2010 said that an unmarried man and woman living together were a family — but give that couple some kids and 83 percent say that’s a family.

Thirty-three percent said a gay male couple was a family. Sixty-four percent said they became a family when they added children.

However, despite what labels others may place on you, most respondents thought self-identification was more important:

Sixty percent of Americans in 2010 said that if you considered yourself to be a family, then you were one.

As the five-year anniversary of Hurricane Katrina approaches, Salon‘s Matt Davis examined the New Orleans of today. Unlike much of the nation, New Orleans has recently been going through an economic boom. The number of economically disadvantaged people in Orleans Parish has halved to 68,000 over the last five years, and the median household income has been rising.

Yet, these statistics are not as positive as they seem.  Instead, they are largely the result of poor residents leaving New Orleans after Katrina and not returning.

“By most measures, it’s quite clear that the 100,000 people who are missing are the poorest and darkest former residents of the city,” says Rachel Luft, professor of sociology at the University of New Orleans. “And they are being replaced by a slew of YURPs, or young urban redevelopment professionals, who tend to be whiter, wealthier and better educated than the traditional residents of New Orleans. I think they’re being held up as the great white hope for rebuilding the city.”

Many of these “YURPs” are participating in volunteer programs like Teach for America.  Others are participating in celebrity-run charities like Brad Pitt’s organization.

…Brad Pitt’s charity, the Make It Right foundation, has acquired the nickname the “Make It White” foundation, and has drawn quiet criticism for foisting $350,000 Frank Gehry-designed houses on poor black property owners in the Lower Ninth Ward, who may well, at some point, see an incentive to sell out and realize the nonprofit’s equity in their homes.

Today, New Orleans hosts 354,850 residents, which is almost 78% of its pre-Katrina population.  Yet, only 60% of these residents are black, compared to 67% before the storm.


In a recent thought piece titled “Racing Safely to the Finish Line? Kids, Competitions, and Injuries,” sociologist Hilary Levey reflects on the reaction to the death of thirteen-year-old Peter Lenz this past Sunday. Peter was killed in a motorcycle accident at the Indianapolis Motor Speedway during a practice session.

Levey explains that it would be an error for the public to fixate on the type of accident that occurred; instead, we should use this tragedy as an impetus to consider the dangers of increasingly competitive youth sports.

Youth racing shouldn’t be alone in getting a closer inspection. This tragedy could have happened to any girl on a balance beam or any boy in a football tackle last Sunday. We should not be distracted by the fact that Peter was in a motorcycle race.

Despite the risk of serious injuries, like concussions, and even death, millions of kids compete in almost any activity you can imagine. Did you know that there are shooting contests for young Davy Crocketts, a racing circuit for aspiring Danica Patricks, and a youth PGA for those pursuing Tiger Woods’ swing? When did American childhood become not just hyper-organized but also hyper-competitive?

Levey shows that youth sports should be examined as the culmination of a century-long trajectory of increasing competitiveness.

Initially, the organized activities served as a way to mitigate deviant behavior by reducing the amount of unmonitored idle hours.

In 1903 New York City’s Public School Athletic League for Boys was established and contests between children, organized by adults, emerged as a way to keep the boys coming back to activities and clubs. Settlement houses and ethnic clubs followed suit and the number of these clubs grew rapidly through the 1920s.

However, the level of competitiveness continued to ramp up as the 20th century progressed. National organizations were introduced after World War II, and by the 1970s, for-profit organizations were common.

And, by the turn of the twenty-first century, a variety of year-round competitive circuits, run by paid organizers and coaches, dominated families’ evenings and weekends.

Parents tried to find the activity best suited to turn their children into national champions, even at age seven. As competitive children’s activities became increasingly organized over the twentieth century, injuries increased — especially overuse injuries and concussions. More practice time, an earlier focus on only one sport, and a higher level of intensity in games create the environment for these types of injuries.

Peter Lenz’s death is indicative of an increasingly competitive and organized American childhood. Levey argues that as a society we have the responsibility to make sure the training and safety regulations keep up with the increased pressure and risk of injury. This should include greater monitoring of safety equipment and higher standards for coaches.

While catastrophic accidents like Peter Lenz’s will happen, we can work to better protect all competitive children from more common injuries like concussions and overuse injuries. Kids want to win whatever race they are in and be the champion. Adults should make sure they all safely cross the finish line.


A recent feature in the University of Minnesota’s UMNews documents Rebecca Krinke’s most recent public art creation. Krinke, an associate professor in landscape architecture, explores how memories and emotion become attached to specific spatial locations. In doing so, she blurs the line between geography, sociology, urban studies, emotional exploration, and art.

The map has turned into a sociology experiment of sorts and a sounding board for people’s emotions: hope and despair, contentment and anger, love and hate.

Krinke began with a giant laser-cut map of Minneapolis and St. Paul.

Beginning in late July, Krinke started taking the map to public spaces in Minneapolis and St. Paul and inviting passersby to use the colored pencil of their choice—gold for joy and gray for pain (or both)—to express their memories of places.

The map soon was filled with color – some representing memories of excitement and wonder, others representing tragedy and grief.

One man was sharing his tale of overdosing on heroin in Minneapolis when another chimed in and said, “Yeah, that happened to me, too,” Krinke says. “And they looked at each other like, ‘Well, we made it.’”

Fortunately, the map still radiates more than its share of good times and golden memories. Of fish caught in Minneapolis lakes. Of trails hiked and biked over and over again. Of sports venues old and new.

The overwhelming reaction to the piece has inspired Krinke to look for ways to continue, and expand, the project. It also points to some sort of underlying desire to make public emotions that rarely see the light of day.

As artists and designers, “there’s a lot of potential here,” she adds. “Maybe we’re the witnesses. Maybe that’s why they like talking. It’s like testifying in a way. I guess [it’s] a deep fundamental human need to be heard.”