Facebook

The website that many of you will visit after this one (or may have already visited!) is providing sociologists with new research opportunities.

Andreas Wimmer and Kevin Lewis used Facebook to study friendships among college freshmen and found that race’s impact on friendships may be overstated.

“Sociologists have long maintained that race is the strongest predictor of whether two Americans will socialize,” says lead author Andreas Wimmer, professor of sociology at UCLA. “But we’ve found that birds of a feather don’t always flock together. Whom you get to know in your everyday life, where you live, and your country of origin or social class can provide stronger grounds for forging friendships than a shared racial background.”

To reach these conclusions, Wimmer and Lewis studied the social networks of college freshmen by examining tagged photos on Facebook.

True to past research, the sociologists initially saw same-race friendships develop rapidly: White students befriended each other one-and-a-half times more frequently than would be expected by chance, Latino students befriended each other four-and-a-half times more frequently, and African American students befriended each other eight times more frequently. But when the researchers dug deeper, race appeared to be less important than a number of other factors in forging friendships.

“Much of what at first appeared to be same-race preference, for instance, ultimately proved to be preference for students of the same ethnic background,” Lewis says. “Once we started controlling for the attraction of shared ethnic backgrounds or countries of origin, the magnitude of racial preference was cut almost in half.”

While Wimmer and Lewis stress that racial discrimination is still a problem, they believe past research may have exaggerated the role of race in social relationships.  Instead, social and physical constraints play a bigger role.

To read the entire article, click here.

Many of our posts focus on colleagues’ research, but teaching is also newsworthy.  The student newspaper at the University of South Carolina recently ran an article about an upcoming sociology course at USC—Lady Gaga and the Sociology of Fame.

The professor, Mathieu Deflem, explained the idea behind the course:

“We’re going to look at Lady Gaga as a social event,” Deflem said. “So it’s not the person, and it’s not the music. It’s more this thing out there in society that has 10 million followers on Facebook and six million on Twitter. I mean, that’s a social phenomenon. It’s a global social phenomenon. So the central question of the course is, this fame, which is ironically also the theme of her first records, how can it be accounted for? What are some of the mechanisms and some of the conditions of Lady Gaga’s rise to popularity?”

Deflem added that another key question of the course is, “What does it mean, and how does a person become famous?”

Deflem usually teaches criminology, the sociology of law, and policing, but he is excited to examine the popular social phenomenon in a sociological light.

In the beginning, the course will deal with the sociology of popularity in general; the first couple of weeks probably won’t be about Lady Gaga at all. The Gaga phenomenon will then be used as a real-life example illustrating sociological concepts. More specific information about the course content can be found at gagacourse.net, a site Deflem has already created for the class.

Wild Card Weekend

Recent medical reports on the long-term effects of head injuries have resulted in increased concern about the medical risks of playing football. While the N.F.L. has increasingly shown concern over the safety of its players, a solution has not been found. The safety issues came to a head this past Sunday, when a number of players were injured by highlight-reel hits.

Michael Sokolove’s article in the New York Times examines the moral issues of watching a sport whose players place themselves at such high risk. As medical studies continue to strengthen the link between head injuries in football and depression, suicide, and early death, Sokolove asks the timely question:

Is it morally defensible to watch a sport whose level of violence is demonstrably destructive? (The game, after all, must conform to consumer taste.) And where do we draw the line between sport and grotesque spectacle?

To provide insight into the question, Sokolove turns to a series of cultural theorists and philosophers interested in the role of violent pursuits in society.

The writer Joyce Carol Oates has written admiringly of boxing, celebrating, among other aspects, the “incalculable and often self-destructive courage” of those who make their living in the ring. I wondered if she thought America’s football fans should have misgivings about sanctioning a game that, like boxing, leaves some of its participants neurologically impaired.

“There is invariably a good deal of hypocrisy in these judgments,” Ms. Oates responded by e-mail. “Supporting a war or even enabling warfare through passivity is clearly much more reprehensible than watching a football game or other dangerous sports like speed-car racing — but it may be that neither is an unambiguously ‘moral’ action of which one might be proud.”

Other experts argue that the dangerous activity may serve a communal goal.

“We learn from dangerous activities,” said W. David Solomon, a philosophy professor at Notre Dame and director of its Center for Ethics and Culture. “In life, there are clearly focused goals, with real threats. The best games mirror that. We don’t need to feel bad about not turning away from a game in which serious injuries occur. There are worse things about me than that I enjoy a game that has violence in it. I don’t celebrate injuries or hope for them to happen. That would be a different issue. That’s moral perversion.”

Fellow philosopher Sean D. Kelly, the chairman of Harvard’s philosophy department, shares Solomon’s emphasis on the potential positive value of sports:

“You can experience a kind of spontaneous joy in watching someone perform an extraordinary athletic feat,” he said when we talked last week. “It’s life-affirming. It can expand our sense of what individuals are capable of.”

He believes that it is fine to watch football as long as the gravest injuries are a “side effect” of the game, rather than essential to whatever is good about the game and worth watching.

Sokolove concludes with the difficult question that football fans, as well as organizers and sponsors of the sport at all levels, must now ask themselves:

But what if that’s not the case? What if the brain injuries are so endemic — so resistant to changes in the rules and improvements in equipment — that the more we learn the more menacing the sport will seem?

Montréal-Nord

Patricia Cohen’s recent article in the New York Times, “‘Culture of Poverty’ Makes a Comeback,” documents how social scientists are once again using culture to explain poverty.

Cohen begins by setting the historical context.

The reticence was a legacy of the ugly battles that erupted after Daniel Patrick Moynihan, then an assistant labor secretary in the Johnson administration, introduced the idea of a “culture of poverty” to the public in a startling 1965 report. Although Moynihan didn’t coin the phrase (that distinction belongs to the anthropologist Oscar Lewis), his description of the urban black family as caught in an inescapable “tangle of pathology” of unmarried mothers and welfare dependency was seen as attributing self-perpetuating moral deficiencies to black people, as if blaming them for their own misfortune.

The idea soon became central to many conservative critiques of government aid for the needy. Within the generally liberal fields of sociology and anthropology, the argument was treated as being in poor taste and avoided. That period of silence now seems to be drawing to a close.

“We’ve finally reached the stage where people aren’t afraid of being politically incorrect,” said Douglas S. Massey, a sociologist at Princeton who has argued that Moynihan was unfairly maligned.

The new wave of culture-oriented discussions is not a direct replica of the studies of the 1960s.

Today, social scientists are rejecting the notion of a monolithic and unchanging culture of poverty. And they attribute destructive attitudes and behavior not to inherent moral character but to sustained racism and isolation.

Cohen continues by providing examples of how culture is now being examined, turning to Harvard sociologist Robert J. Sampson. According to Sampson, culture should be understood as “shared understandings.”

The shared perception of a neighborhood — is it on the rise or stagnant? — does a better job of predicting a community’s future than the actual level of poverty, he said.

William Julius Wilson, a fellow Harvard sociologist who rose to prominence through his studies of persistent poverty, defines culture as the way

“individuals in a community develop an understanding of how the world works and make decisions based on that understanding.”

For some young black men, Professor Wilson said, the world works like this: “If you don’t develop a tough demeanor, you won’t survive. If you have access to weapons, you get them, and if you get into a fight, you have to use them.”

As a result of this new direction in the study of poverty, a number of assumptions about people in poverty have been challenged. One of these is the idea that marriage is not valued by poor, urban single mothers.

In Philadelphia, for example, low-income mothers told the sociologists Kathryn Edin and Maria Kefalas that they thought marriage was profoundly important, even sacred, but doubted that their partners were “marriage material.” Their results have prompted some lawmakers and poverty experts to conclude that programs that promote marriage without changing economic and social conditions are unlikely to work.

The question remains: why are social scientists suddenly willing to engage with this once-taboo approach?

Younger academics like Professor Small, 35, attributed the upswing in cultural explanations to a “new generation of scholars without the baggage of that debate.”

Scholars like Professor Wilson, 74, who have tilled the field much longer, mentioned the development of more sophisticated data and analytical tools. He said he felt compelled to look more closely at culture after the publication of Charles Murray and Richard Herrnstein’s controversial 1994 book, “The Bell Curve,” which attributed African-Americans’ lower I.Q. scores to genetics.

The authors claimed to have taken family background into account, Professor Wilson said, but “they had not captured the cumulative effects of living in poor, racially segregated neighborhoods.”

He added, “I realized we needed a comprehensive measure of the environment, that we must consider structural and cultural forces.”

This surge of interest is particularly timely as poverty in the United States has hit a fifteen-year high. And the debate is by no means confined to the ‘Ivory Tower’.

The topic has generated interest on Capitol Hill because so much of the research intersects with policy debates. Views of the cultural roots of poverty “play important roles in shaping how lawmakers choose to address poverty issues,” Representative Lynn Woolsey, Democrat of California, noted at the briefing.

Morningside Heights/Harlem

Since the 1960s, sociologists have shied away from explaining the persistence of poverty in terms of cultural factors, instead emphasizing the social structures that create and perpetuate poverty. Now, the New York Times reports, there seems to be a resurgence of analysis linking culture and persistent poverty.

The old debate has shaped the new. Last month Princeton and the Brookings Institution released a collection of papers on unmarried parents, a subject, it noted, that became off-limits after the Moynihan report. At the recent annual meeting of the American Sociological Association, attendees discussed the resurgence of scholarship on culture. And in Washington last spring, social scientists participated in a Congressional briefing on culture and poverty linked to a special issue of The Annals, the journal of the American Academy of Political and Social Science.

This, however, is not a reproduction of ‘culture of poverty’ scholarship; current work is significantly different:

With these studies come many new and varied definitions of culture, but they all differ from the ’60s-era model in these crucial respects: Today, social scientists are rejecting the notion of a monolithic and unchanging culture of poverty. And they attribute destructive attitudes and behavior not to inherent moral character but to sustained racism and isolation.

Harvard sociologist Robert J. Sampson says that how people collectively view their community matters.

The shared perception of a neighborhood — is it on the rise or stagnant? — does a better job of predicting a community’s future than the actual level of poverty, he said.

Sociologists try to unpack what this means:

Seeking to recapture the topic from economists, sociologists have ventured into poor neighborhoods to delve deeper into the attitudes of residents. Their results have challenged some common assumptions, like the belief that poor mothers remain single because they don’t value marriage.

In Philadelphia, for example, low-income mothers told the sociologists Kathryn Edin and Maria Kefalas that they thought marriage was profoundly important, even sacred, but doubted that their partners were “marriage material.” Their results have prompted some lawmakers and poverty experts to conclude that programs that promote marriage without changing economic and social conditions are unlikely to work.

The article speculates about several reasons why a cultural approach to studying poverty is reemerging, including a new generation of scholars, advancements in data collection and analysis, and shifts in broader discourse and attitudes outside the university.

Take a look at the full article.

Love

With flu season creeping in, we’re all looking for ways to improve our health. At the University of Windsor, sociologists Reza Nakhaie and Robert Arnold have found an answer: love.

Sociology professor Reza Nakhaie and colleague Robert Arnold studied the effect of social capital — relationships with friends, family and community — on health.

Their findings, published recently in the journal Social Science and Medicine, reveal that warm fuzzies can actually do a body good.

The Montreal Gazette elaborated on some of these findings.

Nakhaie and Arnold’s study showed that love is the key aspect of social capital affecting changes in health status.

The researchers’ definition of love included romantic love, familial love and divine love — the sense of loving and being loved by God. The main predictors of love were being married, monthly contact with family, attendance at religious services and being born in Canada.

Their study even found that the positive effects of love were three times stronger than the negative effects of daily smoking.  But,

Nakhaie and Arnold said their study isn’t just a feel-good story; it could have policy implications for the Canadian government. “Policies aimed at family support and family unification, for example, through immigration policies, (and) efforts to minimize the disruptions of divorce, appear important for the health of Canadians,” they wrote.

The researchers were quick to point out that we mustn’t stop worrying about meeting basic needs such as a stable food supply, and that the government shouldn’t cancel its anti-smoking programs. “What we’re really saying is that it’s time that the older sociological tradition of giving more attention to love was brought back to the forefront,” Arnold said.

Bathroom Sink

The recent ending of several long-running daytime soap operas has social scientists discussing the reasons for this TV genre’s decline and its legacy. According to the Christian Science Monitor:

Soap operas, that staple of the daytime television schedule, have taken it on the chin lately. Two titans of the genre – “Guiding Light” and “As the World Turns” – ended impressive runs in the past year. “World,” which went dark Sept. 17, wrapped 54 years of fictional history for the folks of Oakdale, Ill. And “Light,” which began as a radio show in the 1930s, spanned nearly three-quarters of a century by the time it was dropped a year ago. These departures leave only six daytime “soaps” on the three broadcast TV networks (ABC, NBC, CBS), down from nearly two dozen at the height of demand for the daily serials.

One factor could be the move of more women into work outside the home.

The daily, quick serialized story, born and sponsored on radio by soap companies primarily to sell laundry products to housewives at home during the day, has evolved in lock step with the changing lives of that target female audience, says sociologist Lee Harrington from Miami University. “Serialized storytelling has been around for thousands of years but this particular, endless world of people, who could almost be your real neighbors they feel so temporal and all present, is disappearing,” she says, as women have moved into the workplace and out of the home during the day.

Adds a professor in communication studies:

These prime-time shows have incorporated the focus on character and emotion that endeared the soap operas to women, says Villanova University’s Susan Mackey-Kallis. But, she adds, just as women’s interests have expanded beyond the home to incorporate careers and public lives, “their taste in entertainment has expanded to include more interweaving of character with traditional plot-driven stories.”

But other experts are quick to acknowledge the debt owed to daytime soaps by other forms of television entertainment.

The handwriting began appearing on the wall as prime-time storytellers began to adapt the techniques of the daily soap to weekly evening dramas, which were predominantly episodic and plot-driven, says media expert Robert Thompson, founder of the Bleier Center for Television and Popular Culture at Syracuse University in Syracuse, N.Y. Seminal shows from “Hill Street Blues” through “The Sopranos” owe a debt to the character-heavy, serialized storytelling techniques of the soap opera genre, he adds.

“The daytime soaps really gave birth to the great narrative elements we now see in the highly developed prime-time dramas,” he points out.

Two weeks into Breast Cancer Awareness Month, the pink ribbons have been fluttering in full force. A New York Times blog urges a little reflection on the meaning of this now ubiquitous phenomenon:

The pink ribbon has been a spectacular success in terms of bringing recognition and funding to the breast cancer cause. But now there is a growing impatience about what some critics have termed “pink ribbon culture.” Medical sociologist Gayle A. Sulik, author of the new book “Pink Ribbon Blues: How Breast Cancer Culture Undermines Women’s Health” (Oxford University Press), calls it “the rise of pink October.”

“Pink ribbon paraphernalia saturate shopping malls, billboards, magazines, television and other entertainment venues,” she writes on her Web site. “The pervasiveness of the pink ribbon campaign leads many people to believe that the fight against breast cancer is progressing, when in truth it’s barely begun.”

The campaign builds on a long history of breast cancer activism, beginning in the 1970s, and now represents mainstream recognition of the cause.

So how can the pink ribbon be objectionable? Among the first salvos against the pink ribbon was a 2001 article in Harper’s magazine entitled “Welcome to Cancerland,” written by the well-known feminist author Barbara Ehrenreich. Herself a breast cancer patient, Ms. Ehrenreich delivered a scathing attack on the kitsch and sentimentality that she believed pervaded breast cancer activism.

A few additional critiques:

In “Pink Ribbon Blues,” Ms. Sulik offers three main objections to the pink ribbon. First, she worries that pink ribbon campaigns impose a model of optimism and uplift on women with breast cancer, although many such women actually feel cynicism, anger and similar emotions.

And like Ms. Ehrenreich, Ms. Sulik worries that the color pink reinforces stereotypical notions of gender — for example, that recovery from breast cancer necessarily entails having breast reconstruction, wearing makeup and “restoring the feminine body.”

Finally, Ms. Sulik closely examines what she calls the “financial incentives that keep the war on breast cancer profitable.” She reports that the Susan G. Komen Foundation, which annually sponsors over 125 Races for the Cure and more than a dozen three-day, 60-mile walks, has close to 200 corporate partners, including many drug companies. These associations, she warns, are a potential conflict of interest.

Read the rest.

In recent years, membership in civic associations has declined. Despite this trend, some groups, like the Sierra Club and the Rotary Club, which have clearly defined, narrow focuses, not only persist but thrive.

Previous research has focused on political and financial factors as variables predicting the effectiveness of civic associations. However, Kenneth T. Andrews and his co-authors (American Journal of Sociology, January 2010) argue that these variables’ effects are modest compared to the “human” factors of leader development and member engagement. This is due to civic associations’ unique need for many leaders at all levels of the organization and their dependence upon the voluntary efforts of their members.

After conducting telephone interviews with 368 Sierra Club Executive Committee chairs and administering 1,624 surveys to committee members, the authors found that the quantity of resources available to a civic association matters little unless leaders and members recognize its value and put it to appropriate use. An engaged membership is also more likely to encourage strong programmatic activity and foster independent leadership.

These findings suggest that civic associations looking to boost the effectiveness of their programs should focus on developing and nurturing “activist” members, not just on fund-raising or maintaining government lobbyists.

A new study shows higher rates of suicide among middle-aged adults in recent years. CNN reports:

In the last 11 years, as more baby boomers entered midlife, the suicide rates in this age group have increased, according to an analysis in the September-October issue of the journal Public Health Reports.

The assumption was that “middle age was the most stable time of your life because you’re married, you’re settled, you had a job. Suicide rates are stable because their lives are stable,” said Dr. Paula Clayton, the medical director for the American Foundation for the Prevention of Suicide.

But this assumption may no longer hold.

A sociologist explains:

“So many expected to be in better health and expected to be better off than they are,” said Julie Phillips, lead author of the study assessing recent changes in suicide rates. “Surveys suggest they had high expectations. Things haven’t worked out that way in middle age.”

Further,

Baby boomers (defined in the study as born between 1945 and 1964) are in a peculiar predicament.

“Historically, the elderly have had the highest rates of suicide,” said Phillips, a professor of sociology at Rutgers University. “What is so striking about these figures is that starting in 2005, suicide rates among the middle aged [45-64 years of age] are the highest of all age groups.”

The 45-54 age group had the highest suicide rate in 2006 and 2007, with 17.2 per 100,000. Meanwhile, suicide rates in adolescents and the elderly have begun to decline, she said.

“What’s notable here is that the recent trend among boomers is opposite to what we see among other cohorts and that it’s a reversal of a decades-long trend among the middle-aged,” said Phillips, who along with Ellen Idler, a sociologist at Emory University, and two other authors used data from the National Vital Statistics System.

Several theories have been proposed to explain this trend, including higher rates of depression among boomers during adolescence.

Baby boomers had higher rates of depression during their adolescence. One theory is that as they aged, this disposition followed them through the course of their lives.

“The age group as teenagers, it was identified they had higher rates of depression than people born 10 or 20 years earlier — it’s called a cohort effect,” said Clayton, from the American Foundation for the Prevention of Suicide, who read the study.

Others cite health concerns:

Some say health problems could be a factor in increased suicide rates among baby boomers.

Boomers have their share of medical problems such as high blood pressure, diabetes and complications of obesity.

“There’s a rise of chronic health conditions among the middle aged,” Phillips said. “In the time period from 1996 to 2006, we see fairly dramatic chronic health conditions and an increase in out-of-pocket expenditures.”

Some speculate that the increase in baby boomer suicides could be attributed to stress, the number of Vietnam veterans in the age group or drug use, which was higher in that generation. Boomers are also the “sandwich generation,” pressed between needs of their children and their aging parents who are living longer, but have health problems like Alzheimer’s or dementia.

Finally, economic woes may be to blame.

All this is unfolding in a lagging economy, meaning boomers could be affected by the “period effect.”

“One hypothesis is that the economic pressure during this period might be a driving force, with the recession in the early 2000s — loss of jobs, instability, increases in bankruptcy rates among middle age,” Phillips said.

Unemployment correlates with increased rates of suicide. People who are unmarried and have less education are also more at risk.