After the recent shock of a federal indictment of 29 Somali and Somali American individuals on sex trafficking charges, the New York Times reports on the Minnesota Somali community’s attempts to deal with the situation.

The allegations of organized trafficking, unsealed this month, were a deep shock for the tens of thousands of Somalis in the Minneapolis area, who fled civil war and famine to build new lives in the United States and now wonder how some of their youths could have strayed so far. Last week, in quiet murmurings over tea and in an emergency public meeting, parents and elders expressed bewilderment and sometimes outrage — anger with the authorities for not acting sooner to stop the criminals, and with themselves for not saving their young.

The indictment was the latest in a series of jolting revelations starting around 2007, when a spate of deadly shootings in the Twin Cities made it impossible to ignore the emergence of Somali gangs. Then came the discovery that more than 20 men had returned to Somalia to fight for Islamic extremists, bringing what many Somalis feel has been harsh and unfair scrutiny from law enforcement and the news media.

A sociologist weighs in on why this pattern of problems seems to be continuing:

Cawo Abdi, a Somali sociologist at the University of Minnesota, said that past surges in concern about troubled youths had not been followed up with money and programs to help them. “This is viewed as such a huge scandal and outrage,” she said of the new charges, “that it has to lead to some kind of action.”

Read the rest of the article for discussion of some of the challenges facing Somali people in the Twin Cities.

The phrase “you are what you eat” may refer to more than your physical make-up. In fact, the food in your fridge might say just as much about your social class as about your health.  Newsweek reports:

According to data released last week by the U.S. Department of Agriculture, 17 percent of Americans—more than 50 million people—live in households that are “food insecure,” a term that means a family sometimes runs out of money to buy food, or it sometimes runs out of food before it can get more money. Food insecurity is especially high in households headed by a single mother. It is most severe in the South, and in big cities. In New York City, 1.4 million people are food insecure, and 257,000 of them live near me, in Brooklyn. Food insecurity is linked, of course, to other economic measures like housing and employment, so it surprised no one that the biggest surge in food insecurity since the agency established the measure in 1995 occurred between 2007 and 2008, at the start of the economic downturn.

Growing inequality between the rich and the poor in the United States is reflected at the dinner table as well:

Among the lowest quintile of American families, mean household income has held relatively steady between $10,000 and $13,000 for the past two decades (in inflation-adjusted dollars); among the highest, income has jumped 20 percent to $170,800 over the same period, according to census data. What this means, in practical terms, is that the richest Americans can afford to buy berries out of season at Whole Foods—the upscale grocery chain that recently reported a 58 percent increase in its quarterly profits—while the food insecure often eat what they can: highly caloric, mass-produced foods like pizza and packaged cakes that fill them up quickly.

Using language evocative of sociologist Pierre Bourdieu, one epidemiologist explains:

Lower-income families don’t subsist on junk food and fast food because they lack nutritional education, as some have argued. And though many poor neighborhoods are, indeed, food deserts—meaning that the people who live there don’t have access to a well-stocked supermarket—many are not. Lower-income families choose sugary, fat, and processed foods because they’re cheaper—and because they taste good. In a paper published last spring, Drewnowski showed how the prices of specific foods changed between 2004 and 2008 based on data from Seattle-area supermarkets. While food prices overall rose about 25 percent, the most nutritious foods (red peppers, raw oysters, spinach, mustard greens, romaine lettuce) rose 29 percent, while the least nutritious foods (white sugar, hard candy, jelly beans, and cola) rose just 16 percent.

“In America,” Drewnowski wrote in an e-mail, “food has become the premier marker of social distinctions, that is to say—social class. It used to be clothing and fashion, but no longer, now that ‘luxury’ has become affordable and available to all.”

Concern about rising obesity, especially in low-income communities, has led to some controversial policy proposals.

In recent weeks the news in New York City has been full of a controversial proposal to ban food-stamp recipients from using their government money to buy soda. Local public-health officials insist they need to be more proactive about slowing obesity; a recent study found that 40 percent of the children in New York City’s kindergarten through eighth-grade classrooms were either overweight or obese. (Nationwide, 36 percent of 6- to 11-year-olds are overweight or obese.)

But French sociologist Claude Fischler suggests that there might be a better way to address both food insecurity and obesity: Americans should be more French about food.

Americans take an approach to food and eating that is unlike any other people in history. For one thing, we regard food primarily as (good or bad) nutrition. When asked “What is eating well?” Americans generally answer in the language of daily allowances: they talk about calories and carbs, fats, and sugars. They don’t see eating as a social activity, and they don’t see food—as it has been seen for millennia—as a shared resource, like a loaf of bread passed around the table. When asked “What is eating well?” the French inevitably answer in terms of “conviviality”: togetherness, intimacy, and good tastes unfolding in a predictable way.

Even more idiosyncratic than our obsession with nutrition, says Fischler, is that Americans see food choice as a matter of personal freedom, an inalienable right. Americans want to eat what they want: morels or Big Macs. They want to eat where they want, in the car or alfresco. And they want to eat when they want. With the exception of Thanksgiving, when most of us dine off the same turkey menu, we are food libertarians. In surveys, Fischler has found no single time of day (or night) when Americans predictably sit together and eat. By contrast, 54 percent of the French dine at 12:30 each day. Only 9.5 percent of the French are obese.

Others suggest addressing systemic barriers to food accessibility and delivery. According to author and foodie icon Michael Pollan:

“Essentially,” he says, “we have a system where wealthy farmers feed the poor crap and poor farmers feed the wealthy high-quality food.” He points to Walmart’s recent announcement of a program that will put more locally grown food on its shelves as an indication that big retailers are looking to sell fresh produce in a scalable way. These fruits and vegetables might not be organic, but the goal, says Pollan, is not to be absolutist in one’s food ideology. “I argue for being conscious,” he says, “but perfectionism is an enemy of progress.”

Community activists agree:

Food co-ops and community-garden associations are doing better urban outreach. Municipalities are establishing bus routes between poor neighborhoods and those where well-stocked supermarkets exist.

Joel Berg, executive director of the New York City Coalition Against Hunger, says these programs are good, but they need to go much, much further. He believes, like Fischler, that the answer lies in seeing food more as a shared resource, like water, than as a consumer product, like shoes. “It’s a nuanced conversation, but I think ‘local’ or ‘organic’ as the shorthand for all things good is way too simplistic,” says Berg. “I think we need a broader conversation about scale, working conditions, and environmental impact. It’s a little too much of people buying easy virtue.”

Berg believes that part of the answer lies in working with Big Food. The food industry hasn’t been entirely bad: it developed the technology to bring apples to Wisconsin in the middle of winter, after all. It could surely make sustainably produced fruits and vegetables affordable and available. “We need to bring social justice to bigger agriculture as well,” Berg says.


Americans love marriage.  The wedding industry is worth over $40 billion, and TV shows and magazines continually cover stories of romantic proposals, show us how women choose their wedding dresses, and highlight when and where celebrities tied the knot.  But TIME recently confirmed a sociological story: marriage is changing.

In 1978, 28% of people surveyed thought that marriage was becoming obsolete.  Today, a new study conducted by the Pew Research Center and TIME revealed that 40% of people think it’s obsolete.

Even more surprising: overwhelmingly, Americans still venerate marriage enough to want to try it. About 70% of us have been married at least once, according to the 2010 Census. The Pew poll found that although 44% of Americans under 30 believe marriage is heading for extinction, only 5% of those in that age group do not want to get married. Sociologists note that Americans have a rate of marriage — and of remarriage — among the highest in the Western world. (In between is a divorce rate higher than that of most countries in the European Union.) We spill copious amounts of ink and spend copious amounts of money being anxious about marriage, both collectively and individually. We view the state of our families as a symbol of the state of our nation, and we treat marriage as a personal project, something we work at and try to perfect. “Getting married is a way to show family and friends that you have a successful personal life,” says Andrew Cherlin, a sociologist at Johns Hopkins University and the author of The Marriage-Go-Round: The State of Marriage and the Family in America Today. “It’s like the ultimate merit badge.”

This badge of merit has changed over the past few decades.  In 1960, 70% of American adults were married; now, about half are.  Wealthy, highly educated people are also now more likely to marry, and to be married, than those with less education and lower socioeconomic status.

The change is mostly a numbers game. Since more women than men have graduated from college for several decades, it’s more likely than it used to be that a male college graduate will meet, fall in love with, wed and share the salary of a woman with a degree. Women’s advances in education have roughly paralleled the growth of the knowledge economy, so the slice of the family bacon she brings home will be substantial.

These changes suggest that the drive to finish college might explain why fewer people are married.  But over the last two decades, people with a high school education have begun to marry later than college graduates.

What has brought about the switch? It’s not any disparity in desire. According to the Pew survey, 46% of college graduates want to get married, and 44% of the less educated do. “Fifty years ago, if you were a high school dropout [or] if you were a college graduate or a doctor, marriage probably meant more or less the same thing,” says Conley [a sociologist at New York University]. “Now it’s very different depending where you are in society.” Getting married is an important part of college graduates’ plans for their future. For the less well educated, he says, it’s often the only plan.

Promising publicly to be someone’s partner for life used to be something people did to lay the foundation of their independent life. It was the demarcation of adulthood. Now it’s more of a finishing touch, the last brick in the edifice, sociologists believe. “Marriage is the capstone for both the college-educated and the less well educated,” says Johns Hopkins’ Cherlin. “The college-educated wait until they’re finished with their education and their careers are launched. The less educated wait until they feel comfortable financially.”

And as they wait, they are increasingly likely to pass the time under the same roof.

Cohabitation is on the rise not just because of the economy. It’s so commonplace these days that less than half the country thinks living together is a bad idea. Couples who move in together before marrying don’t divorce any less often, say studies, although that might change as the practice becomes more widespread. In any case, academic analysis doesn’t seem to be as compelling to most people as the example set by Angelina Jolie and Brad Pitt. Or as splitting the rent.

But,

“Marriage is still the way Americans tend to do long-term, stable partnerships,” says Cherlin. “We have the shortest cohabiting relationships of any wealthy country in the world. In some European countries, we see couples who live together for decades.” To this day, only 6% of American children have parents who live together without being married.

This story is further nuanced by differences in class and views about what is best for children.  Check out the full article!

For many, Istanbul stands as a symbol of success. Its growing status as a ‘global city’ and a European Capital of Culture has attracted tourists, foreign investment, and massive development projects. Luis Gallo’s recent article in the Hürriyet Daily News provides a reminder that with development and prosperity there are rarely winners without losers.

[I]n the shadow of those skyscrapers, there is another Istanbul, a little-seen realm where the urban poor are coming face-to-face with the bulldozers clearing ground for the sparkling new city. The neighborhood of Sulukule, perhaps the world’s oldest Roma community, is already flattened, with just a few holdouts living amid the rubble.

This raises difficult questions as development continues.

With massive amounts of money, and the city’s international reputation, at stake, fierce debate is raging over the government’s “urban transformation” programs: They may be beautifying and enriching the city, but at what social cost?

Critics are quick to point to the increasing inequality that ‘success’ is bringing. Ozan Karaman, an urban-geography scholar from the University of Minnesota, explains:

“Lack of representation will result in further marginalization of the urban poor and perhaps the emergence of a new type of poverty, in which the poor have no hope whatsoever for upward mobility and are in a state of permanent destitution.”

Tansel Korkmaz and Eda Ünlü-Yücesoy, professors of architectural design at Istanbul Bilgi University, argue that the government’s disregard for the plight of the poor is not simply an unexpected byproduct of development. Instead, they claim that the government’s goal is to hide the urban poor in 21st-century Istanbul.

“The following statement by Prime Minister Recep Erdoğan about the neighborhoods of the urban poor summarizes the essence of the official approach: ‘cancerous district[s] embedded within the city.’ Planning operations in Tarlabaşı, Fener-Balat and Sulukule are [intended] to move the urban poor to the outskirts of the city and to make available their inner-city locations for big construction companies for their fancy projects,” Korkmaz said.

Recently, in the rapidly changing Tophane neighborhood in Istanbul’s Beyoğlu district, dozens of people attacked a crowd attending an opening of art galleries. The violence is a sign that frustration over displacement in the name of gentrification has finally boiled over, and it is likely not a one-time occurrence.

Experts say clashes between newcomers and longtime residents could become more frequent if people feel they have no say in the transformation of their neighborhoods and believe they must resort to violence in order to make their voices heard.

Even with the increasing tension, Ozan Karaman manages to hold onto hope while remaining critical of the current development approach.

“Urban redevelopment projects should be executed in collaboration with citizens and residents, not despite them. There is no need to re-invent the wheel; there are plenty of models of community-based development that have been successful since the 1970s.”

The hipster is a difficult group to define, for those who seem to be the most exemplary examples of the term are also the most offended by the label.

A year ago, Mark Greif, a professor of literary studies at the New School, and his colleagues began their investigation of the ‘hipster’.  In a recent essay in the NY Times, Greif reflects upon some of their findings and explains how Pierre Bourdieu’s masterwork, Distinction: A Social Critique of the Judgement of Taste, provides a basis for understanding the meaning of ‘hipster’.

In conducting the study, Greif was immediately surprised by the intense emotions and self-doubt that the seemingly superficial topic generated.

The responses were more impassioned than those we’d had in our discussions on health care, young conservatives and feminism. And perfectly blameless individuals began flagellating themselves: “Am I a hipster?”

Greif turns to Bourdieu, a French sociologist who died in 2002 at the age of 71 after achieving a level of fame and public interest rarely attained by academics, to help us understand why so much seems to be at stake. While Bourdieu’s biographical details provide little connection to people wearing skinny black jeans and riding fixed-gear bikes, his account of the way what people consume becomes a means of separating themselves from other groups provides the framework to study the rise of the hipster.

Taste is not stable and peaceful, but a means of strategy and competition. Those superior in wealth use it to pretend they are superior in spirit. Groups closer in social class who yet draw their status from different sources use taste and its attainments to disdain one another and get a leg up. These conflicts for social dominance through culture are exactly what drive the dynamics within communities whose members are regarded as hipsters.

From this perspective, the coffee shops, bars, and roller derby tracks become sites of social struggle.

Once you take the Bourdieuian view, you can see how hipster neighborhoods are crossroads where young people from different origins, all crammed together, jockey for social gain.

The main strategy in this competition is to establish yourself as being more ‘authentic’ than everyone else.

Proving that someone is trying desperately to boost himself instantly undoes him as an opponent. He’s a fake, while you are a natural aristocrat of taste. That’s why “He’s not for real, he’s just a hipster” is a potent insult among all the people identifiable as hipsters themselves.

This does not apply only to people with ironic mustaches.

Many of us try to justify our privileges by pretending that our superb tastes and intellect prove we deserve them, reflecting our inner superiority. Those below us economically, the reasoning goes, don’t appreciate what we do; similarly, they couldn’t fill our jobs, handle our wealth or survive our difficulties. Of course this is a terrible lie.

Sociologist R. Tyson Smith recently published an op-ed in the Philadelphia Inquirer about what he calls a growing gap between military veterans and civilians.

Following the retirement of Justice John Paul Stevens, the U.S. Supreme Court began its first term since at least World War II without a veteran on the bench. Meanwhile, in Connecticut, Senate candidate Richard Blumenthal was caught fabricating military experience in Vietnam.

These stories reflect a paradox: American civilians continue to love what veterans represent – duty, sacrifice, strength, leadership – but we have less and less true understanding of the veteran experience. Although the United States is in the 10th year of a war, veterans have become increasingly marginalized, accounting for a dwindling share of middle-class and public life.

Smith calls this the “quiet disappearance” of veterans from prominent positions in public life.

So, what’s causing this change? Smith points to the ongoing downsizing of the U.S. military and the absence of a draft.

During World War II, 16 million troops were mobilized at a time when the U.S. population was roughly 140 million. In the Vietnam era, about 3.5 million were deployed (and more than 6 million served) when the population was nearly 200 million. Today, fewer than 2 million service members have been deployed to Iraq and Afghanistan from a population of more than 310 million.

So, since World War II, the proportion of the populace deployed to a recent war has dropped from about 11 percent to less than 1 percent. No wonder veterans refer to themselves as “the less than 1 percent.”

Citing evidence from political science, Smith also notes that socioeconomic status plays a role.

In their book The Casualty Gap, political scientists Douglas Kriner and Francis Shen show that since the Korean War, poorer communities have suffered a disproportionate share of the nation’s wartime casualties. Given this trend, it’s hardly surprising that the number of members of Congress with children serving in combat is in the single digits.

This gap matters for two reasons, according to Smith.

First, veterans and the consequences of war are much easier to ignore when those fighting come from marginalized communities. Many Americans are insulated from the people we expect to die on our behalf.

Second, veterans’ successful readjustment to civilian life depends on social support. Such support usually begins with sharing the war experience and interacting with sympathetic civilians. This is harder to do as vets become more alienated and isolated.


The website that many of you will visit after this one (or may have already visited!) is providing sociologists with new research opportunities.

Andreas Wimmer and Kevin Lewis used Facebook to study friendships among college freshmen and found that race’s impact on friendships may be overstated.

“Sociologists have long maintained that race is the strongest predictor of whether two Americans will socialize,” says lead author Andreas Wimmer, professor of sociology at UCLA. “But we’ve found that birds of a feather don’t always flock together. Whom you get to know in your everyday life, where you live, and your country of origin or social class can provide stronger grounds for forging friendships than a shared racial background.”

To reach these conclusions, Wimmer and Lewis studied the social networks of college freshmen by examining tagged photos on Facebook.

True to past research, the sociologists initially saw same-race friendships develop rapidly: White students befriended each other one-and-a-half times more frequently than would be expected by chance, Latino students befriended each other four-and-a-half times more frequently, and African American students befriended each other eight times more frequently. But when the researchers dug deeper, race appeared to be less important than a number of other factors in forging friendships.

“Much of what at first appeared to be same-race preference, for instance, ultimately proved to be preference for students of the same ethnic background,” Lewis says. “Once we started controlling for the attraction of shared ethnic backgrounds or countries of origin, the magnitude of racial preference was cut almost in half.”

While Wimmer and Lewis stress that racial discrimination is still a problem, they believe past research may have exaggerated the role of race in social relationships.  Instead, social and physical constraints play a bigger role.

To read the entire article, click here.


Many of our posts focus on colleagues’ research, but teaching is also newsworthy.  The student newspaper at the University of South Carolina recently ran an article about an upcoming sociology course at USC—Lady Gaga and the Sociology of Fame.

The professor, Mathieu Deflem, explained the idea behind the course:

“We’re going to look at Lady Gaga as a social event,” Deflem said. “So it’s not the person, and it’s not the music. It’s more this thing out there in society that has 10 million followers on Facebook and six million on Twitter. I mean, that’s a social phenomenon. It’s a global social phenomenon. So the central question of the course is, this fame, which is ironically also the theme of her first records, how can it be accounted for? What are some of the mechanisms and some of the conditions of Lady Gaga’s rise to popularity?”

Deflem added that another key question of the course is, “What does it mean, and how does a person become famous?”

Deflem usually teaches criminology, the sociology of law, and policing, but he is excited to examine the popular social phenomenon in a sociological light.

In the beginning, the course will deal with the sociology of popularity in general. The first couple weeks probably won’t be about Lady Gaga at all. But then the Gaga scenario will be used as a real-life example detailing sociological traits. More specific information about the course content can be found at gagacourse.net – a site Deflem has already created for the class.

Recent medical reports on the long-term effects of head injuries have raised concern about the medical risks of playing football. While the N.F.L. has shown increasing concern for the safety of its players, a solution has not been found. The safety issues came to a head this past Sunday, when a number of players were injured by highlight-reel hits.

Michael Sokolove’s article in the New York Times examines the moral issues surrounding consuming a sport where players place themselves at such a high risk. As medical studies continue to build the link between head injuries in football and depression, suicide, and early death, Sokolove asks the timely question:

Is it morally defensible to watch a sport whose level of violence is demonstrably destructive? (The game, after all, must conform to consumer taste.) And where do we draw the line between sport and grotesque spectacle?

To provide insight into the question, Sokolove turns to a series of cultural theorists and philosophers with an interest in the role of violent pursuits in society.

The writer Joyce Carol Oates has written admiringly of boxing, celebrating, among other aspects, the “incalculable and often self-destructive courage” of those who make their living in the ring. I wondered if she thought America’s football fans should have misgivings about sanctioning a game that, like boxing, leaves some of its participants neurologically impaired.

“There is invariably a good deal of hypocrisy in these judgments,” Ms. Oates responded by e-mail. “Supporting a war or even enabling warfare through passivity is clearly much more reprehensible than watching a football game or other dangerous sports like speed-car racing — but it may be that neither is an unambiguously ‘moral’ action of which one might be proud.”

Other ‘experts’ argue that the dangerous activity may serve a communal goal.

“We learn from dangerous activities,” said W. David Solomon, a philosophy professor at Notre Dame and director of its Center for Ethics and Culture. “In life, there are clearly focused goals, with real threats. The best games mirror that. We don’t need to feel bad about not turning away from a game in which serious injuries occur. There are worse things about me than that I enjoy a game that has violence in it. I don’t celebrate injuries or hope for them to happen. That would be a different issue. That’s moral perversion.”

Fellow philosopher Sean D. Kelly, the chairman of Harvard’s philosophy department, shares Solomon’s emphasis on the potential positive value of sports:

“You can experience a kind of spontaneous joy in watching someone perform an extraordinary athletic feat,” he said when we talked last week. “It’s life-affirming. It can expand our sense of what individuals are capable of.”

He believes that it is fine to watch football as long as the gravest injuries are a “side effect” of the game, rather than essential to whatever is good about the game and worth watching.

Sokolove concludes with the difficult question that football fans, as well as organizers and sponsors of the sport at all levels, must now ask themselves:

But what if that’s not the case? What if the brain injuries are so endemic — so resistant to changes in the rules and improvements in equipment — that the more we learn the more menacing the sport will seem?


Patricia Cohen’s recent article in the NY Times, “‘Culture of Poverty’ Makes a Comeback,” documents how social scientists are once again using culture as an explanation in discussions of poverty.

Cohen begins by setting the historical context.

The reticence was a legacy of the ugly battles that erupted after Daniel Patrick Moynihan, then an assistant labor secretary in the Johnson administration, introduced the idea of a “culture of poverty” to the public in a startling 1965 report. Although Moynihan didn’t coin the phrase (that distinction belongs to the anthropologist Oscar Lewis), his description of the urban black family as caught in an inescapable “tangle of pathology” of unmarried mothers and welfare dependency was seen as attributing self-perpetuating moral deficiencies to black people, as if blaming them for their own misfortune.

The idea soon became central to many conservative critiques of government aid for the needy. Within the generally liberal fields of sociology and anthropology, the argument was treated as being in poor taste and was avoided. That period of silence now seems to be drawing to a close.

“We’ve finally reached the stage where people aren’t afraid of being politically incorrect,” said Douglas S. Massey, a sociologist at Princeton who has argued that Moynihan was unfairly maligned.

The new wave of culture-oriented discussions is not a direct replica of the studies of the 1960s.

Today, social scientists are rejecting the notion of a monolithic and unchanging culture of poverty. And they attribute destructive attitudes and behavior not to inherent moral character but to sustained racism and isolation.

Cohen continues by providing examples of how culture is now being examined. To do so, she turns to Harvard sociologist Robert J. Sampson. According to Sampson, culture should be understood as “shared understandings.”

The shared perception of a neighborhood — is it on the rise or stagnant? — does a better job of predicting a community’s future than the actual level of poverty, he said.

William Julius Wilson, a fellow Harvard sociologist who rose to prominence through studies of persistent poverty, defines culture as the way

“individuals in a community develop an understanding of how the world works and make decisions based on that understanding.”

For some young black men, Professor Wilson said, the world works like this: “If you don’t develop a tough demeanor, you won’t survive. If you have access to weapons, you get them, and if you get into a fight, you have to use them.”

As a result of this new direction in the study of poverty, a number of assumptions about people in poverty have been challenged. One of these is the idea that marriage is not valued by poor, urban single mothers.

In Philadelphia, for example, low-income mothers told the sociologists Kathryn Edin and Maria Kefalas that they thought marriage was profoundly important, even sacred, but doubted that their partners were “marriage material.” Their results have prompted some lawmakers and poverty experts to conclude that programs that promote marriage without changing economic and social conditions are unlikely to work.

The question remains: why are social scientists suddenly willing to engage with this once-taboo approach?

Younger academics like Professor Small [Mario Luis Small, a sociologist at the University of Chicago], 35, attributed the upswing in cultural explanations to a “new generation of scholars without the baggage of that debate.”

Scholars like Professor Wilson, 74, who have tilled the field much longer, mentioned the development of more sophisticated data and analytical tools. He said he felt compelled to look more closely at culture after the publication of Charles Murray and Richard Herrnstein’s controversial 1994 book, “The Bell Curve,” which attributed African-Americans’ lower I.Q. scores to genetics.

The authors claimed to have taken family background into account, Professor Wilson said, but “they had not captured the cumulative effects of living in poor, racially segregated neighborhoods.”

He added, “I realized we needed a comprehensive measure of the environment, that we must consider structural and cultural forces.”

This surge of interest is particularly timely as poverty in the United States has hit a fifteen-year high. And the debate is by no means confined to the ‘Ivory Tower’.

The topic has generated interest on Capitol Hill because so much of the research intersects with policy debates. Views of the cultural roots of poverty “play important roles in shaping how lawmakers choose to address poverty issues,” Representative Lynn Woolsey, Democrat of California, noted at the briefing.