Photo by pj_vanf via flickr.com

The Twitterverse and blogosphere exploded after NFL replacement referees blew a call on Monday Night Football, costing the Green Bay Packers a victory. Even President Obama piped up on Tuesday, encouraging a swift end to the labor dispute between the NFL and its regular referees. Michael Hiltzik of the Los Angeles Times agrees that we should all be paying attention to the unfolding drama, but argues that we might be missing the point:

Most news coverage of this labor dispute focuses on the ineptitude of the fill-in referees; this week there will be a lot of hand-wringing over the flagrantly bad call that turned a Green Bay interception into a game-winning Seattle touchdown, as if by alchemy. Occasionally you’ll read that the disagreement has something to do with retirement pay. But it’s really about much more.

It’s about employers’ assault on the very concept of retirement security. It’s about employers’ willingness to resort to strong-arm tactics with workers, because they believe that in today’s environment unions can be pushed around (they’re not wrong). You ignore this labor dispute at your peril, because the same treatment is waiting for you.

One issue at the heart of the conflict is the NFL’s goal of ending the referees’ pension plan and moving to a 401(k)-style plan, a goal that Hiltzik notes is hardly unique among U.S. employers.

NFL Commissioner Roger Goodell has argued that defined-benefit plans are a thing of the past — even he doesn’t have one, he told an interviewer recently, as though financially he’s in the same boat as any other league employee.

This is as pure an expression as you’ll find of the race to the bottom in corporate treatment of employees. Industry’s shift from defined-benefit retirement plans to 401(k) plans has helped to destroy retirement security for millions of Americans by shifting pension risk from employer to employee, exposing the latter to financial market meltdowns like those that occurred in 2000 and 2008.

It’s true that employers coast-to-coast have tried to put a bullet in the heart of the defined-benefit plan. The union representing 45,000 Verizon workers gave up on such coverage for new employees to settle a 15-month contract dispute.

But why anyone should sympathize with the desire of the NFL, one of the most successful business enterprises in history, to do so, much less admire its efforts, isn’t so clear. If you have one of these disappearing retirement plans today, don’t be surprised to hear your employer lament, “even the NFL can’t afford them” tomorrow.

Another trend is employers’ growing use of lockouts as a way to resolve labor disputes on their own terms.

Lockouts have become more widespread generally: A recent survey by Bloomberg BNA found that as a percentage of U.S. work stoppages, lockouts had increased to 8.07% last year, the highest ratio on record, from less than 3% in 1991. In other words, work stoppages of all kinds have declined by 75% in that period — but more of them are initiated by employers.

The reasons are obvious. “Lockouts put pressure on the employees because nobody can collect a paycheck,” said William B. Gould IV, a former attorney for the National Labor Relations Board and a professor emeritus at Stanford Law School. “In a lot of major disputes, particularly in sports, it’s the weapon du jour.” Think about that the next time someone tells you that unions have too much power.

While the spotlight is sure to remain on the ire of fans and players alike toward the botched calls by replacement refs, America’s Game may be showing us more about business as usual in the United States than we would like to see.

It’s been said that football simply replicates the rough and tumble of the real world, and in this case, sadly, the observation is all too true.

Poster by Mitch Rosenberg via zazzle.com

Think 47% of all Americans are moochers? Try 96%. Political scientists Suzanne Mettler and John Sides argue in the New York Times that Mitt Romney has grossly underestimated how many U.S. citizens take advantage of government social programs.

The beneficiaries include the rich and the poor, Democrats and Republicans. Almost everyone is both a maker and a taker.

Mettler and Sides draw on nationally representative data from a 2008 survey of Americans about their use of 21 different government social programs, including everything from student loans to Medicare.

What the data reveal is striking: nearly all Americans — 96 percent — have relied on the federal government to assist them. Young adults, who are not yet eligible for many policies, account for most of the remaining 4 percent.

On average, people reported that they had used five social policies at some point in their lives. An individual typically had received two direct social benefits in the form of checks, goods or services paid for by government, like Social Security or unemployment insurance. Most had also benefited from three policies in which government’s role was “submerged,” meaning that it was channeled through the tax code or private organizations, like the home mortgage-interest deduction and the tax-free status of the employer contribution to employees’ health insurance. The design of these policies camouflages the fact that they are social benefits, too, just like the direct benefits that help Americans pay for housing, health care, retirement and college.

The use of such government social programs cuts across all divides, including political party affiliation and class. But ideology does seem to play a role in how people think about their relationship with government programs.

…conservatives were less likely than liberals to respond affirmatively when asked if they had ever used a “government social program,” even when both subsequently acknowledged using the same number of specific policies.

These ideological differences have significant consequences for how government social programs either divide or unite us.

Because ideology influences how we view our own and others’ use of government, Mr. Romney’s remarks may resonate with those who think of themselves as “producers” rather than “moochers” — to use Ayn Rand’s distinction. But this distinction fails to capture the way Americans really experience government. Instead of dividing us, our experiences as both makers and takers ought to bind us in a community of shared sacrifice and mutual support.

For more from Suzanne Mettler on government social programs and the “submerged state,” check out our Office Hours Podcast.

The United States has a greater share of its population behind bars than any other nation. Yet this captive audience is almost never captured by the large national surveys used to study the U.S. population. That omission may distort what we think we know about black progress in recent decades, the Wall Street Journal reports, because a substantial share of young African American men are incarcerated and therefore unaccounted for in these surveys.

Among the generally accepted ideas about African-American young-male progress over the last three decades that Becky Pettit, a University of Washington sociologist, questions in her book “Invisible Men”: that the high-school dropout rate has dropped precipitously; that employment rates for young high-school dropouts have stopped falling; and that the voter-turnout rate has gone up.

For example, without adjusting for prisoners, the high-school completion gap between white and black men has fallen by more than 50% since 1980, says Prof. Pettit. After adjusting, she says, the gap has barely closed and has been constant since the late 1980s. “Given the data available, I’m very confident that if we include inmates” in more surveys, “the trends are quite different than we would otherwise have known,” she says.
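
To see why leaving inmates out of household surveys matters, here is a minimal numerical sketch in Python. The figures are invented for illustration and are not Pettit’s estimates; the point is simply that when a group with a lower completion rate is excluded from the sampling frame, the survey-based estimate comes out too high.

```python
# Hypothetical illustration (not Pettit's actual figures): excluding
# incarcerated men from a household survey inflates the estimated
# high-school completion rate for the group as a whole.

def completion_rate(completers, total):
    """Share of a group that finished high school."""
    return completers / total

# Imagine 1,000 young men: 800 reachable by a household survey, 200 in prison.
household_total, household_completers = 800, 640   # 80% completion among the surveyed
inmate_total, inmate_completers = 200, 80          # 40% completion among inmates

survey_only = completion_rate(household_completers, household_total)
adjusted = completion_rate(household_completers + inmate_completers,
                           household_total + inmate_total)

print(f"Survey-only estimate: {survey_only:.1%}")  # 80.0%
print(f"Prison-adjusted rate: {adjusted:.1%}")     # 72.0%
```

Pettit’s adjustment rests on the same logic, only with real survey and incarceration data rather than round numbers.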

Voter turnout is another example, especially in light of this year’s presidential election.

…commonly accepted numbers show that the turnout rate among black male high-school dropouts age 20 to 34 surged between 1980 and 2008, to the point where about one in three were voting in presidential races. Prof. Pettit says her research indicates that instead the rate was flat, at around one in five, even after the surge in interest in voting among many young black Americans with Barack Obama in the 2008 race.

“I think that’s kind of stunning,” Prof. Pettit said.

Experts debate the feasibility of including prisoners in such surveys, as well as how to make the best use of available data. Even Pettit admits, “These are really, really tricky things.”

 

Photo by mtsofan on flickr.com

Welfare reform turned 16 years old this week and continues to grab headlines and garner controversy. Lately, assertions by Mitt Romney that President Obama is “gutting” welfare reform by removing the work requirement have fueled political debates and media fact-checking. As NPR reports, several fact-checking organizations have found Romney’s statements to be patently false, including a “four Pinocchios” rating from The Washington Post.

FactCheck.org explains:

“A Mitt Romney TV ad claims the Obama administration has adopted ‘a plan to gut welfare reform by dropping work requirements.’ The plan does neither of those things.”

“Work requirements are not simply being ‘dropped.’ States may now change the requirements — revising, adding or eliminating them — as part of a federally approved state-specific plan to increase job placement.”

“And it won’t ‘gut’ the 1996 law to ease the requirement. Benefits still won’t be paid beyond an allotted time, whether the recipient is working or not.”

Even Ron Haskins, a Republican architect of welfare reform and a staunch supporter of the law, contradicts Romney’s claims. He told NPR:

“There’s no plausible scenario under which it really constitutes a serious attack on welfare reform.”

Yet these rumors persist, and many people believe them. What could be driving this? Political scientist Martin Gilens, who wrote Why Americans Hate Welfare: Race, Media, and the Politics of Antipoverty Policy, contends that race has something to do with it.

Gilens said his research shows that Americans think about welfare in a way that aligns pretty neatly with their perceptions about race. For example, whites tend to believe that most poor people are black. But actually, poor people are more likely to be white than black or Hispanic.

Gilens said it’s impossible to know whether the Romney campaign decided to play into a racial strategy or whether it’s an accident. But in a way, it doesn’t matter.

“Regardless of what their conscious motivations are, the impact of these kinds of attacks on welfare and, in particular, on the perceived lack of work ethic among welfare recipients, plays out racially and taps into Americans’ views of blacks and other racial stereotypes,” he said.

This, plus concern that Obama hopes to turn the United States into a “government-dependent society,” makes welfare reform the talk of the town during this year’s presidential race.

For more on welfare reform and race, see our Office Hours podcast with Joe Soss on “Poverty Governance” and our feature called “American Poverty Governance As It Is and As It Might Be.”

Atty. General Holder visits Potomac Job Corps. Photo by the US Department of Labor’s photostream via flickr.com

Ask Mitt Romney and Barack Obama about the lingering high unemployment rate and they’ll likely cite a “skills mismatch” between American workers and available jobs as at least one part of the problem. Despite this striking point of agreement across the political spectrum, Barbara Kiviat argues in The Atlantic that social science data tell a different, much murkier, story.

Consensus over whether U.S. workers have the skills to meet employer demand has see-sawed over time.

That public discourse in the 1980s landed on the idea of a vastly under-skilled labor force is curious, considering that less than a decade earlier, policymakers believed that over-qualification was the main threat as technology “deskilled” work. In the 1976 book The Overeducated American, economist Richard Freeman held that the then-falling wage difference between high-school and college graduates was the result of a college-graduate glut. Sociologists studying “credentialism” agreed, arguing that inflated hiring requirements had led U.S. workers to obtain more education than was necessary. A 1973 report by the Department of Health, Education and Welfare ruminated about how to keep employees happy when job complexity increasingly lagged workers’ abilities and expectations for challenging jobs.

This simple story of a “skills mismatch,” whether it casts workers as over- or under-qualified, is not uniformly supported by the data; much depends on how you slice it. Looking at individual-level data, Kiviat notes:

The findings here are decidedly more nuanced. While certain pockets of workers, such as high-school drop-outs, clearly lack necessary skill, no nation-wide mismatch emerges. In fact, some work, such as an analysis by Duke University’s Stephen Vaisey, finds that over-qualification is much more common than under-qualification, particularly among younger workers and those with college degrees.

It seems the heart of the matter comes down to which story you want to tell, about which workers and which skills, a far less neat and tidy task than painting a broad-brush mismatch picture.

As sociologist Michael Handel points out in his book Worker Skills and Job Requirements, in the skills mismatch debate, it is often not clear who is missing what skill. The term is used to talk about technical manufacturing know-how, doctoral-grade engineering talent, high-school level knowledge of reading and math, interpersonal smoothness, facility with personal computers, college credentials, problem-solving ability, and more. Depending on the conversation, the problem lies with high-school graduates, high-school drop-outs, college graduates without the right majors, college graduates without the right experience, new entrants to the labor force, older workers, or younger workers. Since the problem is hazily defined, people with vastly different agendas are able to get in on the conversation–and the solution.

It’s no surprise that the Great Recession has brought economic inequality front and center in the United States. The focus has mostly been on problems in the labor market, but Jason DeParle at the New York Times points out that other demographic changes have also had a sizable impact on growing inequality.

Estimates vary widely, but scholars have said that changes in marriage patterns — as opposed to changes in individual earnings — may account for as much as 40 percent of the growth in certain measures of inequality.

To illustrate how changes in family structure contribute to increasing inequality, DeParle turns to the research of several sociologists. One factor is that those who are well off are more likely to get married.

Long a nation of economic extremes, the United States is also becoming a society of family haves and family have-nots, with marriage and its rewards ever more confined to the fortunate classes.

“It is the privileged Americans who are marrying, and marrying helps them stay privileged,” said Andrew Cherlin, a sociologist at Johns Hopkins University.

A related trend is the educational gap between women who have children in or out of wedlock.

Less than 10 percent of the births to college-educated women occur outside marriage, while for women with high school degrees or less the figure is nearly 60 percent.

This difference contributes to significant inequalities in long-term outcomes for children.

While many children of single mothers flourish (two of the last three presidents had mothers who were single during part of their childhood), a large body of research shows that they are more likely than similar children with married parents to experience childhood poverty, act up in class, become teenage parents and drop out of school.

Sara McLanahan, a Princeton sociologist, warns that family structure increasingly consigns children to “diverging destinies.”

Married couples are having children later than they used to, divorcing less and investing heavily in parenting time. By contrast, a growing share of single mothers have never married, and many have children with more than one man.

“The people with more education tend to have stable family structures with committed, involved fathers,” Ms. McLanahan said. “The people with less education are more likely to have complex, unstable situations involving men who come and go.”

She said, “I think this process is creating greater gaps in these children’s life chances.”

As sociologists and others have shown, the income gap between those at the top and bottom has changed dramatically over time.

Four decades ago, households with children at the 90th percentile of incomes received five times as much as those at the 10th percentile, according to Bruce Western and Tracey Shollenberger of the Harvard sociology department. Now they have 10 times as much. The gaps have widened even more higher up the income scale.

But again, DeParle notes that marriage, rather than just individual incomes, makes a big difference:

Economic woes speed marital decline, as women see fewer “marriageable men.” The opposite also holds true: marital decline compounds economic woes, since it leaves the needy to struggle alone.

“The people who need to stick together for economic reasons don’t,” said Christopher Jencks, a Harvard sociologist. “And the people who least need to stick together do.”

For more on the Great Recession and inequality, check out our podcast with David Grusky.

Photo from Persephone's Birth by eyeliam via flickr.com

The so-called “mommy wars” have apparently made it all the way to the delivery room, according to Jennifer Block, writing for Slate:

For a long time home birth was too fringe to get caught in this parenting no-fly zone, but lately it’s been fitting quite nicely into the mommy war media narrative: There are the stories about women giving birth at home because it’s fashionable, the idea that women are happy sacrificing their newborns for some “hedonistic” spa-like experience, or that moms-to-be (and their partners) are just dumb and gullible when it comes to risk management…

For many parents, home birth is a transcendent experience. …Yet as the number of such births grows, so does the number of tragedies—and those stories tend to be left out of soft-focus lifestyle features.

Debates about home birth have erupted in the media and the blogosphere in recent months, largely focused on the relative risks of home birth versus hospital birth. But at the heart of the issue is who, and what evidence, to trust.

I could list several recent large prospective studies… all comparing where and with whom healthy women gave birth, which found similar rates of baby loss—around 2 per 1,000—no matter the place or attendant. We could pick through those studies’ respective strengths and weaknesses, talk about why we’ll never have a “gold-standard” randomized controlled trial (because women will never participate in a study that makes birth choices for them), and I could quote a real epidemiologist on why determining the precise risk of home birth in the United States is nearly impossible. Actually, I will: “It’s all but impossible, certainly in the United States,” says Eugene Declercq, an epidemiologist and professor of public health at Boston University, and coauthor of the CDC study that found the number of U.S. home births has risen slightly, to still less than 1 percent of all births. One of the challenges is that “the outcomes tend to be pretty good,” Declercq says…But to really nail it down here in the U.S., he says, we’d need to study tens of thousands of home births, “to be able to find a difference in those rare outcomes.” With a mere 30,000 planned home births happening each year nationwide, “We don’t have enough cases.”
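
Declercq’s point about rare outcomes can be made concrete with a back-of-the-envelope power calculation. The sketch below uses the standard normal-approximation formula for comparing two proportions and invented rates (2 versus 3 per 1,000); it illustrates the statistical argument and is not a reanalysis of any study mentioned above.

```python
import math

def n_per_group(p1, p2, z_alpha=1.96, z_beta=0.8416):
    """Approximate sample size per group to detect a difference between two
    proportions, assuming a two-sided test at alpha = 0.05 with 80% power
    (normal-approximation formula)."""
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Hypothetical loss rates: 2 per 1,000 in one setting vs. 3 per 1,000 in another.
print(n_per_group(0.002, 0.003))  # roughly 39,000 births needed in each group
```

With only about 30,000 planned home births a year nationwide, it is easy to see why Declercq concludes there simply aren’t enough cases.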

And, as sociologist Barbara Katz-Rothman notes, decisions about where to give birth are likely made more on the basis of perceived, rather than real, risk.

“What we’re talking about is felt risk rather than actual risk,” explains Barbara Katz-Rothman, professor of sociology at the City University of New York and author of much scholarship on birth, motherhood, and risk. Take our fear of flying. “Most people understand intellectually that on your standard vacation trip or business trip, the ride to and from the airport is more likely to result in your injury or death than the plane ride itself, but you never see anybody applaud when they reach the airport safely in the car.” The flight feels more risky. Similarly, we can look at data showing our risk of infection skyrockets the second we step in a hospital, “but there’s something about the sight of all those gloves and masks that makes you feel safe.”

Photo by Brian D. Hawkins via flickr.com

For the first time in about a century, new Census data reveal that population growth in big U.S. cities is exceeding that of the suburbs. According to the Associated Press (via Huffington Post):

Primary cities in large metropolitan areas with populations of more than 1 million grew by 1.1 percent last year, compared with 0.9 percent in surrounding suburbs. While the definitions of city and suburb have changed over the decades, it’s the first time that growth of large core cities outpaced that of suburbs since the early 1900s.

In all, city growth in 2011 surpassed or equaled that of suburbs in roughly 33 of the nation’s 51 large metro areas, compared to just five in the last decade.

Young adults forgoing homeownership and embracing the conveniences of urban life appear to be a driving force behind this trend.

Burdened with college debt or toiling in temporary, lower-wage positions, they are spurning homeownership in the suburbs for shorter-term, no-strings-attached apartment living, public transit and proximity to potential jobs in larger cities…They make up roughly 1 in 6 Americans, and some sociologists are calling them “generation rent.”

A related report from NPR further cites tougher mortgage rules since the housing bubble burst as an important factor.

Even with big drops in housing prices and interest rates, getting a mortgage has become a lot harder since the heady days of “no income, no assets” loans that fueled the housing boom of the early 2000s. Most lenders now require a rock-steady source of income and a substantial down payment before they will even look at potential borrowers. And many millennials won’t be able to reach that steep threshold.

The combination of stricter mortgage requirements, college loan debt, and a tough economy leaves sociologist Katherine Newman skeptical of young adults’ prospects for home ownership for the foreseeable future. From Huffington Post:

“Young adults simply can’t amass the down payments needed and don’t have the earnings,” she said. “They will be renting for a very long time.”

Small World
Photo by Steve Ransom via flickr.com

It seems a no-brainer that the internet, social media, and cellphones have made homesickness for migrants a thing of the past. But as historian Susan J. Matt reveals in a recent New York Times op-ed, previous generations have found technology no substitute for home sweet home, and today’s immigrants are no different.

More than a century ago, the technology of the day was seen as the solution to the problem. In 1898, American commentators claimed that serious cases of homesickness had “grown less common in these days of quick communication, of rapid transmission of news and of a widespread knowledge of geography.”

But such pronouncements were overly optimistic, for homesickness continued to plague many who migrated.

Today’s technologies have also failed to defeat homesickness even though studies by the Carnegie Corporation of New York show that immigrants are in closer touch with their families than before. In 2002, only 28 percent of immigrants called home at least once a week; in 2009, 66 percent did. Yet this level of contact is not enough to conquer the melancholy that frequently accompanies migration. A 2011 study published in the Archives of General Psychiatry found that Mexican immigrants in the United States had rates of depression and anxiety 40 percent higher than nonmigrant relatives remaining in Mexico. A wealth of studies have documented that other newcomers to America also suffer from high rates of depression and “acculturative stress.”

Then why does the idea that technology can overcome homesickness persist? Matt cites a pervasive belief about mobility that many hold despite its disappointments.

The global desire to leave home arises from poverty and necessity, but it also grows out of a conviction that such mobility is possible. People who embrace this cosmopolitan outlook assume that individuals can and should be at home anywhere in the world, that they need not be tied to any particular place. This outlook was once a strange and threatening product of the Enlightenment but is now accepted as central to a globalized economy.

Technology plays a role in supporting this outlook.

The comforting illusion of connection offered by technology makes moving seem less consequential, since one is always just a mouse click or a phone call away.

Further, Matt argues that this illusion of connection may amplify homesickness rather than cure it.

The immediacy that phone calls and the Internet provide means that those away from home can know exactly what they are missing and when it is happening. They give the illusion that one can be in two places at once but also highlight the impossibility of that proposition.

The persistence of homesickness points to the limitations of the cosmopolitan philosophy that undergirds so much of our market and society. The idea that we can and should feel at home anyplace on the globe is based on a worldview that celebrates the solitary, mobile individual and envisions men and women as easily separated from family, from home and from the past. But this vision doesn’t square with our emotions, for our ties to home, although often underestimated, are strong and enduring.

 

Photo by Robert Schrader via flickr

While many turn up their noses at the thought, a recent article in the Star Tribune profiles a growing group of people who don headlamps and explore dumpsters for discarded edibles.

Some, calling themselves “freegans,” follow a philosophy that shuns spending money and capitalism, and dive to protest waste.

Others just want to take advantage of free food.

The practice has plenty of detractors, however, including food safety experts and most of the expiration-date-abiding public. Taking food from dumpsters in public areas is not exactly against the law (at least no one has been prosecuted for it), but some cities do have ordinances against dumpster diving, so most divers keep a low profile about their escapades.

Geographer Valentine Cadieux explains why such habits of food procurement might offend some:

“Food is such a huge part of our lives, wrapped up in our identities and cultures and habits, not to mention survival — so we experience tremendous resistance to questioning the way we get this food,” Cadieux wrote in an e-mail.

While some dumpster divers may do it for practical reasons, like survival or cutting down on food costs, others might be looking to make a bigger statement.

“Dumpster divers are demonstrating a way to call into question something that seems really legitimate and scientific [expiration dates or the convenience of throwing away food],” Cadieux said. “The general guilt that we feel about how many people are hungry is exactly the kind of thing that adds additional meaning to what may not be intended as a part of a social movement — but dumpster diving ends up being legible to people as a critique of throwing away too much food.”

Though perhaps not looking to start a broader social movement, dumpster divers certainly make an impression. And, apparently, their exploits can make for a well-stocked fridge.

“All the produce, just tons of green peppers and red peppers; they looked perfect,” Graham recalled with not a small bit of awe. “This was the first time I was diving, and I couldn’t believe it.”