Photo by Clemens v. Vogelsang, Flickr CC.

Economic inequality is growing in America and economic mobility is declining. Most observers agree these are worrisome trends, but there is no consensus about solutions. In highly polarized public debates, some say that the government’s role should be limited to ensuring equal opportunity. Others emphasize the need for policies to encourage more equality. We believe that these extremes present a false dichotomy; a middle ground is possible. By looking for ways to further “school readiness,” citizens and policymakers can come together in support of giving all children what they need to take advantage of opportunities to learn and prepare for success in later life.

To Reduce Future Economic Inequality, Ensure That Children Succeed in School

Policy leaders from both sides of the aisle should be able to agree that young children need to gain the basic attitudes, skills, and knowledge required to succeed in school. Children will not enter the labor market on equal terms if stark inequalities begin to hold them back even before age five. Many children who are not ready for school cannot realize their potential – and that makes no economic sense for our country. Failure at school leads to losses of income and tax revenues as well as higher costs for social services, policing, and incarceration. Put another way, differences in school readiness influence kids’ capacities to take advantage of opportunities – and contribute to society – over the course of their whole lives.

Can we address educational disparities simply by making schools themselves more equal? If all schools had the same financial resources and the same number of teachers for every 100 students, and if all teachers were well-compensated, well-motivated, and well-trained, would students from all groups have the same chance to perform well? The answer is no, according to available research. Even school systems of comparable quality cannot produce fair results when some children arrive at the first day of class unprepared to learn. To be ready to learn, youngsters must already have certain language, motor, and social skills, and they must be able to pay attention, follow directions, and control their emotions. A lot must happen before the first day of kindergarten.

Photo by Rebecca Krebs via Flickr

The share of births to unmarried women in the United States has almost doubled over the last 25 years, going from 22% of births in 1985 to 41% in 2010. These are not just teenagers or older women having babies on their own. Parents who are living together but not married account for much of the overall increase in births to unmarried women, especially in the last decade.  

For babies and children growing up, living with two cohabiting parents in many ways resembles living with two married parents. There are two potential earners contributing to the economics of the household and two potential caregivers. But we cannot simply assume that cohabitation and marriage are the same, because couples who have a child while living together are more likely to separate later than married couples who have a child. Furthermore, researchers have found that children’s wellbeing can be undermined when their parents’ living arrangements change.

To draw meaningful conclusions about the impact of rising childbearing among cohabiting couples, we need to learn more about whether cohabiting families are becoming more or less stable over time. Our research focuses specifically on couples who have had a child together. These couples express high hopes that their relationships will last, but what actually happens and with what consequences for their children? We used nationally representative survey data from the 1990s and 2000s to examine changes in the stability of married families, cohabiting families where marriages do not happen, and cohabiting families where parents marry around the time a child arrives.

Health care providers who perform abortions routinely use ultrasound scans to confirm their patients’ pregnancies, check for multiple gestations, and determine the stage of the pregnancies. But it is far from standard – and not at all medically necessary – for women about to have abortions to view their ultrasounds. Ultrasound viewing by patients has no clinical purpose: it does not affect the woman’s condition or the decisions health providers make. Nevertheless, ultrasound viewing has become central to the hotly contested politics of abortion.

Believing that viewing ultrasounds will change minds, opponents of abortion – spearheaded by the advocacy group Americans United for Life – have pushed for state laws to require such viewing. So far, eighteen states require that women be offered the opportunity to view their pre-abortion ultrasound images, and five states actually go so far as to legally require women to view their ultrasound images before obtaining an abortion (although the women are permitted to avert their eyes). In two of the five states that have passed such mandatory viewing laws, courts have permanently enjoined the laws, keeping them from going into effect.

As the debates continue to rage, both sides assume that what matters for an abortion patient is the content of the ultrasound image. Abortion opponents believe the image will demonstrate to the woman that she is carrying a baby – a revelation they think will make her want to continue her pregnancy. Ironically, supporters of abortion rights also argue that seeing the image of the fetus will make a difference. They say this experience will be emotionally distressing and make abortions more difficult. Paradoxically, such arguments from rights advocates reinforce assumptions that fetuses are persons and perpetuate stigma about abortion procedures.

A Black Lives Matter protestor at an Eric Garner protest in Manhattan, December 4, 2014. Black youth suicide is increasing due to what researchers describe as a complex web of factors, including greater exposure to poverty and violence and a lack of access to mental health treatment. Photo by The All Nite Images via Flickr CC.

The suicide of a young person is always a tragedy, an event deeply mourned by the youth’s family and community. Sadly, this kind of tragedy is more prevalent than many might think. Data from the Centers for Disease Control and Prevention for 2005 through 2013 indicate that suicide has been the third or fourth leading cause of death for people ages 10 to 14, and the second or third leading cause of death for young people ages 15 to 24. Within these age groups, suicide rates can be further differentiated by race. Although suicide incidence has tended to be lower for Black youth than for other demographic groups, suicides of African American children and young adults are now on the rise. Understanding how to reverse this worrisome new trend requires examining a complex set of factors.

Suicide by Race, Age, and Gender

In the total U.S. population, not taking age into account, whites have the highest rate of suicide, followed by American Indians and Alaskan Natives, and then Asians and Pacific Islanders. Blacks and Hispanics have the lowest rates of suicide. For young people, the highest rates of suicide-related deaths occur among American Indians and Alaskan Natives. Rates of suicide for Black youth and the overall Black population tend to be lower than those of these other groups – but things have changed recently.

A protest for the rights of fast food workers at the University of Minnesota, April 15th, 2015
Photo by Fibonacci Blue via Flickr.com

And What Can Be Done To Make Jobs and Family Life More Predictable

For decades, work-family activists have pressed for policies to give workers flexibility. Some workers, most of them relatively affluent, have seen gains. They have won the ability to adjust their schedules, to choose how many days a week to work, and even to work from home. But as my colleague Dan Clawson and I document in our new book, Unequal Time, many employers in the United States are turning the concept of work schedule flexibility on its head. For employers using disturbing new tactics, “flexibility” means that employees – especially low-wage workers – must come in whenever the boss wants and can be sent home whenever demand is slack.

Unpredictable Schedules and Insufficient Hours 

News stories have featured the chaotic schedules of young people working in retail, cleaning, and fast food jobs – many of whom must come to work with just one day’s notice or work split shifts. About a third of young adults do not know their schedule more than a week in advance. But similar problems are faced by workers of all ages. Unpredictable schedules are becoming the new normal for many U.S. employees – ranging from low-wage nursing assistants to well-paid physicians. In the retail and health care sectors, many workers must call in the night before to find out if they will be needed – and if they will earn the wages they have counted on getting. At a nursing home we studied, for example, one out of three shifts turned out to be different from the official schedule planned in advance.

Steven Depolo, Flickr Creative Commons

In early June 2015, the Missouri state legislature voted to remove thousands of families, including 6,400 children, from the state’s cash assistance program for the poor. The new law reduces the state lifetime limit for Temporary Assistance for Needy Families from 60 to 45 months, cuts cash benefits in half for those who do not work, and redirects a significant portion of welfare funds toward programs that encourage marriage and alternatives to abortion.

Why has Missouri made these changes now? Since the U.S. Congress acted in 1996 to change welfare funding rules and give states greater discretion, many states have taken steps similar to Missouri’s. My research suggests that racial dynamics drive these cutbacks – but not in ways many suppose. Demography and attitudes are insufficient explanations; the political context matters.

Race and Welfare Policymaking

Why have some states imposed welfare restrictions in recent years while others have retained more generous programs? Previous studies reveal a clear pattern: the higher the proportion of African Americans receiving cash welfare benefits, the more likely states are to adopt restrictive welfare policies. But not all racially diverse states have adopted punitive reforms and some predominantly white states have taken very restrictive approaches to welfare. Race clearly influences welfare politics, but how?

To answer this question, I examined the policy decisions that state legislatures made immediately after the 1996 national reforms transformed American anti-poverty policy. That law imposed new time limits, work requirements, and penalties on recipients of welfare benefits. After Congress gave states new flexibility to design their own programs, some states adopted the most generous policies allowed by federal law, while others imposed far more restrictive policies. To understand the decision-making processes better, I closely examined a number of states which had large minority populations at the time.

What I found was surprising: legislators’ decisions about welfare policy were heavily influenced by the political debates simultaneously raging in their states. When those other debates were rife with racial tensions, legislators enacted punitive welfare reforms. But when concurrent debates were not racialized, lawmakers tended to adopt more generous welfare programs. In other words, lawmakers used restrictive welfare changes as a strategy to appease white voters who felt threatened by other racial conflicts happening in the same period.

A facetious gun control ad near Boston's Fenway Park. Photo by Jason Paris via flickr.com.

Ten years ago, the state of Florida beefed up its “stand your ground” law – a law allowing a person who harms or kills another, often with a gun, to escape prosecution by claiming that he or she felt threatened and acted in self-defense. In other words, Florida’s law – and many others like it – lets assailants go free merely by asserting their belief that the use of force was necessary to prevent serious harm or death to themselves or bystanders. Those who assert such beliefs become, according to Florida law, “immune from criminal prosecution and civil action.” Prosecutions are not entirely ruled out, but authorities must meet very difficult standards to pursue cases.

Since 2005, about half of all U.S. states have passed Florida-style laws, or very similar ones. The National Rifle Association has led the charge, arguing that stand your ground laws will improve public safety and protect honest citizens.

By now, however, there is clear and compelling evidence that such laws have failed to improve public safety – and have encouraged mayhem reminiscent of America’s old Wild West. Laws allowing claims of self-defense have existed for over a century, but Florida’s new law and its imitators dramatically alter the law enforcement equation. According to David LaBahn of the Association of Prosecuting Attorneys, investigations of civilian killings are now often hamstrung by legal protections greater than those afforded police officers who use lethal force.

The Florida Experience

Florida’s 2005 law was invoked in nearly 200 shooting cases through 2012 – a majority of them involving fatalities. The cases were documented by the Tampa Bay Times:

  • The Florida law’s chief beneficiaries were “those with records of crime and violence.” Nearly 60 percent of those making self-defense claims after killing someone had been arrested at least once before; a third had been accused of violent crimes or drug offenses; and over one-third had illegally carried guns or had threatened others with guns.
  • In seven of every ten stand your ground cases, the person killed was unarmed – and in 79 percent of the cases, the assailant could have retreated to avoid the confrontation.
  • Shooters who invoked stand your ground claims under Florida’s 2005 law succeeded in escaping prosecution two-thirds of the time.

Similar Trends in All Stand Your Ground States

Moving beyond Florida alone, other studies have documented equally worrisome trends:

  • Reporters at the Wall Street Journal studied “justifiable homicides” nationwide from 2000 to 2010. They found that such killings increased by 85 percent in states with Florida-style laws (even though some states have more limited versions of stand your ground rules on the books). The increase occurred even though overall killings, adjusted for population growth, declined during this same period. According to the Journal investigation, more than 80 percent of the “justifiable” killings involved guns, compared with 65 percent of other killings where claims of justification were not made.
  • For the same period, researchers at Texas A&M University found no evidence in data from the Federal Bureau of Investigation that stand your ground laws deterred crimes, including burglary, robbery, or aggravated assault. Instead, in states with newly buttressed stand your ground laws on the books, the homicide rate increased by eight percent – which in human terms added up to about 600 additional homicides annually.
  • Drawing on different data, a 2012 National Bureau of Economic Research study found Florida-type laws associated with a 6.8 percent increase in homicides.
  • An Urban Institute study found significant racial disparities in “justified” killings between 2005 and 2010. In states without stand your ground laws, killings were ruled justified in 29 percent of instances where the shooter was white and the victim was black (with much lower rates of justification for white on white, black on white, and black on black killings). By contrast, in states with stand your ground laws on the books, white on black killings were accepted as justified 36 percent of the time (with more modest upticks in findings of justification for the other kinds of cases).

Time to Rethink Laws Undermining Public Safety

The evidence is clear: expanded stand your ground laws, combined with more gun-carrying, increase unnecessary violent confrontations and deaths. With more than 11 million Americans now licensed to carry guns, we need policies to defuse or avert public confrontations – and police and prosecutors must be able to conduct full investigations when incidents occur. A February 2015 American Bar Association report urges states to scale back legal immunity and restore the “safe retreat” standard in public places – a standard that requires people who feel threatened to avoid confrontation if they can do so safely. Since the beginning of 2015, legislators in ten states, including Florida, have introduced such measures. But many reform proposals are stalled, and 13 states are actually deliberating bills that would fortify stand your ground practices.

Long ago, Americans north and south acted to contain the dangers of open gun-toting and free-wheeling confrontations. As early as 1686, New Jersey enacted a law against wearing weapons because they induced “great Fear and Quarrels.” In the 1700s, three states passed no-carry laws. In the 1800s, as interpersonal violence and gun carrying spread, 37 states joined the list of those enacting restrictions. Alabama’s 1839 law was titled, “An Act to Suppress the Evil Practice of Carrying Weapons Secretly.” This history makes the current popularity of gun-carrying and stand your ground laws all the more mystifying. Apparently, twenty-first century Americans must now re-learn lessons their ancestors took to heart long ago.

This brief was prepared for the Scholars Strategy Network by Robert J. Spitzer, State University of New York at Cortland. Spitzer is the author of Guns across America: Reconciling Gun Rules and Rights (Oxford University Press, 2015).

In recent decades, the United States has seen a spectacular rise in deportations, with local police forces authorized by the federal government to identify undocumented immigrants for summary removal. More than 11 million undocumented people across the country – including up to one in ten adult workers in the state of California – face this threat in their daily lives.

To assuage the human costs, President Barack Obama outlined a plan in November 2014 to provide temporary protection to many undocumented migrants. Building on his earlier efforts to set priorities, the President specified that officials would henceforth seek to deport “felons, not families,” “criminals, not children,” “gang members, not a mom who’s working hard to provide for her kids.” In short, under the new policy, various kinds of immigrants deemed good would be protected from deportation. Well-intentioned city leaders, bureaucrats, and police would need to sort out the good immigrants from those vilified as criminals.

These well-intended steps are meant to alleviate the trauma that the threat of deportation has imposed on millions of law-abiding migrants. But how do the binary divisions work out in practice? My research, based on a year of observations in southern California plus 75 in-depth interviews with undocumented Mexican migrants, suggests that efforts to divide good from bad people in migrant communities can have pernicious as well as helpful effects.

Photo by Francisco Osorio Flickr CC

Latinos living in the United States comprise the largest number of immigrants of any racial or ethnic group – and for this reason, many Americans presume that immigration is the issue that matters most to Latino citizens and residents. But is that true? Do Latinos themselves view immigration as their top concern, and if not, what other issues are high on their political agenda? My research tackles this question, which is important for understanding the potential political influence of the largest and fastest-growing minority group in the United States.

The Live Below the Line Campaign encourages people around the world to try to live on a poverty-level wage.

Poverty is commonly explained as a matter of joblessness, while work for wages is viewed as a pathway out of poverty and toward upward mobility. Indeed, since the end of open-ended welfare benefits in 1996, U.S. public assistance presumes that creating incentives for poor adults, including mothers, to enter the paid labor force is the best way to reduce poverty and dependence on government. Yet many citizens do not understand that most poor adults already work. In fact, by some accounts the so-called working poor outnumber the non-working poor in the U.S. Effectively reducing poverty therefore requires addressing the problems of those who work yet remain poor.