How has the distribution of college majors changed? This graph, borrowed from A Backstage Sociologist, shows bachelor’s degrees conferred in the 1970-71 academic year and those conferred 41 years later.

Lisa Wade is a professor of sociology at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. You can follow her on Twitter and Facebook.
On any given workday, over 31 million lunches are served to children in school cafeterias. Part of the U.S. Department of Agriculture’s (USDA) nutritional assistance efforts, the National School Lunch Program (NSLP) aims to deliver affordable and nutritious meals to the nation’s schoolchildren. After all, food plays a key part in helping them learn, grow, and thrive.
To reach those who need it most, the federal and local governments work together to offer free lunch to children whose parents cannot afford to pay for it. But money is just one way to pay for a meal: the ‘free’ school lunch comes at other costs.
First, there are the health costs. At its inception, the NSLP was not designed as a social program. Instead, it was a response to agricultural overproduction and a surplus of farm produce, writes historian Susan Levine. The policymakers’ goal was to get rid of excess foods while supporting domestic production.
As a result, nutrition was of secondary concern to them: one year, eggs would be on the menu daily; another, they would hardly make an appearance. It wasn’t until the war, when politicians grew concerned about the ability of the nation’s men to fight, and until it became apparent that hungry children don’t do well in the classrooms they were newly required to sit in, that anyone took a serious look at what kids at school were actually eating.
By that time, it was too late. The program was already run like a business, and not even the introduction of nutritional standards helped. Today, those standards are outdated – children snack rather than eat three square meals, and are less physically active, requiring fewer calories – and are almost impossible to follow within the budget restrictions school lunch planners face.
Private industry was quick to offer solutions, but it is more interested in profits than in schoolchildren’s waistlines. Enriched and fortified chips and candies of otherwise dubious nutritional value appear in school cafeterias and vending machines, often a more popular choice with kids than apples. Frozen and convenience foods are replacing fresh meals cooked on the premises. And labyrinthine regulations on meal calorie content, coupled with cafeterias’ financial realities, often mean that adding more sugar to students’ plates is the only way to bring down a meal’s fat content.
The food itself is not the only factor contributing to children’s undesirable health outcomes. Economist Rachana Bhatt finds the amount of time students have to enjoy lunch also matters. Students tight on time – who must squeeze getting to the cafeteria, standing in line, eating their food, and cleaning up into their lunch break – might skip the meal, leading them to overeat later, or eat more quickly, leading them to consume more because of the delay in feeling full. Even if all school lunches offered healthy options, time would complicate their relationship with health outcomes: Bhatt found students who had less time for lunch were more likely to be overweight.
The lunch may be free, then, at the moment children choose their meal and sit down to eat it. But it may come at a substantial cost several years down the line, when a young adult is paying for diabetes medication and for doctor visits to monitor their blood pressure.
Read Part II of “No Such Thing as a Free School Lunch.”
Teja Pristavec is a graduate student in the sociology department, and an IHHCPAR Excellence Fellow, at Rutgers University. She blogs at A Serving of Sociology, where this post originally appeared. Cross-posted at Pacific Standard.
The short answer is, pretty well. But that’s not really the point.
In a previous post I complained about various ways of collapsing data before plotting it. Although this is useful at times, and inevitable to varying degrees, the main danger is the risk of inflating how strong an effect seems. So that’s the point about teen test scores and adult income.
If someone told you that the test scores people get in their late teens were highly correlated with their incomes later in life, you probably wouldn’t be surprised. If I said the correlation was .35, on a scale of 0 to 1, that would seem like a strong relationship. And it is. That’s what I got using the National Longitudinal Survey of Youth. I compared Armed Forces Qualifying Test scores, taken in 1999, when the respondents were ages 15-19, with their household income in 2011, when they were 27-31.
Here is the linear fit between these two measures, with the 95% confidence interval shaded, showing just how confident we can be in this incredibly strong relationship:
That’s definitely enough for a screaming headline, “How your kids’ test scores tell you whether they will be rich or poor.” And it is a very strong relationship – that correlation of .35 means AFQT explains 12% of the variation in household income.
But take heart, ye parents in the age of uncertainty: 12% of the variation leaves a lot left over. This variable can’t account for how creative your children are, how sociable, how attractive, how driven, how entitled, how connected, or how White they may be. To get a sense of all the other things that matter, here is the same data, with the same regression line, but now with all 5,248 individual points plotted as well (which means we have to rescale the y-axis):
Each dot is a person’s life — or two aspects of it, anyway — with the virtually infinite sources of variability that make up the wonder of social existence. All of a sudden that strong relationship doesn’t feel like something you can bank on with any given individual. Yes, there are very few people from the bottom of the test-score distribution who are now in the richest households (those clipped by the survey’s topcode and pegged at 3 on my scale), and hardly anyone from the top of the test-score distribution who is now completely broke.
But I would guess that for most kids a better predictor of future income would be spending an hour interviewing their parents and high school teachers, or spending a day getting to know them as a teenager. But that’s just a guess (and that’s an inefficient way to capture large-scale patterns).
I’m not here to argue about how much various measures matter for future income, or whether there is such a thing as general intelligence, or how heritable it is (my opinion is that a test such as this, at this age, measures what people have learned much more than a disposition toward learning inherent at birth). I just want to give a visual example of how even a very strong relationship in social science usually represents a very messy reality.

Philip N. Cohen is a professor of sociology at the University of Maryland, College Park, and writes the blog Family Inequality. You can follow him on Twitter or Facebook.
Do Millennials really carry more debt than their parents and grandparents did at their age? Yes, according to a new study by sociologist Jason Houle. “In order to participate in society and gain economic independence,” he writes, “many young adults today must take a massive financial risk.” Or, as he puts it, “out of the nest and into the red.”
The graph below compares the amount of debt held by three generations in young adulthood (adjusted for inflation and controlling for other variables). Notice that the median debt load has grown, but the average debt load has grown much faster. This means that, while debt has grown overall, the average is also pulled up by a small number of young people who carry very high levels of it.
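The median/average gap is ordinary arithmetic for a skewed distribution. A toy illustration with hypothetical figures (not Houle’s data):

```python
# Hypothetical debt loads for nine young adults (not Houle's data):
# a couple of heavy borrowers drag the mean far above the median.
import statistics

debts = [0, 2_000, 5_000, 8_000, 12_000, 15_000, 20_000, 120_000, 250_000]

print(statistics.median(debts))  # 12000 -- the typical borrower
print(statistics.mean(debts))    # 48000 -- pulled up by the top two
```

The median barely registers the two big borrowers, while the mean quadruples because of them – which is why the two lines in the graph diverge.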
Some evidence suggests that high debt individuals may be coming from lower income families. They take on debt as young people because the adults in their lives have already maxed out. They can’t count on their parents, for example, to take out a second mortgage on the house in order to pay for their college education. So, if they want to go to college, they have to take on the debt themselves.
Houle’s analysis, however, also shows that the kind of debt has changed across the three generations. The pie charts below reveal that the proportion of debt accounted for by home or car loans has shrunk, while the proportion accounted for by education loans and unsecured debt, like credit cards, has risen.
Moreover, Houle argues that this profile of Generation Y’s debt is class-specific:
The more advantaged are able to take on debt that helps them pursue a middle class lifestyle and build their wealth, while the less advantaged must take on debt to pay their bills and keep their heads above water.
So, is massive financial risk the new recipe for success?
For some, the answer may be yes. But for many, the gamble does not pay off. Students who take out college loans, for example, are more likely to drop out of college than those who have a parent who can pay. The combination of school loans and minimum-wage jobs can add up to a lifetime of economic insecurity. But, without other resources, not taking the risk at all almost guarantees failure in this economy. For this reason, Houle argues, the availability of credit and the acquisition of debt may be just another driver of income and wealth inequality. It’s a disturbing story that you can read in more depth here.

Lisa Wade is a professor of sociology at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. You can follow her on Twitter and Facebook.
Apparently universities are issuing guidelines to help professors consider adding “trigger warnings” to syllabi for “racism, classism, sexism, heterosexism, cissexism, ableism, and other issues of privilege and oppression,” and to remove triggering material when it doesn’t “directly contribute to learning goals.” One example given is Chinua Achebe’s “Things Fall Apart,” for its colonialism trigger. This is from the New Republic this week.
I have no desire to enter the fray of online discussions on trigger warnings and sensitivity. I have used trigger warnings. Most recently, I made a personal decision to not retweet Dylan Farrow’s piece in the New York Times detailing Woody Allen’s sexual abuse. I was uncomfortable shoving a very powerful description at people without some kind of warning. I couldn’t read past the first three sentences. I couldn’t imagine how it read for others. So, I referenced the article with a trigger warning and kept it moving.
But, I’m not sure that’s at all the kind of deliberation universities are doing with their trigger warning policies. Call me cynical, but the “student-customer” movement is the soft power arm of the neo-liberal corporatization of higher education. The message is that no one should ever be uncomfortable because students do not pay to feel things like confusion or anger. That sounds very rational until we consider how the student-customer model doesn’t silence power so much as it stifles any discourse about how power acts on people.
I’ve talked before about how the student-customer model becomes a tool to rationalize away the critical canon of race, sex, gender, sexuality, colonialism, and capitalism.
The trigger-warned syllabus feels like it is in this tradition. And I will tell you why.
In the last three weeks alone: a college student has had structural violence of normative harassment foisted on her for daring to have sex (for money), black college students at Harvard have taken to social media to catalog the casual racism of their colleagues, and black male students at UCLA made a video documenting their erasure.
It would seem that the most significant “issue” for a trigger warning is actual racism, sexism, ableism, and systems of oppression. ’Cause I’ve got to tell you, I’ve had my crystal stair dead end at the floor of racism and sexism, and I’ve read “Things Fall Apart.” On the trigger warning scale, the two in no way compare.
Yet, no one is arguing for trigger warnings in the routine spaces where symbolic and structural violence are acted on students at the margins. No one, to my knowledge, is affixing trigger warnings to department meetings that WASP-y normative expectations may require you to code switch yourself into oblivion to participate as a full member of the group. Instead, trigger warnings are being encouraged for sites of resistance, not mechanisms of oppression.
At for-profit colleges, strict curriculum control and enrollment contracts effectively restrict all critical literature and pedagogy. We elites balk at such barbarism. What’s a trigger warning but the prestige university version? A normative exclusion as opposed to a regulatory one?
Trigger warnings make sense on platforms where troubling information can be foisted upon you without prior knowledge, as in the case of retweets. Those platforms are in the business of messaging and amplification.
That is an odd business for higher education to be in… unless the business of higher education is now officially business.
In which case, we may as well give up on the tenuous appeal we have to public good and citizenry-building because we don’t have a kickstand to lean on.
If universities are not in the business of being uncomfortable places for silent acts of power and privilege then the trigger warning we need is: higher education is dead but credential production lives on; enter at your own risk.
Tressie McMillan Cottom is a PhD candidate in the Sociology Department at Emory University in Atlanta, GA. Her doctoral research is a comparative study of the expansion of for-profit colleges. You can follow her on twitter and at her blog, where this post originally appeared.
The narrative of the American Dream is one of upward mobility, but there are some stories of mobility we prize above others. Who is more successful: a Mexican-American whose parents immigrated to the U.S. with less than an elementary school education, and who now works as a dental hygienist? Or a Chinese-American whose parents immigrated to the U.S. and earned Ph.D. degrees, and who now works as a doctor?
Amy Chua (AKA “Tiger Mom”) and her husband Jed Rubenfeld, author of the new book The Triple Package, claim it’s the latter. They argue that certain American groups (including Chinese, Jews, Cubans, and Nigerians) are more successful and have risen further than others because they share certain cultural traits. Chua and Rubenfeld bolster their argument by comparing these groups’ median household income, test scores, educational attainment, and occupational status to those of the rest of the country.
But what happens if you measure success not just by where people end up — the cars in their garages, the degrees on their walls — but by taking into account where they started? In a study of Chinese-, Vietnamese-, and Mexican-Americans in Los Angeles whose parents immigrated here, sociologist Min Zhou and I came to a conclusion that flies in the face of Chua and Rubenfeld, and might even surprise the rest of us: Mexicans are L.A.’s most successful immigrant group.
Like Chua and Rubenfeld, we found that the children of Chinese immigrants exhibit exceptional educational outcomes that exceed those of other groups, including native-born Anglos. In Los Angeles, 64 percent of Chinese immigrants’ children graduated from college, and of this group 22 percent also attained a graduate degree. By contrast, 46 percent of native-born Anglos in L.A. graduated from college, and of this group, just 14 percent attained graduate degrees. Moreover, none of the Chinese-Americans in the study dropped out of high school.
These figures are impressive but not surprising. Chinese immigrant parents are the most highly educated in our study. In Los Angeles, over 60 percent of Chinese immigrant fathers and over 40 percent of Chinese immigrant mothers have a bachelor’s degree or higher.
At what seems to be the other end of the spectrum, the children of Mexican immigrants had the lowest levels of educational attainment of any of the groups in our study. Only 86 percent graduated from high school — compared to 100 percent of Chinese-Americans and 96 percent of native-born Anglos — and only 17 percent graduated from college. But their high school graduation rate was more than double that of their parents, only 40 percent of whom earned diplomas. And the college graduation rate of Mexican immigrants’ children more than doubled that of their fathers (7 percent) and more than tripled that of their mothers (5 percent).
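Those generational gains can be checked with quick arithmetic, using the L.A. figures quoted above (children of Mexican immigrants versus their parents):

```python
# Intergenerational gains in educational attainment, using the
# percentages quoted in the article (not the full study data).
gains = {
    "high school (vs. parents)": (86, 40),  # 86% vs. 40% graduated
    "college (vs. fathers)":     (17, 7),
    "college (vs. mothers)":     (17, 5),
}
for label, (children, parents) in gains.items():
    print(f"{label}: {children / parents:.1f}x the parents' rate")
```

Measured as a ratio of child to parent attainment, every comparison shows at least a doubling – the “distance run” that endpoint-only comparisons miss.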
There is no question that, when we measure success as progress from generation to generation, Mexican-Americans come out ahead.
A colleague of mine illustrated this point with a baseball analogy: Most Americans would be more impressed by someone who made it to second base after starting from home plate than by someone who ended up on third base because their parents started them there. But because we tend to focus strictly on outcomes when we talk about success and mobility, we fail to acknowledge that the third-base runner didn’t have to run far at all.
This narrow view fuels existing stereotypes that Chua and Rubenfeld play into — that some groups strive harder, have higher expectations of success, and possess a unique set of cultural traits that propels them forward.
For at least a generation, Americans have been measuring the American Dream by the make of your car, the cost of your home, and the prestige of the college degree on your wall. But there’s a more elemental calculation: Whether you achieved more than the generation that came before you. Anyone who thinks the American Dream is about the end rewards is missing the point. It’s always been about the striving.
Jennifer Lee, PhD, is a sociologist at the University of California, Irvine. Her book, The Diversity Paradox, examines patterns of intermarriage and multiracial identification among Asians, Latinos, and African Americans.
A popular quote urges us to shoot for the moon: even if we miss, it tells us, we’ll land among the stars. According to new research, there’s more to it than cheesy inspiration. Using data from two waves of the National Longitudinal Survey of Youth, sociologists John Reynolds and Chardie Baird test the common notion that failing to attain as much education as expected is associated with symptoms of depression in early and middle adulthood.
First, their results show that individuals with lower levels of education are more likely to exhibit signs of depression.
But, further statistical wrangling shows that their depression doesn’t come from the gap between plans and achievement. It comes from the low level of educational attainment in itself.
Reynolds and Baird conclude that there are no long-term emotional costs to aiming high and falling short when it comes to educational aspirations. This contradicts decades of research that holds that unmet educational expectations lead to psychological distress. In fact, not trying is the only way to ensure lower levels of education and increased chances of poor mental health. So, go ahead and shoot for that moon.