Mean and median are two measures of “average.” The mean is the average as we typically think of it: the sum of all the values divided by the number of values. The median, in contrast, is literally the number in the middle when we line all the values up in order. People often use the median instead of the mean because it is insensitive to extreme outliers, which can skew the mean in one direction or the other.
For a quick illustration of the difference, I often use the example of income. I choose a plausible average (mean) for the classroom population and review the math. “If Bill Gates walks into the room,” I say, “the average income is now in the billions. The median hasn’t moved, but the mean has gone way up.” So has the Gini coefficient.
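The Bill Gates thought experiment is easy to verify numerically. Here is a minimal Python sketch (the incomes are invented for illustration) using the standard library’s `statistics` module: a single extreme outlier drags the mean into the billions while the median barely moves.

```python
import statistics

# Hypothetical annual incomes (in dollars) for a small classroom
incomes = [30_000, 35_000, 40_000, 45_000, 50_000]

print(statistics.mean(incomes))    # 40000
print(statistics.median(incomes))  # 40000

# Now "Bill Gates walks into the room": add one extreme outlier
incomes.append(100_000_000_000)

print(statistics.mean(incomes))    # jumps to roughly 16.7 billion
print(statistics.median(incomes))  # barely moves: 42500
```

The median only shifts from $40,000 to $42,500 (the average of the two middle values), which is why it is the preferred summary for heavily skewed distributions like income and wealth.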
Here’s a more realistic and global illustration – the net worth of people in the wealthier countries. The U.S. ranks fourth in mean worth – $301,000 per person…
…but the median is far lower – $45,000, 19th out of the 20 nations shown. (The graph is from Credit Suisse via CNN.)
The U.S. is a wealthy nation compared with others, but “average” Americans, in the way that term is generally understood, are poorer than their counterparts in other countries.
In his book of the same name, Michael Billig coined the term “banal nationalism” to draw attention to the ways in which nationalism was not only a quality of gun-toting, flag-waving “extremists,” but was quietly and rather invisibly reproduced by all of us in our daily lives.
That we live in a world of nations was not inevitable; that the United States, or Sweden or India, exist was not inevitable. I was born in Southern California. If I had been born at another time in history I would have been Mexican or Spanish or something else altogether. The nation is a social construction.
The nation, then, must be reproduced. We must be reminded, constantly, that we are part of this thing called a “nation.” Even more, that we belong to it and it belongs to us. Banal nationalism is how the idea of the nation and our membership in it are reproduced daily. It occurs not only with celebrations, parades, or patriotic war, but in “mundane,” “routine,” and “unnoticed” ways.
The American flag, for example, casually hanging around in yards and in front of buildings everywhere:
References to the nation on our money:
The way that the news is usually split into us and everyone else:
The naming of clubs and franchises, such as the National Football League, as specific to our country:
The performance of the pledge of allegiance in schools and sports arenas:
So, what? What could possibly be the problem?
Sociologists have critiqued nationalism for being the source of an irrational commitment and loyalty to one’s nation, a commitment that makes one willing to both die and kill. Billig argues that, while it appears harmless on the surface, “banal nationalism can be mobilized and turned into frenzied nationalism.” The profound sense of national pride required for war, for example, depends on this sense of nationhood internalized over a lifetime. So banal nationalism isn’t “nationalism-lite”; it’s the very foundation upon which more dangerous nationalisms are built.
You can download a more polished two-page version of this argument, forthcoming in Contexts magazine.
Lisa Wade is a professor of sociology at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. You can follow her on Twitter and Facebook.
When my primary care physician, a wonderful doctor, told me he was retiring, he said, “I just can’t practice medicine anymore the way I want to.” It wasn’t the government or malpractice lawyers. It was the insurance companies.
This was long before Obamacare. It was back when President W was telling us that “America has the best health care system in the world”; back when “the best” meant spending twice as much as other developed countries and getting health outcomes that were no better and by some measures worse. (That’s still true).
Many critics then blamed the insurance companies, whose administrative costs were so much higher than those of public health care, including our own Medicare. Some of that money went to employees whose job it was to increase insurers’ profits by not paying claims. Back then we learned the word “rescission” – finding a pretext for cancelling the coverage of people whose medical bills were too high. Insurance company executives, summoned to Congressional hearings, stood their ground and offered some misleading statistics.
None of the Congressional representatives on the committee asked the execs how much they were getting paid. Maybe they should have.
Health care in the U.S. is a $2.7 trillion business, and the New York Times has an article about who’s getting the big bucks. Not the doctors, it turns out. And certainly not the people who have the most contact with sick people – nurses, EMTs, and those further down the chain. Here’s the chart from the article, with an inset showing those administrative costs.
As the fine print at the top of the chart says, these are just salaries – the walking-around money an exec gets for showing up. The real money is in the options and incentives.
In a deal that is not unusual in the industry, Mark T. Bertolini, the chief executive of Aetna, earned a salary of about $977,000 in 2012 but a total compensation package of over $36 million, the bulk of it from stocks vested and options he exercised that year.
The anti-Obamacare rhetoric has railed against a “government takeover” of medicine. It is, of course, no such thing. Obama had to remove the “public option”; Republicans prevented the government from fielding a team and getting into the game. Instead, we have had an insurance company takeover of medicine. It’s not the government that’s coming between doctor and patient, it’s the insurance companies. Those dreaded “bureaucrats” aren’t working for the government of the people, by the people, and for the people. They’re working for Aetna and WellPoint.
Even the doctors now sense that they too are merely working for The Man.
Doctors are beginning to push back: Last month, 75 doctors in northern Wisconsin [demanded] . . . health reforms . . . requiring that 95 percent of insurance premiums be used on medical care. The movement was ignited when a surgeon, Dr. Hans Rechsteiner, discovered that a brief outpatient appendectomy he had performed for a fee of $1,700 generated over $12,000 in hospital bills, including $6,500 for operating room and recovery room charges.
That $12,000 tab, for what it’s worth, is slightly under the U.S. average.
In Generation Me and The Narcissism Epidemic, psychologist Jean Twenge argues that we’re all becoming more individualistic. One measure of this is our willingness to go against the crowd. She offers many types of evidence, but I was particularly intrigued by her discussion of the afterlife of a famous experiment in psychology.
In 1951, social psychologist Solomon Asch placed eight male Swarthmore students around a table for an experiment in conformity. They were asked to consider two cards, one with three lines of differing lengths and another with one line. He asked each student, one by one, which line on the card of three was the same length as the lone line on the second card. Each group looked at 18 pairs of cards like this:
Asch was only interested in the last student’s response. The first seven were confederates. In six of the trials, Asch instructed all the confederates to give the correct answer. In the other twelve, however, the seven would all choose the same obviously wrong answer. Asch counted how often the eighth student would go against the crowd in these cases, breaking consensus and offering up a solitary, but correct, answer.
He found that conformity was surprisingly common. Three-quarters of the study subjects incorrectly went with the majority in at least one trial and a third did so half the time or more. This was considered a stunning example of people’s willingness to lie about what they are seeing with their own eyes in order to avoid rocking the boat.
But then there was Vietnam and anti-war protesters, hippies and free love, the women’s and gay liberation movement, and civil rights victories. By the 1960s, it was all about rejecting the establishment, saying no, and envisioning a more authentic life. Things changed. And so did this experiment.
By the mid-1990s, there were 133 replications of Asch’s study. Psychologists Rod Bond and Peter Smith decided to add them all up. They found that the tendency for individuals to conform to the group fell over time.
One of the abstract take-away points from this is that our psychologies — indeed, even our personalities — are malleable. In fact, the results of many studies, Twenge writes, suggest that “when you were born has more influence on your personality than the family who raised you.” When encountering claims of timeless and cultureless truths about human psychology, then, it is always good to ask ourselves what scientists might find a few decades later.
A new study of 10,000 Americans by the Pew Research Center finds that political polarization is more extreme than it has been at any time in the last 20 years. The median (or middle) Democrat and Republican are farther away from each other politically than in 2004 or 1994. “Today,” reports Pew, “92% of Republicans are to the right of the median Democrat, and 94% of Democrats are to the left of the median Republican.”
Animosity has grown as well. Over a quarter of Democrats and a third of Republicans see the other side as a “threat to the nation’s well being.” In total, 38% of Democrats and 43% of Republicans judge the other side to be “very unfavorable.”
Even more dramatically, it is the people at the extremes who are most likely to vote in elections and contribute to candidates. Today’s America is highly polarized, then, but the voting booth is even more so.
Pew concludes by noting that, even given this polarization, the majority of Americans are in the middle and are open to compromise between the parties. These individuals, however, are less politically active, whether out of indifference or distaste for the rancor, leaving politics to the most extreme among us.
On any given workday, over 31 million lunches are served to children in school cafeterias. Part of the U.S. Department of Agriculture’s (USDA) nutritional assistance efforts, the National School Lunch Program (NSLP) aims to deliver affordable and nutritious meals to the nation’s schoolchildren. After all, food plays a key part in helping them learn, grow, and thrive.
To reach those who need it most, the federal and local governments work together to offer free lunch to children whose parents cannot afford to pay for it. But money is just one way to pay for a meal: the ‘free’ school lunch comes at other costs.
For decades, nutrition was a secondary concern to the program’s planners: one year, eggs would be on the menu daily; another, they would hardly make an appearance. It wasn’t until the war, when politicians grew concerned about the ability of the nation’s men to fight, and until it became apparent that hungry children don’t do well in the classrooms they were newly required to sit in, that anyone took a serious look at what kids at school were actually eating.
Photo: Gary Tramontina (New York Times)
By that time, it was too late. The program was already run like a business, and not even the introduction of nutritional standards helped. Today, those standards are outdated – children snack rather than eat three square meals, and they are less physically active, so they require fewer calories – and almost impossible to follow within the budget restrictions school lunch planners face.
Private industry was quick to offer solutions, but it is more interested in profits than in schoolchildren’s waistlines. Enriched and fortified chips and candies of otherwise dubious nutritional value appear in school cafeterias and vending machines, often a more popular choice with kids than apples. Frozen and convenience foods are replacing fresh meals cooked on the premises. And the labyrinthine regulations on meal calorie content, coupled with cafeterias’ financial realities, often mean that adding more sugar to students’ plates is the only way to bring a meal’s fat content down.
The food itself is not the only factor contributing to children’s undesirable health outcomes. Economist Rachana Bhatt finds that the amount of time students have to enjoy lunch also matters. Students tight on time – they must squeeze getting to the cafeteria, standing in line, eating, and cleaning up into their lunch break – might skip the meal, leading them to overeat later, or eat more quickly, leading them to consume more because of the delay in feeling full. Even if all school lunches offered healthy options, time would complicate their relationship with health outcomes: Bhatt found that students who had less time for lunch were more likely to be overweight.
The lunch may be free when children choose their meal and sit down to eat it, then. But it may come at a substantial cost several years down the line, when a young adult is paying for diabetes medication and visits to the doctor to monitor their blood pressure.
Read Part II of “No Such Thing as a Free School Lunch.”
This February, President Obama sat down for dinner with his visiting French counterpart, François Hollande. In the company of the first lady, other government officials, and some celebrities, the men enjoyed an appetizer of Illinois caviar, Pennsylvania quail eggs, and twelve varieties of American potatoes. The main dish was a Colorado beef steak with mushrooms, Vermont cheese, and salad, followed by a dessert of Hawaiian chocolate cake, Florida tangerines, and Pennsylvania vanilla ice cream. Three types of wine accompanied the meal. And not just any wine: these were American wines made by French-born winemakers.
From the food to the wine, nothing in this meal was left to chance. But why was the encounter so carefully planned? Would it make a difference if, to celebrate the French-American friendship, the presidents raised a glass of Italian wine instead?
Food provides us with much more than physical sustenance: it is a symbol of relationships among individuals and groups. What was at stake at the February state dinner was not just pleasing the presidents’ palates, but nurturing ties within and between entire nations.
Photo: Dominic Episcopo
Imagine, first, that the diners were served tortillas or spaghetti as a main course instead of the dry-aged, family-owned-farm-raised rib eye beef steak they had. The former quickly evoke images of Mexico and Italy, while the latter tells a distinctly American story.
Serving dishes associated with particular countries is one way of fostering an imagined community – a nation state – which Benedict Anderson describes as being too great to be maintained by personal relationships, and one that must be continuously symbolized in order to persist. Especially on celebratory occasions, food takes part in producing and communicating national identities.
State dinners aren’t the only such example: another is the festive food served in New Year’s meals. The Vietnamese will eat a tet cake, the Belgians will have smoutebollen, and Slovenians will always have potica. In a melting-pot nation, sending a message of a coherent community is even more important. France used banquets in its post-revolutionary years to bring citizens together in defiance of regionally specific gastronomies, writes Julia Csergo. Similarly, during the state dinner, a steak symbolizing quintessential America amidst its diversity was the star of the presidents’ meal.
And imagine, second, what would happen if President Hollande refused any part of the meal. If he skipped the cheese, we might think he was suspicious of the way the U.S. regulates its dairy industry. If he only finished half his potatoes, would that mean American produce does not taste good enough for the French? And if he had rejected the dinner invitation to begin with, would that indicate the French dislike the US altogether?
Such presidential gestures would transcend his individual palate. Two political representatives sharing a meal are not only communicating their own food preferences, they are shaping a relationship between two communities. Using commensality as a political instrument is as old as the feasts of the ancient Greeks and Romans, writes Richard Ascough: the banquets that took place on special occasions served to maintain connections with the gods as much as to foster connections between citizens and to forge a political identity. Those who partook in the meal were considered part of a tight group, while those who were not invited, or worse yet, refused the invitation, cast themselves as outsiders. The American and the French presidents enjoying a meal together, then, symbolizes the nations’ peaceful coexistence and firm diplomatic ties.
Offering a bottle of Italian wine instead of a French-American one during the state dinner would not be a disaster, but it would certainly convey a different message, one perhaps of a somewhat colder relationship. But if we are to believe Mary Douglas’s classic 1972 text, Deciphering a Meal, just the fact that the presidents were sharing more than drinks is promising: we are rarely reluctant to share a drink with strangers, while sharing meals tends to be reserved for those to whom we wish to signal intimacy. The state dinner, conveniently held right before Valentine’s Day, was a political sign of affection.
Teja Pristavec is a graduate student in the sociology department, and an IHHCPAR Excellence Fellow, at Rutgers University. She blogs at A Serving of Sociology.