
National Ugly Christmas Sweater Day has come and gone, falling this year on Friday, December 18th. Perhaps you’ve noticed the recent ascent of the Ugly Christmas Sweater or even been invited to an Ugly Christmas Sweater Party. How do we account for this trend and its call to “don we now our tacky apparel”?

Search volume for the term “ugly Christmas sweater” relative to other searches over time (c/o Google Trends):

Ugly Christmas Sweater parties purportedly originated in Vancouver, Canada, in 2001. Their appeal might seem to stem from their role as a vehicle for ironic nostalgia, an opportunity to revel in all that is festively cheesy. They might also provide an occasion for collective effervescence, celebrating the well-intentioned (but hopelessly tacky) holiday apparel bestowed by moms and grandmas.

However, The Atlantic points to a more complex reason why we might enjoy the cheesy simplicity offered by Ugly Christmas Sweaters: “If there is a war on Christmas, then the Ugly Christmas Sweater, awesome in its terribleness, is a blissfully demilitarized zone.” This observation pokes fun at the Fox News-style hysterics regarding the “War on Christmas”; despite the word “Christmas” in their name, the notion seems to persist that celebrating Ugly Christmas Sweaters is an inclusive and “safe” activity.

We might also consider the generally fraught nature of the holidays (which are financially and emotionally taxing for many), suggesting that the Ugly Sweater could offer an escape from individual holiday stress. There is no shortage of sociologists who can speak to the strain of family, consumerism, and mental health issues that plague the holidays, to say nothing of the particular gendered burdens they produce. Perhaps these parties represent an opportunity to shelve those tensions.

But how do we explain the fervent communal desire for simultaneous festive celebration and escape? Fred Davis notes that nostalgia is invoked during periods of discontinuity. This can occur at the individual level when we use nostalgia to “reassure ourselves of past happiness.” It may also function as a collective response – a “nostalgia orgy” – whereby we collaboratively reassure ourselves of shared past happiness through cultural symbols. The Ugly Christmas Sweater becomes a freighted symbol of past misguided, but genuine, familial affection and unselfconscious enthusiasm for the holidays – it doesn’t matter that we have not all really had the actual experience of receiving such a garment.

Jean Baudrillard might call the process of mythologizing the Ugly Christmas Sweater a simulation, a collapsing between reality and representation. And, as George Ritzer points out, simulation can become a ripe target for corporatization as it can be made more spectacular than its authentic counterparts. We need only look at the shift from the “authentic” prerogative to root through one’s closet for an ugly sweater bestowed by grandma (or even to retrieve from the thrift store a sweater imparted by someone else’s grandma) to the cottage industry that has sprung up to provide ugly sweaters to the masses. The need for collective nostalgia appears to outstrip the supply of “actual” Ugly Christmas Sweaters at our disposal.

Colin Campbell states that consumption involves not just purchasing or using a good or service, but also selecting and enhancing it. Accordingly, our consumptive obligation to the Ugly Christmas Sweater becomes more demanding, individualized and, as Ritzer predicts, spectacular. For example, we can view this intensive guide for DIY ugly sweaters. If DIY isn’t your style, you can indulge your individual (but mass-produced) tastes in NBA-inspired or cultural mash-up Ugly Christmas Sweaters, or these Ugly Christmas Sweaters that aren’t even sweaters at all.

The ironic appeal of the Ugly Christmas Sweater Party is that one can be deemed festive for partaking, while simultaneously ensuring that one is participating in a “safe” celebration – or even a gentle mockery – of holiday saturation and demands. The ascent of the Ugly Christmas Sweater has involved a transition from ironic nostalgia vehicle to a corporatized form of escapism, one that we are induced to participate in as a “safe” form of festive simulation that becomes increasingly individualized and demanding in expression.

Re-posted at Pacific Standard.

Kerri Scheer is a PhD Student working in law and regulation in the Department of Sociology at the University of Toronto. She thanks her colleague Allison Meads for insights and edits on this post. You can follow Kerri on Twitter.

I don’t have much to add on the “consensus plan” on poverty and mobility produced by the Brookings and American Enterprise institutes, referred to in their launch event as being on “different ends of the ideological spectrum” (can you imagine?). In addition to the report, you might consider the comments by Jeff Spross, Brad DeLong, or the three-part series by Matt Bruenig.

My comment is about the increasingly (to me) frustrating description of poverty as something beyond simple comprehension and unreachable by mortal policy. It’s just not. The whole child poverty problem, for example, amounts to $62 billion per year. There are certainly important details to be worked out in how to eliminate it, but the basic idea is pretty clear — you give poor people money. We have plenty of it.

This was obvious yet amazingly went unremarked in the first 40 minutes of the launch event (which is all I watched). The opening presentation, by Ron Haskins — for whom I have a well-documented distaste — started with this simple chart of official poverty rates:


He started with the blue line, poverty for elderly people, and said:

The blue line is probably the nation’s greatest success against poverty. It’s the elderly. And it basically has declined pretty much all the time. It has no relationship to the economy, and there is good research that shows that it’s caused at least 90% by Social Security. So, government did it, and so Social Security is the reason we’re able to be successful to reduce poverty among the elderly.

And then everyone proceeded to ignore the obvious implication of that: when you give people money, they aren’t poor anymore. The most unintentionally hilarious illustration of this was in the keynote (why?) address from David Brooks (who has definitely been working on relaxing lately, especially when it comes to preparing keynote puff-pieces). He said this, according to my unofficial transcript:

Poverty is a cloud problem and not a clock problem. This is a Karl Popper distinction. He said some problems are clock problems – you can take them apart into individual pieces and fix them. Some problems are cloud problems. You can’t take a cloud apart. It’s a dynamic system that is always interspersed. And Popper said we have a tendency to try to take cloud problems and turn them into clock problems, because it’s just easier for us to think about. But poverty is a cloud problem. … A problem like poverty is too complicated to be contained by any one political philosophy. … So we have to be humble, because it’s so gloomy and so complicated and so cloud-like.

The good news is that for all the complexity of poverty, and all the way it’s a cloud, it offers a political opportunity, especially in a polarized era, because it’s not an either/or issue. … Poverty is an and/and issue, because it takes a zillion things to address it, and some of those things are going to come from the left, and some are going to come from the right. … And if poverty is this mysterious, unknowable, negative spiral-loop that some people find themselves in, then surely the solution is to throw everything we think works at the problem simultaneously, and try in ways we will never understand, to have a positive virtuous cycle. And so there’s not a lot of tradeoffs, there’s just a lot of throwing stuff in. And social science, which is so prevalent in this report, is so valuable in proving what works, but ultimately it has to bow down to human realities – to psychology, to emotion, to reality, and to just the way an emergent system works.

Poverty is only a “mysterious, unknowable, negative spiral-loop” if you specifically ignore the lack of money that is its proximate cause. Sure, spend your whole life wondering about the mysteries of human variation — but could we agree to do that after taking care of people’s basic needs?

I wonder if poverty among the elderly once seemed like a weird, amorphous, confusing problem. I doubt it. But it probably would if we had assumed that the only way to solve elderly poverty was to get children to give their parents more money. Then we would have to worry about the market position of their children, the timing of their births, the complexity of their motivations and relationships, the vagaries of the market, and the folly of youth. Instead, we gave old people money. And now elderly poverty “has declined pretty much all the time” and “it has no relationship to the economy.”

Imagine that.

Originally posted at Family Inequality; re-posted at Pacific Standard.

Philip N. Cohen, PhD is a professor of sociology at the University of Maryland, College Park. He is the author of The Family, a sociology of family textbook, and writes the blog Family Inequality. You can follow him on Twitter or Facebook.

They hold the same amount of wealth.

We all know that wealth is unequally distributed in the US. But, the results of a new study by the Institute for Policy Studies, authored by Chuck Collins and Josh Hoxie, are still eye popping.

Collins and Hoxie find that the wealthiest 0.1 percent of US households, an estimated 115,000 households with a net worth starting at $20 million, own more than 20 percent of total US household wealth. That is up from 7 percent in the 1970s. This group owns approximately the same total wealth as the bottom 90 percent of US households.

Moving up the wealth ladder, they calculate that the top 400 people (yes, people, not households), each with a net worth starting at $1.7 billion, have more wealth than the bottom 61 percent of the US population, an estimated 70 million households or 194 million people.

Finally, we get to the top 20 people, those sitting at the pinnacle of the US wealth distribution. As the authors explain:

The wealthiest 20 individuals in the United States today hold more wealth than the bottom half of the U.S. population combined. These 20 super wealthy — a group small enough to fly together on one Gulfstream G650 private jet — have as much wealth as the 152 million people who live in the 57 million households that make up the bottom half of the U.S. population.


Although obvious, it is still worth emphasizing, as Collins and Hoxie do, that great wealth translates into great power, the power to shape economic policies. And, in a self-reinforcing cycle, the resulting policies, by design, create new opportunities for the wealthy to capture more wealth. Think: free trade agreements, privatization policies, tax policy, and labor and environmental laws and regulations.

Oh yes, also think presidential politics. As a New York Times study points out:

They are overwhelmingly white, rich, older and male . . . . Across a sprawling country, they reside in an archipelago of wealth, exclusive neighborhoods dotting a handful of cities and towns… Now they are deploying their vast wealth in the political arena, providing almost half of all the seed money raised to support Democratic and Republican presidential candidates. Just 158 families, along with companies they own or control, contributed $176 million in the first phase of the campaign, a New York Times investigation found (emphasis added).

And yet, one still hears some people say that class analysis has no role to play in explaining the dynamics of the US political economy. Makes you wonder who pays their salary.

Originally posted at Reports from the Economic Front.

Martin Hart-Landsberg is a professor of economics at Lewis and Clark College. You can follow him at Reports from the Economic Front.

“That’s private equity for you,” said Steve Jenkins. He was standing outside the uptown Fairway grocery at 125th St. about to go to breakfast at a diner across the street. He no longer works at Fairway.

Steve was one of the early forces shaping Fairway back when it was just one store at 74th and Broadway. He hired on as their cheese guy. “What do you want that for?” he growled at me one day long ago when he saw me with a large wedge of inexpensive brie. “That’s the most boring cheese in the store.” He was often abrasive, rarely tactful. I tried to explain that it was for a party and most of the people wouldn’t care. He would have none of it. He cared. He cared deeply – about cheese, about food generally.

He helped Fairway expand from one store to two, then four. He still selected the cheeses. He wrote the irreverent text for their signs, including the huge electric marquee that drivers on the West Side Highway read. And then in 2007 Fairway got bought out by a private equity firm. The three original founders cashed out handsomely. Steve and others stayed on. Much of their share of the deal was in Fairway stock, but with restrictions that prevented them from selling.

Fairway kept expanding – stores in more places around New York – and they aimed more at the median shopper. Gradually, the store lost its edge, its quirkiness. With great size comes great McDonaldization – predictability, calculability. “Like no other market,” says every Fairway sign and every Fairway plastic bag. But it became like lots of other markets, with “specials” and coupons. Coupons! Fairway never had coupons. Or specials.

The people who decided to introduce coupons and specials were probably MBAs who knew about business and management and maybe even research on the retail food business. They knew about costs and profits. Knowing about food was for the people below them, people whose decisions they could override.

“I gotta get permission from corporate if I want to use my cell phone,” said Peter Romano, the wonderful produce manager at 74th St. – another guy who’d been there almost from the start. He knew produce like Steve knew cheese. Peter, too, left Fairway a few months ago.

Maybe this is what happens when a relatively small business gets taken over by ambitious suits. Things are rationalized, bureaucratized. And bureaucracy carries an implicit message of basic mistrust:

If we trusted you, we wouldn’t make you get approval. We wouldn’t make you fill out these papers about what you’re doing; we’d just let you do it. These procedures are our way of telling you that we don’t trust you to do what you say you’re doing.

The need for predictability, efficiency, and calculability leaves little room for improvisation. The food business becomes less about food, more about business. It stops being fun. The trade-off should be that you get more money. But there too, Fairway’s new management disappointed. They expanded rapidly, putting new stores in questionable locations. In the first months after the private equity firm took Fairway public in 2013, the stock price was as high as $26 a share. Yesterday, it closed at $1.04. The shares that Steve Jenkins and others received as their part of the private equity buyout are practically worthless.


Steve Jenkins will be all right. He’s well known in food circles. He’s been on television with Rachael Ray, Jacques Pépin. Still, there he was yesterday morning outside the store whose cheeses and olive oils had been his dominion. “I’m sixty-five years old, and I’m looking for a job.”

Originally posted at Montclair SocioBlog; re-posted at Pacific Standard.

Jay Livingston is the chair of the Sociology Department at Montclair State University. You can follow him at Montclair SocioBlog or on Twitter.

The 1% in America have an out-sized influence on the political process. What policies do they support? And do their priorities differ from those of less wealthy Americans?

Political scientist Benjamin Page and two colleagues wanted to find out, so they started trying to set up interviews with the richest of the rich. This, they noted, was really quite a feat, writing:

It is extremely difficult to make personal contact with wealthy Americans. Most of them are very busy. Most zealously protect their privacy. They often surround themselves with professional gatekeepers whose job it is to fend off people like us. (One of our interviewers remarked that “even their gatekeepers have gatekeepers.”) It can take months of intensive efforts, pestering staffers and pursuing potential respondents to multiple homes, businesses, and vacation spots, just to make contact.

Persistence paid off. They completed interviews with 83 individuals with net worths in the top 1%. Their mean wealth was over $14 million and their average income was over $1 million a year.

Page and his colleagues learned that these individuals were highly politically active. A majority (84%) said they paid attention to politics “most of the time,” 99% voted in the last presidential election, 68% contributed money to campaigns, and 41% attended political events.

Many of them were also in contact with politicians or officials. Nearly a quarter had conversed with individuals staffing regulatory agencies and many had been in touch with their own senators and representatives (40% and 37% respectively) or those of other constituents (28%).

These individuals also reported opinions that differed from those of the general population. Some differences really stood out: the wealthy were substantially less likely to want to expand support for job programs, the environment, homeland security, healthcare, food stamps, Social Security, and farmers. Most, for example, are not particularly concerned with ensuring that all Americans can work and earn a living wage.


Just over half (58%) think that the government should ensure equal schooling for whites and racial minorities, only a third (35%) believe that all children deserve to go to “really good public schools,” and only a quarter (28%) think that everyone who wants to go to college should be able to do so.


The wealthy generally opposed regulation of Wall Street firms, food producers, the oil industry, the health insurance industry, and big corporations, all of which the general public favors. A minority of the wealthy (17%) believed that the government should reduce class inequality by redistributing wealth, compared to half of the general population (53%).

Interestingly, Page and his colleagues also compared the answers of the top 0.1% with the remainder of the top 1%. The top 0.1%, individuals with $40 million or more net worth, held views that deviated even farther from the general public.

These attitudes may explain why politicians take positions with which the majority of Americans disagree. “[T]he apparent consistency between the preferences of the wealthy and the contours of actual policy in certain important areas,” they write, “— especially social welfare policies, and to a lesser extent economic regulation and taxation — is, at least, suggestive of significant influence.”

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Daniel Drezner once wrote about how international relations scholars would react to a zombie epidemic. Aside from the sheer fun of talking about something as silly as zombies, it had much the same illuminating satiric purpose as “how many X does it take to screw in a lightbulb” jokes. If you have even a cursory familiarity with the field, it is well worth reading.

Here’s my humble attempt to do the same for several schools within sociology.

Public Opinion. Consider the statement that “Zombies are a growing problem in society.” Would you:

  1. Strongly disagree
  2. Somewhat disagree
  3. Neither agree nor disagree
  4. Somewhat agree
  5. Strongly agree
  6. Um, how do I know you’re really with NORC and not just here to eat my brain?

Criminology. In some areas (e.g., Pittsburgh, Raccoon City), zombification is now more common than attending college or serving in the military and must be understood as a modal life course event. Furthermore, as seen in audit studies, employers are unwilling to hire zombies, and so the mark of zombification has persistent and reverberating effects throughout undeath (at least until complete decomposition and putrefaction). However, race trumps humanity as most employers prefer to hire a white zombie over a black human.

Cultural toolkit. Being mindless, zombies have no cultural toolkit. Rather the great interest is understanding how the cultural toolkits of the living develop and are invoked during unsettled times of uncertainty, such as an onslaught of walking corpses. The human being besieged by zombies is not constrained by culture, but draws upon it. Actors can draw upon such culturally-informed tools as boarding up the windows of a farmhouse, shotgunning the undead, or simply falling into panicked blubbering.

Categorization. There’s a kind of categorical legitimacy problem to zombies. Initially zombies were supernaturally animated dead, they were sluggish but relentless, and they sought to eat human brains. In contrast, more recent zombies tend to be infected with a virus that leaves them still living in a biological sense but alters their behavior so as to be savage, oblivious to pain, and nimble. Furthermore, even supernatural zombies are not a homogeneous set but encompass varying degrees of decomposition. Thus the first issue with zombies is defining what is a zombie and whether it is commensurable with similar categories (like an inferius in Harry Potter). This categorical uncertainty has effects in that insurance underwriters systematically undervalue life insurance policies against monsters that are ambiguous to categorize (zombies) as compared to those that fall into a clearly delineated category (vampires).

Neo-institutionalism. Saving humanity from the hordes of the undead is a broad goal that is easily decoupled from the means used to achieve it. Especially given that human survivors need legitimacy in order to command access to scarce resources (e.g., shotgun shells, gasoline), it is more important to use strategies that are perceived as legitimate by trading partners (i.e., other terrified humans you’re trying to recruit into your improvised human survival cooperative) than to develop technically efficient means of dispatching the living dead. Although early on strategies for dealing with the undead (panic, “hole up here until help arrives,” “we have to get out of the city,” developing a vaccine, etc) are practiced where they are most technically efficient, once a strategy achieves legitimacy it spreads via isomorphism to technically inappropriate contexts.

Population ecology. Improvised human survival cooperatives (IHSC) demonstrate the liability of newness in that many are overwhelmed and devoured immediately after formation. Furthermore, IHSC demonstrate the essentially fixed nature of organizations as those IHSC that attempt to change core strategy (e.g., from “let’s hole up here until help arrives” to “we have to get out of the city”) show a greatly increased hazard for being overwhelmed and devoured.

Diffusion. Viral zombieism (e.g. Resident Evil, 28 Days Later) tends to start with a single patient zero whereas supernatural zombieism (e.g. Night of the Living Dead, the “Thriller” video) tends to start with all recently deceased bodies rising from the grave. By seeing whether the diffusion curve for zombieism more closely approximates a Bass mixed-influence model or a classic s-curve we can estimate whether zombieism is supernatural or viral, and therefore whether policy-makers should direct grants towards biomedical labs to develop a zombie vaccine or the Catholic Church to give priests a crash course in the neglected art of exorcism. Furthermore, marketers can plug plausible assumptions into the Bass model so as to make projections of the size of the zombie market over time, and thus how quickly to start manufacturing such products as brain-flavored Doritos.
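The distinction in the Diffusion entry can be sketched in a few lines of code. This is a hedged illustration, not anything from the original post: it simulates a discrete-time Bass mixed-influence model, and the parameter values (`p`, `q`) are purely made-up assumptions chosen to contrast a contagion-driven s-curve with an external-shock curve.

```python
# Discrete-time Bass mixed-influence model: adoption is driven by an
# external coefficient p ("innovation") and an internal coefficient q
# ("imitation"/contagion). All parameter values below are illustrative.

def simulate(p, q, m=1.0, steps=60, dt=1.0):
    """Return the cumulative adoption path for population size m."""
    n = 0.0
    path = []
    for _ in range(steps):
        # Bass hazard: (p + q * n/m) applied to the not-yet-adopted pool.
        n += dt * (p + q * n / m) * (m - n)
        path.append(n)
    return path

# Viral zombieism: spread almost entirely by contact (q >> p), so the
# epidemic starts from near-zero and follows a classic s-curve.
viral = simulate(p=0.001, q=0.5)

# Supernatural zombieism: a large external shock (the recently deceased
# all rise at once), so p dominates and the curve jumps early.
supernatural = simulate(p=0.3, q=0.1)

# Early on, the supernatural curve is far ahead of the viral one.
print(viral[4] < supernatural[4])
```

Comparing where an observed diffusion curve sits between these two shapes is, in principle, how one would back out whether `p` or `q` dominates, and hence whether the grant money goes to the biomedical labs or the exorcists.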

Social movements. The dominant debate is the extent to which anti-zombie mobilization represents changes in the political opportunity structure brought on by complete societal collapse as compared to an essentially expressive act related to cultural dislocation and contested space. Supporting the latter interpretation is that zombie hunting militias are especially likely to form in counties that have seen recent increases in immigration. (The finding holds even when controlling for such variables as gun registrations, log distance to the nearest army administered “safe zone,” etc.).

Family. Zombieism doesn’t just affect individuals, but families. Having a zombie in the family involves an average of 25 hours of care work per week, including such tasks as going to the butcher to buy pig brains, repairing the boarding that keeps the zombie securely in the basement and away from the rest of the family, and washing a variety of stains out of the zombie’s tattered clothing. Almost all of this care work is performed by women and very little of it is done by paid care workers as no care worker in her right mind is willing to be in a house with a zombie.

Applied micro-economics. We combine two unique datasets, the first being military satellite imagery of zombie mobs and the second records salvaged from the wreckage of Exxon/Mobil headquarters showing which gas stations were due to be refueled just before the start of the zombie epidemic. Since humans can use salvaged gasoline either to set the undead on fire or to power vehicles, chainsaws, etc., we have a source of plausibly exogenous heterogeneity in showing which neighborhoods were more or less hospitable environments for zombies. We show that zombies tended to shuffle towards neighborhoods with low stocks of gasoline. Hence, we find that zombies respond to incentives (just like school teachers, and sumo wrestlers, and crack dealers, and realtors, and hookers, …).

Grounded theory. One cannot fully appreciate zombies by imposing a pre-existing theoretical framework on zombies. Only participant observation can allow one to provide a thick description of the mindless zombie perspective. Unfortunately scientistic institutions tend to be unsupportive of this kind of research. Major research funders reject as “too vague and insufficiently theory-driven” proposals that describe the intention to see what findings emerge from roaming about feasting on the living. Likewise IRB panels raise issues about whether a zombie can give informed consent and whether it is ethical to kill the living and eat their brains.

Ethnomethodology. Zombieism is not so much a state of being as a set of practices and cultural scripts. It is not that one is a zombie but that one does being a zombie such that zombieism is created and enacted through interaction. Even if one is “objectively” a mindless animated corpse, one cannot really be said to be fulfilling one’s cultural role as a zombie unless one shuffles across the landscape in search of brains.

Conversation Analysis.

Cross-posted at Code and Culture.

Gabriel Rossman is a professor of sociology at UCLA. His research addresses culture and mass media, especially pop music radio and Hollywood films, with the aim of understanding diffusion processes. You can follow him at Code and Culture.

There is a light bulb in a fire station in Livermore, CA that has been burning since 1901. It was manufactured in the late 1890s. And, yes, there is a BulbCam.


According to Hunter Oatman-Stanford, writing for Collectors Weekly, the first homes that had electricity were serviced entirely by electric companies. He explained:

Generally, customers would purchase entire electrical systems manufactured by a regional supplier who would handle installation and upkeep. If a bulb “burned out,” meaning the filament had deteriorated from repeated heating, someone would come and replace it for you [for free].

Given this business model, it made sense to try to develop bulbs that would burn out as infrequently as possible, and the goal was to make ones that would last forever. The one in Livermore was made by the Shelby Electric Company and, interestingly, no one remembers what they did to make their time-defying bulbs. For now, at least, their secrets are a mystery.

Only later, when electric companies turned over the job of replacing lightbulbs to homeowners, did they decide that it would be more profitable to make cheap bulbs that burned out frequently. As of around 1910, companies were charging the equivalent of $33 for a 1,500-hour lamp (about the same lifespan as an incandescent bulb today). Yikes. At least the price has gone down.

We call this planned obsolescence: the practice of designing products with a predetermined expiration date aimed at forcing consumers into repeat purchases. Since the mid-1900s, more and more products have been literally designed to fail. In some cases, we seem to have fully accepted cyclic purchasing (think, for example, of the constant replacing of our electronic devices) or we are embarrassed into doing so (think fashion and the stigma of driving an old car). Other times, like with the lightbulb, we just assume that this is the best engineers can do.

Planned obsolescence is criticized for being wasteful. How many light bulbs sit in landfills today? How many natural resources have we extracted or burned up to make their replacements? How many cargo ships and semis have been filled with lightbulbs and taken around the world?

The little lightbulb in Livermore is a great reminder that just because we live in technologically advanced societies doesn’t mean we always have access to the most advanced technology. Other forces are at work.

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

The Federal Reserve has announced that it is holding off on an interest rate hike; the last time it raised rates was in 2006.  The reason for the lack of action: the Federal Reserve believes the economy remains fragile and, since inflation remains low, it doesn’t want to do anything that might bring the expansion to a halt.

In reality our economic problems go much deeper than slow growth and economic fragility.  Bluntly said, most workers are losing ground regardless of whether the economy is in recession or expansion.

The following chart, from a New York Times article, shows the movement in real, inflation adjusted, median household income from 1999 to 2014.

2

The median household income was $53,657 in 2014.  That was 1.5 percent below what it was in 2013.  Perhaps even more disturbing, as the Times article notes:

The 2014 real median income number is 6.5 percent below its 2007, pre-crisis level. It is 7.2 percent below the number in 1999.

A middle-income American family, in other words, makes substantially less money in inflation-adjusted terms than it did 15 years ago. And there is no evidence that is reversing…

The depressing data on middle-class wages is true across almost all groups based on race and age. (One exception is a 5.3 percent gain in median wages among Hispanics in 2014, though that is within the statistical margin of error and so may not be meaningful).

And there is good reason for believing that things are unlikely to improve in the near future. As a recent study by the National Employment Law Project makes clear, real wages are continuing to fall for most workers.

The authors of the National Employment Law Project study “calculated the percentage change in real median hourly wages from 2009 to 2014 for 785 occupations, which were grouped into quintiles, each representing approximately one-fifth of total employment in 2014.”  Figure 1 shows the change in real wages for each of the five quintiles over the period.  As we can see, real median hourly wages fell across the board, with the overall median wage falling by 4 percent.
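The calculation described above can be sketched roughly as follows. This is a hedged illustration only: the occupation data is synthetic, and an employment-weighted mean stands in for the within-quintile median that the NELP authors actually computed.

```python
# Sketch of the NELP-style method: sort occupations by wage, cut them into
# quintiles each covering about one-fifth of 2014 employment, and compute
# the percentage change in real wages per quintile. Synthetic data only.

occupations = [
    # (real wage 2009, real wage 2014, 2014 employment) -- made-up values
    (9.00, 8.50, 500), (11.00, 10.80, 300), (15.00, 14.20, 400),
    (20.00, 19.40, 350), (28.00, 27.50, 250), (45.00, 44.10, 200),
]

occupations.sort(key=lambda o: o[1])            # low- to high-wage
total_emp = sum(emp for _, _, emp in occupations)

# Assign each occupation to an employment-weighted quintile.
quintiles = [[] for _ in range(5)]
cum = 0
for occ in occupations:
    quintiles[min(4, 5 * cum // total_emp)].append(occ)
    cum += occ[2]

changes = []
for i, group in enumerate(quintiles, start=1):
    if not group:
        continue
    # Employment-weighted average wage in each year (a stand-in for the
    # within-quintile median used in the actual study).
    w09 = sum(o[0] * o[2] for o in group) / sum(o[2] for o in group)
    w14 = sum(o[1] * o[2] for o in group) / sum(o[2] for o in group)
    changes.append(100 * (w14 - w09) / w09)
    print(f"Quintile {i}: {changes[-1]:+.1f}%")
```

With every occupation's real wage falling in this toy data, each quintile shows a decline, mirroring the across-the-board pattern in Figure 1.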

3

Figure 2 keeps the same wage groupings but shows the change in wages for both the highest (90th percentile) and lowest (10th percentile) earners in each wage quintile. As we can see, with the exception of occupations in the lowest paid quintile, the fall in wages was greater for those in the bottom percentile than for those in the top percentile.  That said, the most striking fact is that all suffered declines in real wages.

4

Steady as she goes, which seems to be the strategy of most policy-makers, is unlikely to turn things around.

Originally posted at Reports from the Economic Front.

Martin Hart-Landsberg is a professor of economics at Lewis and Clark College. You can follow him at Reports from the Economic Front.