Originally published in the Harvard Business Review

Few people today call a doctor when they feel a bout of nostalgia coming on. But for 200 years, nostalgia was considered a dangerous disease that could trigger delusions, despair, and even death. A 17th-century Swiss physician coined the word to describe the debilitating algos (pain) felt by people who had left their nostos (native home). In the U.S. during the Civil War, Union Army doctors reported 5,000 serious cases of nostalgia, leading to 74 deaths. In Europe, physicians anxiously debated how to treat home-sickness and contain its spread.

Alarm waned toward the end of the 19th century, as experts came to believe that “modern industry” and “rapid communications” were making people more open to change and hence more resistant to the disease. And by the 20th century, researchers had begun to recognize a milder form of nostalgia that is actually quite healthy: a longing to reproduce a feeling once experienced with friends or family, rather than to literally return to another place or time. This kind of nostalgia makes people feel warmer themselves and act more warmly toward others, including strangers.

In recent decades, however, we have seen a revival of the more pernicious form of nostalgia, what we might call past-sickness. This is the longing to reproduce an idealized piece of history. When people are collectively nostalgic about their past experiences as members of a group or as inhabitants of an era, rather than individually nostalgic for their personal experiences, they start to identify more intensely with their own group and to judge members of other groups more negatively. They become less optimistic about their ability to forge new connections — and more hostile to people perceived as outsiders. When such nostalgia gets politicized, it can lead to delusions about a mythical, magical Golden Age of the homeland, supposedly ruined by interlopers.

Collective nostalgia invariably involves a denial of the racial, ethnic, and family diversity of the past, as well as its social injustices, creating romanticized myths that are easily refuted by anyone willing to confront historical realities. But the cure for the pathologies of past-sickness does not lie in the equally romanticized vision of modernization and innovation we have been offered for the last 40 years — something that might be called future nostalgia, or modernization-sickness.

For much of the 20th century, it was possible to argue that the inequities of life stemmed from the incomplete expansion of technology, industry, and the market, and would be resolved by further modernization. But for several decades it’s been clear that the gains of modernization for some have produced substantial losses for others. While the innovations of the past 40 years have opened more opportunities for professionals and affluent entrepreneurs than they have closed off, that’s not the case for many working-class, small-town, and rural men and women. The failure of policy makers and opinion leaders to acknowledge their losses has left the pain of the “losers” to curdle into a toxic mix of nationalism, racism, and conspiracy theories across Europe and the U.S.

Despite institutionalized discrimination, working-class Americans of all races made significant economic progress in the 35 years following World War II. While it’s true that white male workers were given preference over minorities and women in hiring and pay, most of the gains made by white working-class men in that era came not from their advantages over minorities but from their greater bargaining power vis-à-vis employers. The greater prevalence and power of unions was a huge factor, and although minority and female workers were only gradually admitted to those, strong unions tend to pull up wages in other sectors of the economy and act as a counterweight to business influence over government policy.

In that environment, labor took home a much larger share of economic growth than it does today. From 1947 to the start of the 1970s, every successive cohort of young men earned, on average, three times as much in constant dollars as their fathers had at the same age. And in every single economic expansion in those same years, 70% to 80% of the income growth went to the bottom 90% of the population. Economic disparities between big urban centers, small towns, and rural areas steadily narrowed.

Since the late 1970s, a very different set of trends has prevailed. Between 1980 and 2007, even before the Great Recession hit, the median real earnings of men age 25 to 34 with a high school diploma declined by 28%. Since 1980 every cohort of young men has earned less, on average, than their fathers did at the same age. Meanwhile, in periods of economic expansion the top 10% of earners have taken 95% or more of income growth. Similar increases in inequality have occurred in Europe and elsewhere. A new Oxfam study reports that the richest 1% of the world cornered 82% of the wealth created in 2017.

The reaction of the “creative classes” to these trends has been cavalier, to say the least. Despite the clear signs of working-class distress in the 1980s and early 1990s, most pundits insisted that the real story of the era was “the explosion” of new and ever-cheaper consumer conveniences produced by technological advances and globalization. Economics columnist Robert Samuelson dismissed worries about job losses and wage cuts as “alarmist hype” that had American families “feeling bad about doing well.” Conservative columnist George Will speculated that modern affluence had produced so much “leisure, abundance, and security” that our brains, which evolved to deal with constant hazards, had gotten “bored.” Even the socially conscious Microsoft founder Bill Gates was complacent: “Entire professions and industries will fade. But new ones will flourish…. The net result is that more gets done, raising the standard of living in the long run.”

During the Great Recession, pundits briefly discovered that “average” increases in income often mask serious inequalities, but that went out the window as soon as the economy started growing again. Last fall the chief global strategist at Morgan Stanley brushed aside worries about job losses due to automation, arguing that “when new technology destroys, it leaves behind a layer of ash in which new jobs grow.” This January, after yet another year of global job gains without wage gains, a writer in Bloomberg News breezily announced that “brisk growth that’s not shared by all is better than no growth at all.” Besides, “there’s basically no country in the world where the consumer is not doing well,” added Bart van Ark, chief economist at The Conference Board.

As for the people who actually provide those affordable consumer goods and services? In the U.S., the “recovery” exacerbated the 40-year rise in economic inequality and insecurity. A survey of the job and business gains in the U.S. between 2011 and 2015 found that most were confined to the wealthiest 20% of zip codes in the country. The bottom 60% of zip codes together got just one in four of the new jobs created in those years. And the 20% of zip codes that were most distressed before the recession continued to lose jobs and businesses throughout the “recovery.” In 2007 the bottom 90% of the population held 28.6% of America’s total wealth. As of 2016, that had fallen to 22.8%.

Despite futurist predictions that the information revolution would lead to the “death of distance,” a few coastal enclaves and political or technical centers have continued to garner a disproportionate share of resources, reversing the 40 years of economic convergence among regions that occurred after 1940. The average per capita income advantage of Washington, DC and New York City over the rest of the country doubled between 1980 and 2013. Average airfares per mile to “loser” regions are now often nearly twice as high as to the “winners,” while many towns have lost rail service altogether.

Like nostalgia epidemics of the past, our recent outbreak was triggered by an understandable sense of loss and disorientation. But there’s an interesting difference between past and present in the groups most vulnerable to the disease. From the 17th to the 19th century, pathological nostalgia was seen most often among people who moved away from the communities in which they had been raised — often bettering themselves materially but feeling lost and isolated in their new surroundings. Today the upwardly and geographically mobile have easy access to new technologies, professional networks, and flexible work and consumption techniques that allow them to navigate unfamiliar territory and make themselves at home wherever they go.

Those same innovations, however, have marginalized individuals whose identity, security, and livelihood depend on their familiarity with a particular place and set of skills, and their placement within long-standing personal networks that involve relations of mutual dependence and reciprocity. These include industrial workers who get jobs at a local factory because a relative puts in a good word with the foreman; farmers, feed suppliers, and farm equipment mechanics who rely on clients or employees who are also neighbors; and local businesses that depend on personal connections with their customers.

Today the most debilitating nostalgia is found among those who cannot or do not want to move — and should not have to — but see the traditional sources of security that their native land, or nostos, once provided being dismantled or relocated, while their habits, skills, and social relationships are devalued. Instead of leaving their homes behind, they feel left behind in their homes.

As always, working-class African Americans, Latinos, and Native Americans suffer disproportionately from job losses, wage cuts, and increased volatility. Zip codes where most residents are racial or ethnic minorities are twice as likely as predominantly white zip codes to be in economic distress. Still, whites account for a significant portion — 44% — of the more than 52 million Americans in the most distressed communities. This shared exclusion from the rewards of modernization ought to be a source of solidarity, not division, but division is what happens when one group romanticizes where we’ve come from and another romanticizes where we’re going, instead of carefully examining the gains, losses, and hard trade-offs of the here-and-now.

To cure this outbreak of past-sickness, the winners in this system must stop pretending that the answer is more of the same, with a little more diversity at the top. To make modernization work for all, we must take a more critical look at how we measure economic and technological progress. Self-driving cars and delivery drones may save some people time and money, but they take away other people’s livelihoods. To stem the contagion of pathological nostalgia, we need to inoculate ourselves with a dose of the healthy nostalgia that spurs us to integrate the best values and ideas of the past into the improvements and advances we promote.

One of those values is the traditional democratic belief that the people who grow our food, make our coffee, fix our cars, educate our children, nurse our sick, and pick up our garbage are at least as essential to a healthy society as the people who invent new algorithms for stock trading, social media, and marketing. They deserve to live in thriving communities, send their kids to good schools, earn a living wage, and get home in time to enjoy dinner with whomever they count as family.

Stephanie Coontz is the CCF Director of Research and Education and a Professor of History at The Evergreen State College.