
A couple of years ago I posted a segment from Sesame Street featuring Jesse Jackson leading kids in a chant of “I am somebody,” including the lines “I may be poor” and “I may be on welfare.” I wrote about the changes in public discourse about welfare since the 1970s, and how surprising the segment seems now.

Aliyah C. sent in two more Sesame Street videos that illustrate changing norms, particularly regarding what we think it’s acceptable to expose children to. In both cases, a woman is breastfeeding her child in public (in the first case, openly; in the second, covered by a blanket) and explains to an onlooker that the baby is drinking milk from her breast:

Despite the fact that breastfeeding is widely hailed now as the ideal method of feeding babies, Aliyah said it was hard for her to imagine the topic being treated so casually on a children’s show now, or a woman using the word “breast” on Sesame Street without the show facing a lot of outrage.

Recently, Elizabeth Warren — Harvard Law professor and Massachusetts Senate candidate — was filmed discussing arguments that efforts to raise taxes on extremely high-income earners are “class warfare,” an increasingly common refrain. She responds to this line of argument by questioning the individualist narrative of wealth — that is, that people who are rich did it all on their own, and thus owe nothing to society. As she points out, taxpayer-funded infrastructure and services — from highways to law enforcement to widely-available education — are essential elements of such financial success stories. But current discourse about wealth and taxes obscures the social nature of wealth creation, portraying taxation as unfair taking rather than a fair return on the public’s investment:

Transcript after the jump.


Recently, Raz sent in this image of cans of WD-40, part of their Collectible Military Series, for sale at an auto parts store:

The types of war-related advertising we see can give us insights about how average Americans are connected to, and affected by, different wars. During many U.S. wars, contributing to the war effort was the duty of every citizen; this is particularly apparent with World War II. The draft, the deployment of some 16 million Americans, and public calls to purchase war bonds and ration food meant that war was nearly everyone’s concern. In contrast, the current War on Terrorism mostly impacts only those directly connected to it — military families. There are no widespread calls to ration, buy war bonds, or otherwise support the war effort through employment, growing vegetables, saving scrap metal, or other changes to our daily lives. My own research shows that members of military families feel the war is ignored and forgotten by most Americans. They feel isolated in their daily anxieties and their efforts to support their loved ones.

Products like the WD-40 Collectible Military Series were more common during WWII than they are now. During WWII advertising used the war cause and feelings of patriotism to sell a wide range of products that, ads argued, would help the U.S. win. Some were clearly connected to the war effort:

With others, the connection was much less obvious or direct:

Both Schlitz and Camel donated to the war effort. Similarly, with its “Drop and Give Me 40” campaign, WD-40 donated part of its profits to charities that support service members and their families:

For each can purchased from March 2011 through May 2011, WD-40 Company donated 10 cents to three charities that help active-duty military, wounded warriors, retired veterans and their families. On Memorial Day, WD-40 Company presented $100,000 checks to each of the following military charities: Armed Services YMCA, Wounded Warrior Project, and the Veterans Medical Research Foundation.

Although military-themed products (aside from “support the troops” t-shirts, stickers and pins that are widely available) are not as common as they were during WWII, some companies have come out with patriotic advertising.

Goodyear has “support the troops” tires, sold and marketed at NASCAR races:

An Anheuser-Busch commercial shows ordinary Americans stopping their everyday lives to thank the troops. There is no mention of the company until the very end, and nothing at all about beer:

American Airlines has a similar advertisement depicting various Americans being supportive of the troops before and during their flight:

The messages in these recent ads are markedly different than the WWII messages of everyone taking part and working toward victory, reflecting changing relationships between war efforts and the average citizen. No reminder of the war was necessary in the 1940s—war was a part of everyday Americans’ lives. Current ads, like the WD-40 series, often serve less as a call to specific action than as a reminder that the war exists, as a reminder to thank the troops and support service members. It’s a different type of message for a different type of war, one that only involves a small fraction of Americans and is often largely invisible to everyone else.

The World Health Organization (WHO) defines neurological disorders as physical diseases of the nervous system and psychiatric illnesses as disorders that manifest as abnormalities of thought, feeling, or behaviour. In fact, however, there are longstanding unresolved debates on the exact relationship between neurology and psychiatry, including whether there can be any clear division between the two fields.

Related to this, Brandy B. sent us a figure from the blog Neuroskeptic graphing the proportion of journal articles on various disorders included in The American Journal of Psychiatry versus the journal Neurology over the past 20 years. The image is interesting from a sociological standpoint in that, as Brandy writes, “it says far more about the sociology of these fields than about which disorders can be considered neurological or psychiatric.”

While debates regarding the neurological roots of psychiatric illnesses such as depression and schizophrenia are far from settled, the graph shows that the two disciplines have maintained varying levels of intellectual authority over different disorders. Some fall clearly into one domain or the other, while others are covered in both. Depression, for example, receives more attention than mania in Neurology, despite the fact that mania often occurs alongside depression as a symptom of bipolar disorder.

The information in this graph serves as a reminder that what gets published in academic journals, and the topics over which disciplines exercise authority, are the results of social processes. Disciplines are artificial categories of knowledge, solidified through the creation of institutional structures like university departments, degree programs, and academic journals. Psychiatry, for example, didn’t emerge as a discipline until the 19th century; this emergence was rooted in a social context in Western Europe where rising numbers of people were being institutionalized and attitudes regarding the treatment of mental illness were changing. By claiming membership in disciplines based on common academic backgrounds, research methodologies, and topics of study, scholars contribute to the reproduction of these disciplinary boundaries.

The peer-review process is one facet of this social reproduction of disciplinary boundaries that is particularly relevant to the image above. Research and papers that are submitted, accepted, and funded must appeal to reviewers and conform to the criteria set out by the journal or discipline within which researchers wish to publish. In the case of neurology and psychiatry, it appears based on this graph that the peer-review process may uphold disciplinary boundaries, as reviewers for each discipline’s journal appear to favour articles on certain disorders.

The divisions between neurology and psychiatry suggested in the image above stir up lots of interesting questions not only about what we consider to be “neurological” or “psychiatric”, but more generally about the social production of knowledge.

——————————

Hayley Price has a background in sociology, international development studies, and education. She recently completed her Master’s degree in Sociology and Equity Studies in Education at the University of Toronto.

If you would like to write a post for Sociological Images, please see our Guidelines for Guest Bloggers.

A longer version is cross-posted at Montclair SocioBlog.

Long before the Freakonomics guys hit the best seller list by casting their economic net in sociological waters, there was Gary Becker.  If you want to explain why people (some people) commit crimes or get married and have babies, Becker argued, just assume that people are economically rational.  Follow the money and look at the bottom line.  You don’t need concepts like culture or socialization, which in any case are vague and hard to measure.*

Becker wrote no best-sellers, but he did win a Nobel.  His acceptance speech: “The Economic Way of Looking at Behavior.”

In a Wall Street Journal op-ed Friday about the recession, Becker started off Labor Day weekend weighing in on unemployment and the stalled recovery.  His explanation: in a word, uncertainty.

These laws [financial regulation, consumer protection] and the continuing calls for additional regulations and taxes have broadened the uncertainty about the economic environment facing businesses and consumers. This uncertainty decreased the incentives to invest in long-lived producer and consumer goods. Particularly discouraged was the creation of small businesses, which are a major source of new hires.

There’s something curious about this.  Becker pushes uncertainty to the front of the line-up and says not a word about the usual economic suspects – sales, costs, customers, demand.  It’s all about the psychology of those in small business, their perceptions and feelings of uncertainty.  Not only are these vague and hard to measure, but as far as I know, we do not have any real data about them.  Becker provides no references.  The closest thing I could find was a small business survey from last year, and it showed that people in small business were far more worried about too little demand than about too much regulation.

Compared with Regulation, twice as many cited Sales as the number one problem.  (My posts on uncertainty from earlier this summer are here and here.)

In addition, the sectors of the economy that should be most uncertain about regulation – finance, mining and fuel extraction, and medical care – are those where unemployment is lowest.

Moreover, as David Weidner writes in the Wall Street Journal, taxes, interest rates, and regulation are at all-time lows.

[The uncertainty-about-taxes-and-regulation argument] would make more sense if, say, taxes were already high and might be going higher or regulatory burdens were heavy and might be getting heavier. But when taxes are at a 60-year low and the regulations are pretty much the same as they were in the 1990s boom, the argument makes no sense at all (Mark Thoma quoting an e-mail from Gary Burtless).

If it’s really uncertainty caused by these things that causes a reluctance to hire, the time to invest and hire should be now.

—————————

* This is an oversimplified version, but it will do for present purposes.

While America has taken great steps in recent decades toward gender equality, this progress seems lacking in politics. No elected legislative body in the U.S. has ever come close to being half female—the proportion we would expect if it were truly representative of the populace. R.W. Connell argues patriarchy is replicated and reinforced partially through our individual gender practices that cumulatively make social institutions operate.  In daily life, all men and women are socially pressured to embody the gender traits prescribed for their sex.

Kathleen Hall Jamieson argues that the ways we judge others’ masculine and feminine selves create a double bind for women in leadership, a problem that is especially salient in politics, where winning is contingent upon candidates being both personally liked and thought of as competent leaders.  Men have no problem being respected both personally and as leaders because acting strong, confident, and in-charge is expected of both males and authority figures.  However, when women present themselves as leaders by acting dominant, they are likely to be judged as overly harsh, or even “bitchy.”  Yet when women act feminine, they are often judged as unfit for authority because they lack leadership qualities.  In electoral politics, it is very difficult for women to walk the tightrope between being a competent leader and also connecting with voters personally.

We can see the double-bind at work in Saturday Night Live’s now famous, or infamous, parodies of Hillary Clinton and Sarah Palin during the 2008 Presidential campaigns.  Tina Fey’s Emmy-winning depiction of Sarah Palin exaggerates femininity, often portraying the former Alaska governor as if she is competing in a beauty pageant.  Amy Poehler’s masculine portrayal of Hillary Clinton as overly-aggressive, combative, and filled with anger exemplifies the other side of the double bind dilemma.  One skit bringing these characters together to speak out against sexism in the campaign is especially revealing:

Fey presents Palin as accommodating, saying “I was so excited when I was told Senator Clinton and I would be addressing you tonight,” to which Poehler-as-Clinton uncooperatively says, “I was told I would be addressing you alone.” Similarly, a capitulating Palin says “Hillary and I don’t agree on everything,” to which Clinton counters “we don’t agree on anything.” Later in the skit, Poehler-as-Clinton takes firm policy stances while Fey-as-Palin gives ‘pageant’ answers. After Clinton speaks out against the Bush Doctrine, Fey-as-Palin claims “I don’t know what that is.” Clinton says “I believe diplomacy should be the cornerstone of any foreign policy;” Palin responds “and I can see Russia from my house.” In the SNL skit, Palin tells political pundits to quit using words “that diminish us like pretty, attractive, beautiful …” while Clinton interrupts, “harpy, shrew, boner-shrinker.” Throughout the skit Clinton becomes increasingly agitated and then rips apart the podium in anger. Poehler’s masculine portrayal becomes literal when she says “I invite the media to grow a pair, and if you can’t, I will lend you mine.”

The overly-effeminate portrayal of Palin reflects one side of the double-bind, where many people judge feminine women as lacking the appropriate characteristics for leadership. On the other side of the double-bind, the unfeminine portrayal of Clinton illustrates how women who act powerful and confident are subject to character attacks. However, because leadership qualities are expected of men, male politicians are not subject to this critique when they act like leaders. For example, Poehler-as-Clinton describes her “road to the White House” with “I scratched, and I clawed” — words with negative connotations that would never be used to describe competitive men with ambition.

While political comedy depicting our leaders as inept has been a mainstay of our electoral process since our country’s founding, we should be cognizant that parodies of female politicians often draw upon very real aspects of gender that make it difficult for women to achieve positions of leadership.

———-

Jason Eastman is an Assistant Professor of Sociology at Coastal Carolina University who researches how culture and identity influence social inequalities.

If you would like to write a post for Sociological Images, please see our Guidelines for Guest Bloggers.

Presidential hopeful and U.S. Congressman Ron Paul (R-TX) made the news over the weekend arguing, among other things, that the Federal Emergency Management Agency (FEMA) is unnecessary or, even worse, creates a kind of moral hazard in populations who come to depend on Federal relief efforts. In remarks reported Friday, Rep. Paul said that Hurricane Irene should be handled “like 1900,” the year that a large storm killed approximately 8,000 individuals in Galveston and a few thousand more onshore, when it struck the low-lying island and nearby small communities on the Texas coast.

It is certainly true that the Federal response to the destruction of Galveston was relatively minor. Systematic Federal management and provision of aid to individuals in disaster crystallized in response to the Mississippi River’s catastrophic flooding in 1927.  In 1900, it was limited for the most part to President McKinley sending surplus Army tents to house the newly homeless residents of Galveston, and loaning some ships to transport relief goods.

The nation as a whole, on the other hand, quickly mobilized relief donation efforts through newspapers, state and city governments, and the dense network of fraternal organizations that characterized American civil society in 1900. The nation’s response was along the lines of the civic and political institutions of the time, with all that entailed.

[Credit: Rosenberg Library’s Galveston and Texas History Center archives]

So, for instance, some of the citizens of Galveston who survived the storm were given liquor for their nerves and pressed into service at gunpoint by local authorities to clear dead and putrefying bodies from the wreckage; some were later partially compensated for their time with a small sum of money. Property owners, however, were exempted from mandatory clearing of debris and corpses.

Voluntary associations – often segregated by gender, race, ethnicity, and class – took care of their own members as best they could, but the broader distribution of relief supplies arriving from other areas was handled by committees of Galveston’s social and economic elites, based on their knowledge of their city’s political districts. Distribution efforts throughout the Texas coast were controversial enough that hearings were held by the Texas State Senate to investigate reports of improper relief distribution, some of which were borne out by testimony but none of which were pursued.  Survivors’ letters suggest that in some cases the nicer relief goods went to the wealthier victims’ districts, when they weren’t re-routed by less wealthy and somewhat disgruntled Galvestonians tasked with actually lugging the supplies around the city.  And Galveston’s African-American community was wholly shut out of the rebuilding process and denied a seat on the Central Relief Committee, despite efforts to secure a place in helping shape the collective destiny of the city. This is hardly surprising: poorer Americans tend to suffer disproportionately in most disasters, and are often left out of planning and rebuilding efforts.

There is much to be said for the response of Galveston’s Central Relief Committee. Under their leadership the city built the seawall that helps protect the city to this day and they initiated a series of successful municipal reforms that became widespread during the Progressive era. But we should not let unexamined nostalgia blind us to the realities of the situation in Galveston in the months after the 1900 storm.

Nor should we forget that the techniques that might have been more or less appropriate in 1900 were attuned to a society that has since changed quite a bit. It would be hard to imagine contemporary Americans pressed into service to clear bodies, barring a truly exceptional event. And despite its shortcomings, American culture is on the whole more egalitarian today than it was in 1900.

But the dense network of associations through which much assistance flowed to the city simply does not exist in the contemporary U.S. for a variety of reasons, none of which are reducible to the growth of the Federal government.  Instead, Americans support each other in crises by way of donations to highly professionalized and technically adept disaster relief organizations like the Red Cross, and by maintaining government organizations charged with preparing for the worst disasters and catastrophes with their tax dollars.

This makes sense in part because contemporary cities and the economic arrangements which undergird them are much more complex beasts than they were in 1900. The following chart shows property damage and deaths caused by major disasters over the 20th century:

[Source: The Federal Response to Hurricane Katrina: Lessons Learned, p. 6.]

The overall trend is toward less lethal but much costlier disasters, which in turn cause significant disruptions to the ordinary functioning of local businesses and municipal governments that depend on tax revenues from those businesses. This necessitates more Federal involvement, as cities and state governments struggle to get their own houses in order, and to pay for the resources and technical know-how needed to rebuild infrastructure, modern dwellings, and businesses. As Lawrence Powell, a historian at Tulane University in New Orleans, asked of the influx of well-meaning volunteers in response to Katrina, “Can the methods of a nineteenth-century barn raising drag a twenty-first-century disaster area from the mud and the muck?”

The 20th century history of Federal disaster policy can be described as a cycle of expansion and contraction. Increasingly complex disasters draw forth ad hoc solutions, which are then formalized and later institutionalized until they grow unwieldy and are periodically consolidated in efforts to provide more efficient, systematic, and effective services that are less prone to fraud or waste.

Small and big business, social movement organizations, academics, professionals, voluntary associations and NGOs have all helped shape the trajectory of that cycle, as when civil rights organizations successfully lobbied Congress and the Red Cross after Hurricane Camille in 1969 to provide a baseline of minimum assistance to hurricane victims, rather than the older policy that granted aid on the basis of pre-disaster assets (and which thus tended to favor wealthier victims on the basis that they had lost more than had the poor).

In recent decades, this has tended toward deregulation of coastal development in deference to free-market ideals and a Congressional movement in the mid-1990s that sought to pay for disaster relief by, in large part, cutting social service programs that serve the poor. (See Ted Steinberg’s Acts of God for one good historical and political economic critique of U.S. disaster policy.)

How Federal disaster mitigation efforts can be more efficient, just, or effective is certainly a worthy conversation to hold. How best to arrange – and pay for – social relationships around economic, ecological, and technological risk is also an excellent topic for deliberation and debate. But to seriously argue that we should strive to make our disaster response regime more like that enjoyed by Americans in the first half of the twentieth century is, for lack of a better word, silly.

(For that matter, it’s hard to understand what Rep. Paul means by his call for more control by the States; the decision to request the involvement of the Federal government and FEMA already rests with the State governors, as per the Stafford Act.)

Former generations of Americans saw a patchwork of state government solutions as inadequate to managing modern disasters, particularly those that overwhelm municipal or State governments. They built Civil Defense agencies, the Office of Emergency Preparedness, and later FEMA in an effort to combine accountability and economies of scale and expertise, and to ensure that in times of disaster Americans could count on their Federal government to marshal tools and talent when local and State governments are overwhelmed and help is requested.

And as my own research shows, the efforts of these state organizations have long been understood by victims and outside observers alike as expressing and relying on bonds of fellow citizenship and civil solidarity. That in recent decades this legacy has been tarnished with cronyism and mismanagement from above says more about those political actors and the institutions of American electoral politics than it does about the inherent worth of Federal disaster management organizations.

——————————

Brady Potts is a lecturer in the Department of Sociology at the University of Southern California. His current research focuses on the history of public discourse and narratives around risk and hurricane disasters, and the role of civic culture in American disaster response.

If you would like to write a post for Sociological Images, please see our Guidelines for Guest Bloggers.


Duff sent in a video showing candidates from the 2011 Miss USA contest answering the question, “Should evolution be taught in schools?” Their answers are a great example of the normalization of the idea that evolution is “one side” of a story, with religion being the other side, and that we should just choose between these two stories based on what we’re most comfortable with personally:

There’s a striking discourse here of allowing children (or, by extension, their parents) to “choose” whether to learn about evolution based on whether it’s a perspective they like, in a way we don’t apply to other scientific theories. I suspect if you allowed students to choose, they might, just perhaps, decide that calculus, grammatical rules, and the laws of physics aren’t things they happen to feel like learning, a fact that most curriculum review committees see as rather irrelevant.

This discourse of choice works, in part, because of the word “theory.” In popular usage, “theory” is often used as though it’s interchangeable with “idea” or “opinion” or “random thought I just made up in my head right now.” Of course, scientists use the word in a very different way, and the scientific process is to test theories and find evidence for or against them. But the conflation of “theory” in the scientific sense with “opinion” in the public-usage sense facilitates the discourse of choice.

I suspect that some watching the video will see this as little more than an example of air-headed, dumb women not understanding science. But it’s important to remember that these women are carefully prepped for this competition; they have been through years of lower-level beauty pageant competitions and, to get to the Miss USA contest, they’ve clearly learned the rules of the beauty pageant circuit. They may or may not personally completely agree with what they’re saying; the point is to provide an answer that they believe is most likely to appeal to a group of judges who are looking for a candidate who will be palatable to a broad audience and unlikely to stir controversy. Whatever their personal opinions might be, the women are providing an answer based on a perception of what the most acceptable response is — and the discourse of choice is sufficiently normalized to be a viable, and perhaps the only viable, option they can give and hope to win.

And, if you’re interested, here’s a parody video asking if math should be taught in schools:

Gwen Sharp is an associate professor of sociology at Nevada State College. You can follow her on Twitter at @gwensharpnv.