
In this three-minute clip, sociologist Shelley Correll discusses her research on the “motherhood penalty.” The phrase refers to the finding that being a mom specifically, not just being female or being a parent, leads to lower income. Scholars have begun to realize just how significant this penalty is. As Correll explains, the pay gap between women with and without children is larger than the gap between women and men:

For more, see the full text of Correll’s paper, “Getting a Job: Is There a Motherhood Penalty?”

Lisa Wade, PhD is an Associate Professor at Tulane University. She is the author of American Hookup, a book about college sexual culture; a textbook about gender; and a forthcoming introductory text: Terrible Magnificent Sociology. You can follow her on Twitter and Instagram.

Cross-posted at Reports from the Economic Front.

“Too big to fail” — that was the common explanation voiced at the start of the Great Recession for why the Federal Reserve had no choice but to channel trillions of dollars into the coffers of our leading banks. But the government also pledged that once the crisis was over it would take steps to make sure we would never face such a situation again.

The chart below shows the growing concentration of bank assets in the hands of the top three U.S. banks. The process really took off in the late 1990s and continued right up to the crisis. It was the reality of the top three banks controlling over 40 percent of total bank assets that gave meaning to the “too big to fail” fears.

[Chart: share of total U.S. bank assets held by the three largest banks]

But what has happened since the crisis?  According to Bloomberg Businessweek, the largest banks have only gotten bigger:

Five banks — JPMorgan Chase, Bank of America, Citigroup, Wells Fargo, and Goldman Sachs — held more than $8.5 trillion in assets at the end of 2011, equal to 56 percent of the U.S. economy, according to the Federal Reserve. That’s up from 43 percent five years earlier.

The Big Five today are about twice as large as they were a decade ago relative to the economy, meaning trouble at a major bank would leave the government with the same Hobson’s choice it faced in 2008: let a big bank collapse and perhaps wreck the entire economy or inflame public ire with a costly bailout. “Market participants believe that nothing has changed, that too-big-to-fail is fully intact,” says Gary Stern, former president of the Federal Reserve Bank of Minneapolis.


Not surprisingly, this kind of economic dominance translates into political power. For example, the U.S. financial sector is leading the charge for new free trade agreements that promote the deregulation and liberalization of financial sectors throughout the world. Such agreements will increase its profits but at the cost of economic stability, a trade-off that the industry apparently finds acceptable.

The recently concluded U.S.-Korea Free Trade Agreement is a case in point.  Leading financial firms helped shape the negotiating process.  As a consequence, Citigroup’s Laura Lane, corporate co-chair of the U.S.-Korea FTA Business Coalition, was able to declare that the agreement had “the best financial services chapter negotiated in a free trade agreement to date.”  Among other things, the chapter restricts the ability of governments to limit the size of foreign financial service firms or covered financial activities.  This means that governments would be unable to ensure that financial institutions do not grow “too big to fail” or place limits on speculative activities such as derivative trading.  The chapter also outlaws the use of capital controls.

These same firms are now hard at work shaping the Trans-Pacific Partnership FTA, a new agreement with eight other countries that contains a similar financial services chapter. Significantly, although the U.S. Trade Representative has refused to share details of the chapters being negotiated with either the public or members of Congress, over 600 representatives of U.S. multinational corporations do have access to the texts, allowing them to steer the negotiations in their favor.

The economy may be failing to create jobs, but leading financial firms certainly don’t seem to have any reason to complain.

Mother Jones magazine offers some comparisons that convey the sheer scale of Walmart. Highlights:

  • Its net sales are greater than the GDP of Norway.
  • Its entertainment sales are triple those of Hollywood.
  • It emits more CO2 than the 50 lowest-emitting countries together.
  • It employs a workforce the size of the population of the 50 smallest countries in the world.
  • Its square footage exceeds that of the island of Manhattan.

The data:


Via SocProf.


Cross-posted at Sociology in Focus.

Steve Jobs, co-founder of Apple, died this week. I didn’t know him, and yet his death moved me deeply. It shook me awake. When I woke up sad the next morning, I did the only thing I know how to do: I thought about Steve Jobs and his passing sociologically.

Of all the things you could say about Steve Jobs, one is beyond doubt: he was a great charismatic leader. During his presentations, his words, energy, and style could create a “reality distortion field” that made mundane aspects of his products sound revolutionary. His presence worked almost like a Jedi mind trick, telling reporters what to write in their reviews. His charisma seemed superhuman.


Charismatic authority is one of three styles of authority that Max Weber described. Authority can be thought of as the use of power that is perceived as legitimate. Some statuses have power simply because of tradition (e.g., parents have power over children); Weber called this traditional authority. Other statuses have power because it has been “routinized,” or built into the structure of social institutions. Weber calls this rational-legal authority, and the president of the United States is a good example of this type.

Charismatic authority can be thought of as the use of power that is legitimized by the exemplary characteristics of a person, or by accomplishments that inspire others to follow or be loyal to them. Steve Jobs’s accomplishments earned him a rabid fan base, to the point that Apple fans are often referred to as members of the “Cult of Mac.” It was because of who Jobs was (or at least how he was perceived) that many people admired, respected, and followed his work.

The problem for Apple is that any organization that derives its authority from a charismatic leader must eventually deal with the loss of that leader. How can you hold on to your authority and legitimacy once the charismatic figure is gone? You have to build the revolutionary ideas and practices of that figure into the bureaucracy, or formal structure, of the organization. Weber called this process of transferring authority from a charismatic person to a bureaucratic organization the “routinization of charisma.” In the corporate world, this process is called a “succession plan.”

When Jobs resigned on August 24th, Tim Cook succeeded him as CEO. A few weeks later, on October 4, Cook took the stage for the first time to lead Apple’s announcement of the iPhone 4S. The event was nearly identical in form to the announcements Jobs had led. Cook said multiple times, “There is a lot of momentum here at Apple,” which could be interpreted sociologically as, “nothing has changed; we still deserve the authority our previous leader gained through his charisma.” Yet many viewers noted that Cook did not have the charisma of Steve Jobs.

Jobs was a master at getting the media to write the headlines he wanted, but after this week’s talk ABC’s headline read “Apple Unveils Anti-Climatic iPhone 4S.” Anti-climatic!?! Comedians ripped Cook for his poor stage presence in a video. Traders showed their disapproval as Apple’s stock price dropped half a percent after the announcement. I’m not trying to pile on here; I’m just pointing out that the transition from a charismatic authority figure to a less charismatic one is hard, or, as Weber would put it, the “routinization of charisma” is difficult if not impossible.

Now that he is gone, I’m sad because I so enjoyed listening to him speak about his work. He was an artist in so many ways, and I’m sad I won’t get to see any more of his work. Rest in peace, Mr. Jobs.

——————————

Nathan Palmer is a visiting lecturer at Georgia Southern University. He is a passionate educator, the founder of Sociology Source, and the editor of Sociology in Focus.

If you would like to write a post for Sociological Images, please see our Guidelines for Guest Bloggers.

The following chart, featured in The Economist, shows that women in Europe expect to earn significantly less than men after graduating from university. (Of course, women’s expectations are represented in pink, and men’s in blue.) According to the study, European women attending the most prestigious universities expect to earn an average of 21 per cent less than their male counterparts.

Given that women actually do earn an average of 17.5 per cent less than men in the European Union, this difference in salary expectations might not seem shocking. What’s interesting, though, is the accompanying text that attempts to explain these disparities:

Women and men seem to differ in workplace and career aspirations, which may explain why salary expectations differ. Men generally placed more importance on being a leader or manager than women (34% of men versus 22% of women), and want jobs with high levels of responsibility (25% v 17%). Women, however, want to work for a company with high corporate social responsibility and ethical standards; men are more interested in prestige (31% v 24%).

By neglecting to address how our social environment can contribute to reported differences in career aspirations, statements like these risk reinforcing gender stereotypes and naturalizing salary inequalities. Can we really assume that gendered salary disparities are due to women’s innately lower inclination to pursue high-paying career paths?

Research says: no, we can’t. As Cordelia Fine writes in her book Delusions of Gender, countless studies have demonstrated that social factors such as prevalent beliefs about gender differences and male-dominated work environments influence women’s responses to questions about their abilities and aspirations. For example, women exposed to media articles claiming that successful careers in entrepreneurship require typically “masculine” qualities were less likely to report an interest in becoming entrepreneurs. Women who knew that the test they were taking was measuring gender differences were more likely to report being highly empathic. Women were less interested in attending an engineers’ conference when it was advertised as male-dominated rather than gender-balanced.

Our perceptions of our abilities, identities, and sense of belonging are influenced by our social environment. If, as this graph shows, women attending the most prestigious universities in Europe aspire to different career paths than men, this fact can’t be taken for granted; addressing this inequality requires an analysis of its own.

Thanks to Dmitriy T.M. for sending in this graph!

Reference: Fine, C. (2010). Delusions of Gender: How Our Minds, Society, and Neurosexism Create Difference. New York: W.W. Norton & Company, Inc.

——————————

Hayley Price has a background in sociology, international development studies, and education. She recently completed her Masters degree in Sociology and Equity Studies in Education at the University of Toronto.

If you would like to write a post for Sociological Images, please see our Guidelines for Guest Bloggers.

Presidential hopeful and U.S. Congressman Ron Paul (R-TX) made the news over the weekend arguing, among other things, that the Federal Emergency Management Agency (FEMA) is unnecessary or, even worse, creates a kind of moral hazard in populations who come to depend on Federal relief efforts. In remarks reported Friday, Rep. Paul said that Hurricane Irene should be handled “like 1900,” the year that a large storm killed approximately 8,000 individuals in Galveston and a few thousand more onshore, when it struck the low-lying island and nearby small communities on the Texas coast.

It is certainly true that the Federal response to the destruction of Galveston was relatively minor. Systematic Federal management and provision of aid to individuals in disaster crystallized in response to the Mississippi River’s catastrophic flooding in 1927.  In 1900, it was limited for the most part to President McKinley sending surplus Army tents to house the newly homeless residents of Galveston, and loaning some ships to transport relief goods.

The nation as a whole, on the other hand, quickly mobilized relief donation efforts through newspapers, state and city governments, and the dense network of fraternal organizations that characterized American civil society in 1900. The nation’s response was organized along the lines of the civic and political institutions of the time, with all that entailed.

[Credit: Rosenberg Library’s Galveston and Texas History Center archives]

So, for instance, some of the citizens of Galveston who survived the storm were given liquor for their nerves and pressed into service at gunpoint by local authorities to clear dead and putrefying bodies from the wreckage; some were later partially compensated for their time with a small sum of money. Property owners, however, were exempted from mandatory clearing of debris and corpses.

Voluntary associations – often segregated by gender, race, ethnicity, and class – took care of their own members as best they could, but the broader distribution of relief supplies arriving from other areas was handled by committees of Galveston’s social and economic elites, based on their knowledge of their city’s political districts. Distribution efforts throughout the Texas coast were controversial enough that the Texas State Senate held hearings to investigate reports of improper relief distribution, some of which were borne out by testimony but none of which were pursued. Survivors’ letters suggest that in some cases the nicer relief goods went to the wealthier victims’ districts, when they weren’t re-routed by less wealthy and somewhat disgruntled Galvestonians tasked with actually lugging the supplies around the city. And Galveston’s African-American community was wholly shut out of the rebuilding process and denied a seat on the Central Relief Committee, despite its efforts to secure a place in helping shape the collective destiny of the city. This is hardly surprising: poorer Americans tend to suffer disproportionately in most disasters, and are often left out of planning and rebuilding efforts.

There is much to be said for the response of Galveston’s Central Relief Committee. Under its leadership the city built the seawall that helps protect it to this day and initiated a series of successful municipal reforms that became widespread during the Progressive era. But we should not let unexamined nostalgia blind us to the realities of the situation in Galveston in the months after the 1900 storm.

Nor should we forget that the techniques that might have been more or less appropriate in 1900 were attuned to a society that has since changed quite a bit. It would be hard to imagine contemporary Americans pressed into service to clear bodies, barring a truly exceptional event. And despite its shortcomings, American culture is on the whole more egalitarian today than it was in 1900.

But the dense network of associations through which much assistance flowed to the city simply does not exist in the contemporary U.S., for a variety of reasons, none of which are reducible to the growth of the Federal government. Instead, Americans support each other in crises by donating to highly professionalized and technically adept disaster relief organizations like the Red Cross, and by using their tax dollars to maintain government organizations charged with preparing for the worst disasters and catastrophes.

This makes sense in part because contemporary cities, and the economic arrangements that undergird them, are much more complex beasts than they were in 1900. The following chart shows property damage and deaths caused by major disasters over the 20th century:

[Source: The Federal Response to Hurricane Katrina: Lessons Learned, p. 6.]

The overall trend is toward less lethal but much costlier disasters, which in turn cause significant disruptions to the ordinary functioning of local businesses and the municipal governments that depend on tax revenues from those businesses. This necessitates more Federal involvement, as cities and state governments struggle to get their own houses in order and to pay for the resources and technical know-how needed to rebuild infrastructure, modern dwellings, and businesses. As Lawrence Powell, a historian at Tulane University in New Orleans, asked of the influx of well-meaning volunteers in response to Katrina, “Can the methods of a nineteenth-century barn raising drag a twenty-first-century disaster area from the mud and the muck?”

The 20th century history of Federal disaster policy can be described as a cycle of expansion and contraction. Increasingly complex disasters draw forth ad hoc solutions, which are then formalized and later institutionalized until they grow unwieldy and are periodically consolidated in efforts to provide more efficient, systematic, and effective services that are less prone to fraud or waste.

Small and big business, social movement organizations, academics, professionals, voluntary associations and NGOs have all helped shape the trajectory of that cycle, as when civil rights organizations successfully lobbied Congress and the Red Cross after Hurricane Camille in 1969 to provide a baseline of minimum assistance to hurricane victims, rather than the older policy that granted aid on the basis of pre-disaster assets (and which thus tended to favor wealthier victims on the basis that they had lost more than had the poor).

In recent decades, this has tended toward deregulation of coastal development in deference to free-market ideals, and toward a Congressional movement in the mid-1990s that sought to pay for disaster relief in large part by cutting social service programs that serve the poor. (See Ted Steinberg’s Acts of God for one good historical and political-economic critique of U.S. disaster policy.)

How Federal disaster mitigation efforts can be more efficient, just, or effective is certainly a worthy conversation to hold. How best to arrange – and pay for – social relationships around economic, ecological, and technological risk is also an excellent topic for deliberation and debate. But to seriously argue that we should strive to make our disaster response regime more like that enjoyed by Americans in the early half of the twentieth century is, for lack of a better word, silly.

(For that matter, it’s hard to understand what Rep. Paul means by his call for more control by the States; the decision to request the involvement of the Federal government and FEMA already rests with the State governors, as per the Stafford Act.)

Former generations of Americans saw a patchwork of state government solutions as inadequate to managing modern disasters, particularly those that overwhelm municipal or State governments. They built Civil Defense agencies, the Office of Emergency Preparedness, and later FEMA in an effort to combine accountability with economies of scale and expertise, and to ensure that in times of disaster Americans could count on their Federal government to marshal tools and talent when local and State governments are overwhelmed and help is requested.

And as my own research shows, the efforts of these state organizations have long been understood by victims and outside observers alike as expressing and relying on bonds of fellow citizenship and civil solidarity. That in recent decades this legacy has been tarnished with cronyism and mismanagement from above says more about those political actors and the institutions of American electoral politics than it does about the inherent worth of Federal disaster management organizations.

——————————

Brady Potts is a lecturer in the Department of Sociology at the University of Southern California. His current research focuses on the history of public discourse and narratives around risk and hurricane disasters, and the role of civic culture in American disaster response.

If you would like to write a post for Sociological Images, please see our Guidelines for Guest Bloggers.

Cross-posted at Caroline Heldman’s blog.

News media are comparing Hurricane Irene to Hurricane Katrina in ways that allow us to forget that Hurricane Katrina was a human-made disaster, but in one way these events are similar – prisoner evacuation. New Orleans officials chose not to evacuate 7,000 inmates, some of whom were trapped in flooded cells and later left on a bridge for days without food and water, as detailed in this post. Officials in New York have made the same decision with Hurricane Irene.

Elizabeth Furth, a former student who has participated in rebuilding efforts in New Orleans, sent in this map showing that Rikers Island is not part of the City’s evacuation plan:

Rikers Island is the unzoned white blob in this close-up:

At a recent press conference, Mayor Bloomberg announced that Rikers Island would not be evacuated, despite the fact that the island is surrounded by areas with the second-highest evacuation rating (Zone B). Other New York islands on the map are in Zone A (mandatory evacuation) or Zone B, but Rikers has no evacuation rating, perhaps because the Department of Corrections doesn’t have an evacuation plan. According to the New York Times blog, “no hypothetical evacuation plan for the roughly 12,000 inmates that the facility may house on a given day even exists. Contingencies do exist for smaller-scale relocations from one facility to another.”

Solitary Watch reports that Rikers Island was built on landfill, which is especially vulnerable to disasters. Rikers Island may weather Hurricane Irene without incident, but this disaster has again revealed how prisoners are considered disposable in times of crisis.

 

Cross-posted from Family Inequality.

In the Supreme Court’s decision in the Dukes v. Wal-Mart case, Justice Scalia acknowledged that Wal-Mart’s many local managers had a lot of discretion in their personnel decisions, even though the company had a written policy against gender discrimination (who doesn’t?). But he gave the company credit for that vague policy and let it off the hook for a systematic pattern of disparity between men and women. So, when does a toothless, vague policy with wide discretion lead to a bad outcome, and is failing to prevent that outcome the same as causing it?

A path-breaking sociological analysis of organizational affirmative action outcomes has shown that the companies that successfully diversify their management are most likely to have policies with teeth – where accountability is built into the diversity goal. In light of the Wal-Mart case, this led to a rollicking debate about how to think about “corporate culture” versus policies, and when to blame whom, legally or otherwise – which even divided sociologists.

Smoking in the movies

Here’s an interesting, at least vaguely related case. Positive depictions of smoking in the movies are widely understood to be harmful. Yet smoking is also glamorous, artistic, and popular – representing both anti-adult rebellion and maturity. So, what to do? The Centers for Disease Control, in the always-riveting Morbidity and Mortality Weekly Report, has published a fascinating report on this topic. The report counts the number of tobacco incidents* in top-grossing, youth-rated (G, PG, PG-13) movies, divides them between studios that implemented an anti-tobacco policy and those that didn’t — helpfully cutting the movie industry roughly in half — and provides a simple before-and-after tabulation:

From 2005 to 2010, among the three major motion picture companies (half of the six members of the Motion Picture Association of America [MPAA]) with policies aimed at reducing tobacco use in their movies, the number of tobacco incidents per youth-rated movie decreased 95.8%, from an average of 23.1 incidents per movie to an average of 1.0 incident. For independent companies (which are not MPAA members) and the three MPAA members with no antitobacco policies, tobacco incidents decreased 41.7%, from an average of 17.9 incidents per youth-rated movie in 2005 to 10.4 in 2010, a 10-fold higher rate than the rate for the companies with policies. Among the three companies with antitobacco policies, 88.2% of their top-grossing movies had no tobacco incidents, compared with 57.4% of movies among companies without policies.

The difference is dramatic, as indicated by this image about the images. (Because I turned the columns into cigarettes, this is not just a graph, but an infographic):

 

The policies provide what may be an ideal mix of accountability and responsibility, short of a simplistic ban.

[The policies] provide for review of scripts, story boards, daily footage, rough cuts, and the final edited film by managers in each studio with the authority to implement the policies. However, although the three companies have eliminated depictions of tobacco use almost entirely from their G, PG, and PG-13 movies, as of June 2011 none of the three policies completely banned smoking or other tobacco imagery in the youth-rated films that they produced or distributed.

Maybe this formula is effective because there already has been a strong cultural shift against smoking — as strong, even, as the shift against excluding women from management positions?

Graphic addendum (disturbing image below)

Whether smoking in movies actually encourages young people to take up smoking is, of course, not a settled issue — especially on websites sponsored by tobacco sellers, as seen in this ironic screenshot from Smokers News:

 

One reason to have an explicit policy is that it’s easy to assume viewers will see through the glamour to the negative outcomes. “Surely no one will want to be like that character…” But people – maybe especially young people? – have an amazing capacity to celebrate selectively among the characters they see. I have learned from experience that, in children’s stories, even characters who get their comeuppance in the end still manage to emerge as role models for their bad behavior. So maybe some people want to relive this from Pulp Fiction…

…and aren’t put off by this:

—————————–

* “A new incident occurred each time 1) a tobacco product went off screen and then back on screen, 2) a different actor was shown with a tobacco product, or 3) a scene changed, and the new scene contained the use or implied off-screen use of a tobacco product.”