When disasters strike—an airline crash, a tornado or flood with huge casualties, a mass shooting—official investigations and reports follow as predictably as shock, mourning, and efforts to recover. A staple of crisis management and emergency response is the post-response report, often known as an “after action report” or “lessons learned” document. But how often do those reports or the processes that produce them lead to real learning and generate meaningful findings that can help people prevent or cope with future disasters? Not very often, the best evidence suggests, because reports are often generated in quick response to the urge to “do something” and consist largely of political rhetoric. A close look at post-disaster reports reveals why most embody very little true learning—and suggests what it would take for officials and citizens to conduct more effective investigations to inform useful policy changes.

Typical Responses to Accidents and Disasters

Official responses to disasters tend to fall into one of five broad patterns:

  • A disastrous event happens, and then policies are abruptly changed with little or no effort devoted to learning from the event. A major example is the hastily enacted USA Patriot Act, passed just weeks after September 11, 2001, without any real analysis.
  • After a disaster, an investigation is undertaken that is plainly incomplete, and the resulting report states the obvious or serves to cover official tracks, with no evidence of a serious attempt to learn about deficiencies. An example is the Lessons Learned from Katrina report issued by the Executive Office of the President under George W. Bush.
  • An investigation carried through after a disaster is accompanied or followed by policy changes, but the changes bear little or no relationship to the recommendations of the investigators. For example, the Department of Homeland Security was launched two days before the September 11 commission was even established.
  • A disastrous event happens, and a thorough and careful investigation is initiated, but policy change does not result. Failure to act may be due to bureaucratic delays, worries about costs, political opposition, or some other typical reason for policy stasis. Some aviation safety investigations fall into this category.
  • After a disaster occurs, a thorough investigation is initiated, which leads to policy change as a result of careful review and assessment of relevant facts and wisely designed responses to actionable deficiencies. For example, the Columbia Accident Investigation Board that probed the 2003 space shuttle accident issued a report that led the National Aeronautics and Space Administration to require much closer inspections of heat shields and to take steps to prevent the kind of wing damage caused by foam debris shed from the external fuel tank.

Only the last of these five routes embodies the ideal of policy learning following a thorough investigation of a catastrophe and the underlying conditions that allowed routine events to take a disastrous turn. The fourth route refers to potentially fruitful post-disaster investigations whose recommendations fall prey to garden-variety bureaucratic delay. But the first three patterns—unfortunately quite common—do not really include considered action taken after informative investigations. Official reports issued in these scenarios are what I call “fantasy learning documents,” for the same reason that sociologist Lee Clarke labels many pre-disaster plans “fantasy documents”—because such documents are created and disseminated for largely rhetorical purposes.

The Pitfall of Fantasy Learning

Disasters focus public attention and open windows for policy changes to be pushed through. Because humans want to understand and learn from damaging events, investigations can, in principle, lead to changes based on improved understandings of the social or natural forces that contributed to disaster. But, unfortunately, post-disaster scenarios also allow interested actors to argue and mobilize on behalf of preexisting interests and goals, especially if investigations and responses happen so fast that they cannot truly be based on new understandings or facts.

What is presented as fruitful policy learning may not be learning at all. The “lessons” promulgated may not be related to the disastrous event, but may express longstanding superstitions or pre-existing goals. Influential actors may take the occasion of a new disaster to push their preferred policies as a way to “do something”—as when Vice President Dick Cheney and his allies used September 11, 2001, as an occasion to push for a U.S. invasion of Iraq, even though that country had nothing to do with the terrorist attacks that day. Similarly, after incidents of mass gun violence, all kinds of longstanding causes are championed—ranging from mental health spending to critiques of popular culture.

Things can easily go awry because mixed motives drive real or fantasy learning in the policy process. People may truly want to prevent disasters from recurring, but individual and group self-interests are also well-known motivators. Consequently, “lessons learned” often point to solutions that serve the interests of those who conducted the investigation—as when engineers call for infrastructure spending. Other sources of distortion flow from human tendencies to favor simple explanations or deflect blame from the powerful—for example, by stressing that the Katrina disaster was due to “poor decisions” by impoverished people to live in parts of New Orleans below sea level.

Photo by USDAgov via Flickr CC

Doing Better

After disasters—and indeed all of the time—the challenge for democracies is to translate public pressure for realistic remedies into effective systems for learning and improvement. Knowing how easily official investigations can go awry and reports devolve into mere fantasy rhetoric can help us do better. Real learning can happen even in the rush of events after a disaster, if we structure investigations and policy deliberations to weed out predictable pitfalls and diversions.

Read more in Thomas A. Birkland, “Disasters, Lessons Learned, and Fantasy Documents.” Journal of Contingencies and Crisis Management 17, no. 3 (September 2009): 146-156.

Thomas A. Birkland is in the School of Public and International Affairs at North Carolina State University. He studies focusing events in the public policy process.