I spent the last few days at the National Institute of Justice’s annual research and evaluation conference, “Evidence-Based Policies and Practices.” The idea is to connect policymakers and practitioners to a broad class of “researchers” studying crime and justice. Sociologists, even (or especially) public sociologists, tend to be cynical about applied/policy research, but this is one cool conference. A highlight for me was Del Elliott’s plenary address on his “Blueprints” model programs for violence prevention. In some ways, his presentation brought to mind James Coleman’s controversial 1992 ASA presidential address, “The Rational Reconstruction of Society,” or at least offered one example of the fruits of Coleman’s programmatic challenge.
Elliott’s group identifies model programs based on classic social science criteria (e.g., randomized trials, sustained effects, independent replication) and then spreads the seed. He argues passionately against sending kids through programs that are known failures (e.g., Scared Straight, early DARE, most boot camps); he even hinted that class-action suits could be filed against courts that continue to do so, on grounds of negligence if not malice aforethought. Mark Lipsey, the master of meta-analysis, explained how monitoring, training, and quality control (or “fidelity,” as they say in the business) can successfully replicate and sustain successful programs. [In evaluation research, it turns out that consistent implementation is just as important as what is being implemented. Most teachers know this: many teaching philosophies can “work,” but the absence of a philosophy or its inconsistent application usually fails.] He also offered evaluation strategies for when practitioners go beyond the data (adapting a model program to a new target group or to unusual local conditions, for example). Finally, organizations such as the Washington State Institute for Public Policy and individuals such as RAND pioneer Peter Greenwood are conducting increasingly sophisticated cost-benefit analyses to distinguish the best from the lousiest societal investments in public safety.
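[For readers who want the cost-benefit logic made concrete, here is a back-of-the-envelope sketch in Python. This is not WSIPP’s or Greenwood’s actual model, and every number in it is hypothetical; real analyses monetize averted victim, taxpayer, and criminal-justice costs with far more care. The basic move, though, is just this: discount a stream of future benefits to present value and compare it with what the program costs up front.]

```python
# Back-of-the-envelope cost-benefit sketch for a prevention program.
# All figures are hypothetical, for illustration only.

def net_present_value(annual_benefits, cost, discount_rate=0.03):
    """Present value of a yearly benefit stream minus up-front cost."""
    pv = sum(b / (1 + discount_rate) ** t
             for t, b in enumerate(annual_benefits, start=1))
    return pv - cost

# Hypothetical program: $4,000 per participant up front, $1,200 per year
# in averted crime and system costs over ten years.
benefits = [1200] * 10
cost = 4000
npv = net_present_value(benefits, cost)
ratio = (npv + cost) / cost  # benefits returned per dollar invested
print(f"NPV per participant: ${npv:,.0f}; benefit-cost ratio: {ratio:.2f}")
```

The ratio is the headline number in these reports: a program that returns more than a dollar per dollar invested beats doing nothing, and ranking programs by that figure is how the best investments get separated from the lousiest.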
Of course, such social-sciencey attempts to systematize prevention and rehabilitation programs will surely discipline and punish some creative and difficult-to-evaluate efforts. That said, the progress in documenting successful programs over the past decade has been astounding, from the “What Works” report to Congress in the late 1990s to the Campbell Collaboration’s new library of clinical trials. When I received my Ph.D. in 1995, many experts were still arguing that “nothing works” in corrections (and, one might add, “so what if it did”). Today, you’d be laughed out of the room if you made such claims. A real scientific basis for programs such as cognitive behavioral therapy and nurse home visitation is now firmly established. A rational reconstruction of criminal justice, of course, would further require that policymakers attend more consistently to the science. At least we are creating the preconditions for such action: a base of knowledge that simply did not exist in earlier eras.