Photo via Marina Noordegraaf CC.

Social media feeds are like carnival money booths: we snatch away greedily as the links swirl past, but we’re rarely enriched by the experience. In the rush to process so much so quickly, we’ve become lousy filters for one another – recommending “great articles” that ain’t so great by social science standards.

Many rapidly circulating stories offer grand assertions but paltry evidence about the social world. It seems silly to direct much intellectual horsepower at every li’l item whooshing past (why, that Upworthy post needs an interrupted time-series design!). So people just hit the “thumbs up” button if they like the sentiment and send it down the line. Passing along such blurbs can seem like a modern equivalent of the kindly/nosy relative who sent us Dear Abby clippings in the newsprint era.

Yet indiscriminate recommendations carry a danger: they can subvert our authority as experts. In my case, I’ve developed a set of policy preferences on crime and economic issues, which I adjust in response to new evidence. If I started endorsing weak studies just because they affirmed my preferences or prejudices, I’d rightly be considered a hack.

As conservatives like to remind progressives — from the comfort of their thin-paned glass houses — there’s a big honking gap between the truth about the world and the truth we’d like to believe about the world. Accordingly, there’s a big honking gap between a “great study” and a “great sentiment” that neatly aligns with our views. And, unlike your kindly/nosy relative, we social scientists have a real responsibility to evaluate the quality of the evidence we cite – especially when we claim expertise on a matter.

Sometimes we forget that social science provides mighty tools and deep training in evaluating evidence. For example, any good sociologist should have a pretty good sense of whether a given sample is likely to be representative; whether a design is best suited for making causal, descriptive, or interpretive claims; whether to gather data from individuals, groups, or nations in making such claims; and how to make sense of complex processes that unfold dynamically across all these levels. But while we scrutinize research methods closely and carefully in our professional work, we seem to get beer goggles whenever a sexy story flits past on Facebook.

When I suspect I might be playing too fast and loose with such stories, I use a three-step approach to consider the evidence:

  1. Restate the central empirical claim (e.g., raising the minimum wage reduces crime).
  2. Identify the theory and evidence cited to support that claim (e.g., a simple plot showing lower crime rates in states with higher minimum wage levels).
  3. Evaluate the design rather than the finding. Is the design so elegant and convincing that I would have believed the results had they gone the other way? Or would I have simply dismissed it as shoddy work? (e.g., a simple plot showing higher crime rates in states with higher minimum wage levels).

Depending on the direction of the wage-crime relationship, my reaction would shift from “See! This shows I was right all along” to “Bah! These fools didn’t even control for income and poverty rates!” Of course, few of the stories flitting past can withstand the strict scrutiny of a top peer-reviewed journal. But while I might still circulate them for descriptive or entertainment value, I’m now making fewer unqualified personal recommendations. I’d rather reserve the term “great study” for designs so spine-crushingly beautiful that they might actually change my mind on an issue. Researchers know that winning over skeptics is way more fun — and way more important — than preaching to the converted. At TheSocietyPages, this process animates our board meetings, sparking lively debates about which research evidence merits highlighting in our podcasts, citings, TROTS, reading list, and feature sections.

As Clay Shirky says, “It’s not information overload. It’s filter failure.” At TSP, we’ll do our best to screen for solid evidence and big ideas about the social world, in hopes that we can all grab something worthwhile from the information swirl.