In solutions journalism, there are four ways a response to a problem can become a good story.
- It's a success. Most solutions journalism stories are chosen because something about the response works. The evidence is very clear.
- It's a cool new idea that looks promising. Even if we can't know whether a new idea will work on a wide scale, there may be smaller indications: experts love the idea, or it works in a lab or in animals (caution here: as we know, curing cancer in mice means we can cure cancer in mice), or it worked in a small pilot study but hasn't been tested more widely. In all these cases, you've got to acknowledge what we know and don't know. And if the track record is weak, indicate why you're reporting on it anyway.
- It's coming here. For example, your city is building an urban bike path to encourage exercise. You'll want to go to another place that's already done this and see what has worked there and what hasn't. That's a story, whether it's a success or not. Better yet, find one that's worked and one that hasn't - and look at what made the difference.
- It's a solution people are talking about, but the discussion could use some rigor. For example: there's lots of hype about health apps - do they really work? Or, since apps differ so much, here's a better framing: "Health apps: when do they work and when do they fail?"
In all these cases, you'll need to rely on evidence to determine what's working. For health reporters, this is familiar territory. We find and evaluate evidence to decide how to report on clinical studies, drug trials, and rating systems for hospitals and doctors, among many other topics.
The process is similar for solutions stories. A program claiming to reduce infant mortality by 38 percent over five years - how do you evaluate that claim?
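One thing to unpack in a claim like that is whether the number is a relative or an absolute reduction - the same change can sound dramatic or modest depending on which is reported. A minimal sketch with made-up rates (the figures below are hypothetical, not from any real program):

```python
# Hypothetical rates, per 1,000 live births, to illustrate the difference
# between a relative and an absolute reduction.
baseline_rate = 8.0  # before the program (hypothetical)
final_rate = 5.0     # five years later (hypothetical)

relative_reduction = (baseline_rate - final_rate) / baseline_rate
absolute_reduction = baseline_rate - final_rate

# The same change reads as "38 percent" or as "3 deaths per 1,000 births."
print(f"Relative reduction: {relative_reduction:.0%}")
print(f"Absolute reduction: {absolute_reduction:.1f} per 1,000 births")
```

The arithmetic is only the start, of course: you'd also want to know whether mortality was falling everywhere over those five years, and what the program is being compared against.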
It is no more difficult to vet a solutions story than a problem-focused story. What's different is the perceived consequence of getting it wrong: get a problem story wrong and that's a journalistic misdemeanor; get a solutions story wrong and that's a felony. The outside world doesn't see it that way, but to journalists, there's nothing worse than looking gullible.
The most important step towards getting it right is to be conservative in your language. Don't over-claim. Just report what the evidence says. Nothing is perfect, and the more we reflect this reality, the more our audience will trust us.
But "what the evidence says" can be a complex question. For example, Leapfrog, Consumer Reports, HealthGrades and U.S. News & World Report - plus CMS Hospital Compare - all publish ratings of hospitals. Not a single hospital scores highly in all the rankings. That's because they all weight and value different things. Leapfrog, for example, measures how well hospitals keep patients from harm, while HealthGrades looks at death or complications outcomes from 27 different clinical procedures. We all know to be wary of self-reported claims of success on an organization's website. But the conclusions of research papers in prestigious peer-reviewed academic journals can also be misleading or wrong - and many of these studies can't be replicated.
- Consider the source. Is the data published and where? Who's measuring the outcome? Is it a source you trust?
- Understand the statistics. You should have a basic grasp of the data the researchers collected and how they analyzed it. If you don't speak statistics, try Khan Academy.
- Look at what's being measured. Some ways this can mislead: A hospital or doctor's office may get very high marks from patients - but that's likely to reflect waiting times and staff courtesy, not health outcomes. Some researchers will report success among a subgroup - but fail to report that the intervention had no effect on the total group.
- Be mindful of the pressures to report success. Researchers have no incentive to discover that something doesn't work, and plenty of incentive to find that it does. This creates bias - conscious or unconscious - towards interpreting results in the best possible light.
- Cui bono? "Who gains?" is always a key question. One example: some organizations that rate health care providers make money from the providers they rate.
- Seek independent expert opinion. Ask experts not involved in the program if the numbers make sense.
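One pitfall from the checklist above - success reported in a subgroup while the intervention did nothing overall - is easy to see with a toy simulation. This sketch is hypothetical: it models an intervention with no real effect, then slices the study group many ways and asks which slice looks best.

```python
import random

# Hypothetical simulation: every patient has the same 50% chance of
# "improving," so the intervention does nothing. But slice the study
# into enough subgroups and some slice will look like a success.
random.seed(42)

def improvement_rate(n, p=0.5):
    """Simulate n patients, each improving with probability p."""
    return sum(random.random() < p for _ in range(n)) / n

overall = improvement_rate(2000)  # the whole study group
subgroups = {f"subgroup {i}": improvement_rate(100) for i in range(20)}

best = max(subgroups, key=subgroups.get)
print(f"Overall improvement rate: {overall:.1%}")
print(f"Best-looking slice: {best} at {subgroups[best]:.1%}")
```

The best-looking subgroup will typically sit well above the overall rate even though nothing real is happening - which is why a study that leads with a subgroup finding, without reporting the overall result, deserves a skeptical read.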