I have a few tips on how to read coverage of original research. Please note that this post is aimed primarily at lay readers who are simply seeking answers to questions that interest them. That said, even those of us who are experts seek answers to questions in fields where we have minimal knowledge.
First, ask yourself a question when reading coverage of a new study: Is this a new finding or not? If it is new, I advise skepticism. Many novel findings do not replicate. Beware especially of findings that appear too good to be true, or too outrageous to be true. Maybe drinking coffee extends one's life. If so, based on the mass quantities of coffee I consume, I will probably live forever. But I wouldn't bet on it.
Next: If you answered no, is this a replication attempt? If it is, then you really have something worth reading. Follow-up questions: Did the replication attempt succeed or not? Are there multiple replication attempts, and is there a consistent pattern? Answers to those questions will give you an idea of whether or not a phenomenon is real. If I find that there are indeed multiple studies successfully linking coffee consumption to an increased lifespan, I can rest easily knowing that my pot a day of the thickest, sludgiest coffee imaginable (people blocks away are awakened by the smell of my coffee as it brews!) will allow me to live longer than I might have otherwise.
Finally, if the coverage is not of a replication study but of a meta-analysis, you have some interesting decisions to make. Keep in mind that a meta-analysis is not the last word on the matter. Meta-analyses are only as good as their database and the techniques used to assess the impact of publication bias. Look for any coverage of how the meta-analysts dealt with publication bias. If they simply relied on trim-and-fill analyses, treat any conclusions with caution. If the researchers used methods with names such as PET-PEESE or p-curve, or some battery of methods to detect publication bias, and the effect size still appears to be reliable, then you have a set of findings worth trusting, at least for now. I am not aware of a meta-analysis on coffee consumption and lifespan (that does not mean one does not exist!), but if it used the most rigorous analyses and the effect appeared to be as strong as the coffee I brew, then I could argue reasonably that our youth need to consider not only the world they are leaving behind for Keith Richards (who seems to be alive in spite of himself), but also for me. If, on the other hand, the effect size corrected for publication bias is not reliable, then I may want to rethink my coffee habit (if my goal is a long life).
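For readers who want a feel for what a correction like PET-PEESE actually does, here is a minimal sketch in Python. The effect sizes and standard errors below are made up purely for illustration (they are not real coffee data), and this is only the core idea: each study's effect size is regressed on its standard error (PET) or on its variance (PEESE), and the intercept is read as the effect we would expect from a study of essentially perfect precision, i.e. one less distorted by small-study and publication bias.

```python
# Minimal sketch of the PET-PEESE idea using hypothetical study results.
# These numbers are invented for illustration only.
import numpy as np
import statsmodels.api as sm

# Hypothetical per-study effect sizes (d) and their standard errors (se)
d = np.array([0.45, 0.30, 0.55, 0.10, 0.25, 0.60, 0.05, 0.35])
se = np.array([0.20, 0.12, 0.25, 0.05, 0.10, 0.28, 0.04, 0.15])

# Inverse-variance weights, as in a standard meta-analysis
weights = 1.0 / se**2

# PET: regress effect size on standard error
# PEESE: regress effect size on variance (standard error squared)
pet = sm.WLS(d, sm.add_constant(se), weights=weights).fit()
peese = sm.WLS(d, sm.add_constant(se**2), weights=weights).fit()

# The intercepts estimate the bias-corrected effect: what a study with
# (near) zero standard error would be expected to find.
print("PET intercept (bias-corrected effect):  ", round(pet.params[0], 3))
print("PEESE intercept (bias-corrected effect):", round(peese.params[0], 3))
```

If the corrected intercept is close to zero while the raw average effect is large, that is a warning sign that the literature's apparent effect may be driven by small, imprecise studies, which is exactly the situation the paragraph above tells you to treat with caution.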
One last point: A lot of media coverage of research is based on press releases. Sometimes science journalists have the luxury of going beyond the press release, but not always. Press releases often gloss over the finer points of a study, including the many caveats that the authors would want to make. I avoid press releases about my own work precisely to avoid the possibility of it being spun in some way that I did not intend. Beyond that, don't just gravitate toward findings that fit your preconceived beliefs. Go outside your bubble and look at research that appears to challenge what you believe. I love coffee. However, if it turns out that the way I consume it is not good for me, I need to know, even if I do not like the answers I find.