For scientists who find themselves in the crosshairs, the experience can feel bruising. Several years ago, the Tilburg group—now more than a dozen faculty members and students—unveiled an algorithm, dubbed statcheck, to spot potential statistical problems in psychology studies. They ran it on tens of thousands of papers and posted the troubling results on PubPeer, a website for discussion of published papers. Some researchers felt unfairly attacked; one eminent psychologist insinuated that the group was part of a “self-appointed data police” harassing members of the research community.
Van Assen and Wicherts say it was worth stepping on some toes to get the message across, and to flag mistakes in the literature. Members of the group have become outspoken advocates for statistical honesty, publishing editorials and papers with tips for how to avoid biases, and they have won fans. “I'm amazed that they were able to build that group. It feels very progressive to me,” says psychologist Simine Vazire of the University of California, Davis, a past chair of the executive committee of the Society for the Improvement of Psychological Science (SIPS).
The work by the Tilburg center and others, including SIPS and COS, is beginning to have an impact. The practice of preregistering studies—declaring a plan for the research in advance, which can lessen the chance of dodgy analyses—is growing rapidly (see story, p. 1192), as is making the data behind research papers immediately available so others can check the findings. Wicherts and others are optimistic that the perverse incentives of careerist academia, to hoard data and sacrifice rigor for headline-generating findings, will ultimately be fixed. “We created the culture,” Nosek says. “We can change the culture.”
Read the rest. One of the really cool things is finding their work on PubPeer (a website we social psychologists should use much more). This group's statcheck software, and what it can do, is truly amazing and necessary. Let's just say that when I see the name Nuijten in the comments for a particular article, I pay keen attention. Among the people I respect in my area, statcheck has generally found no errors or only minor errors that don't change the basic thrust of their findings. Among some others, well, that's another story.
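The core idea behind statcheck is simple: parse a reported result (test statistic, degrees of freedom, p-value), recompute the p-value from the statistic, and flag any mismatch beyond rounding. Here is a minimal sketch of that logic in Python for a z test; this is a hypothetical illustration of the principle, not the actual statcheck R package, which handles t, F, chi-square, and correlation tests as well.

```python
import math
import re

def two_tailed_p(z: float) -> float:
    """Two-tailed p-value for a standard-normal test statistic."""
    return math.erfc(abs(z) / math.sqrt(2))

def check_z_report(report: str, tol: float = 0.005) -> bool:
    """Return True if the reported p-value is consistent with the
    recomputed one, within a small rounding tolerance.

    `report` is expected in APA-like form, e.g. "z = 1.96, p = 0.050".
    (The parsing pattern here is a simplified stand-in for statcheck's
    much more thorough extraction of statistics from full-text papers.)
    """
    m = re.match(r"z\s*=\s*([\d.]+),\s*p\s*=\s*([\d.]+)", report)
    if m is None:
        raise ValueError(f"could not parse report: {report!r}")
    z, p_reported = float(m.group(1)), float(m.group(2))
    return abs(two_tailed_p(z) - p_reported) <= tol

print(check_z_report("z = 1.96, p = 0.050"))  # True: consistent
print(check_z_report("z = 1.96, p = 0.020"))  # False: would be flagged
```

Run over tens of thousands of papers, even a check this simple turns up a surprising number of reported p-values that don't match their own test statistics, which is exactly what the Tilburg group found.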
This is a useful article, and one that makes clear that although there is a bit of a paradigm shift under way in our field, we're far from a warm embrace of an open science approach. I am optimistic that expectations for open data sharing, open sharing of research protocols prior to running studies, and the like will be far more favorable this time next decade, but I am girding myself for the possibility that it may take considerably longer to get to that point. I am guessing that when the paradigm truly shifts, it will seem sudden. The momentum is already there, and thankfully we've gone well beyond the mere talk of change that my cohort basically managed. So there is that. Progress of a sort.
Be smart: the main motive of the various data sleuths is to make our science better. These are people who are not trying to destroy careers or hurt feelings; they are simply making sure that the work we do is as close an approximation to the truth as is humanly possible. My advice for mid- and late-career researchers is to embrace this new paradigm rather than resist it. I can guarantee that the new generation of psychological scientists will not have the patience for business as usual.