8/27/15 ASHLAND, Ohio – Dr. Christopher R. Chartier, assistant professor of psychology at Ashland University, and two of his AU students were among the 270 co-authors of a report investigating the reproducibility of psychological science that was published today in the journal Science.
Launched nearly four years ago, the study, titled Reproducibility Project: Psychology, is the most comprehensive investigation ever completed regarding the rate and predictors of reproducibility in a field of science, according to Chartier.
The project conducted replications of 100 published findings of three prominent psychology journals. The international team found that regardless of the analytic method or criteria used, fewer than half of their replications produced the same findings as the original study.
Two of the replications were carried out in Chartier’s Social Decision Making laboratory at Ashland University, and both closely reproduced the originally published results.
“The results of this collaborative investigation identify significant challenges in reproducing research findings, as well as suggest predictors of reproducibility and sound research practices that can improve our confidence in published findings,” Chartier said.
Chartier added that science is unique from other ways of gaining knowledge by relying on reproducibility to gain confidence in ideas and evidence. “Reproducibility means that the results recur when the same data are analyzed again, or when new data are collected using the same methods,” he said.
The AU students who worked with Chartier and are listed as co-authors on the project are Olivia Perna, a senior from Tiffin, Ohio, and Taylor Nervi of Alliance, Ohio, who graduated with a Bachelor of Science degree in psychology this past spring.
Perna, who is majoring in psychology, said, “Replication efforts are important for any scientific discipline, not just psychology. Our trust in any particular finding grows with each successful demonstration of a phenomenon.”
Chartier noted that fewer than half of the original findings were successfully replicated, a result that held true across multiple criteria of success, and that the team identified three basic reasons this might occur.
“One, even though most replication teams worked with the original authors to use the same materials and methods, small differences in when, where or how the replication was carried out might have influenced the results,” he said. “Two, the replication might have failed to detect the original result by chance; and three, the original result might have been a false positive.”
Johanna Cohoon, a project coordinator from the Center for Open Science, concluded that “the findings demonstrate that reproducing original results may be more difficult than is presently assumed, and interventions may be needed to improve reproducibility.”
In keeping with the goals of openness and reproducibility, every replication project posted its methods on a public website, and later added its raw data and computer code for reproducing its analyses.
According to Chartier, many organizations, funders, journals and publishers are already working on improving reproducibility. For example, in 2014, the journal Psychological Science, one of the journals included in this study, implemented practices to improve reproducibility, such as badges acknowledging the open sharing of materials and data.
“Efforts include increasing transparency of original research materials, code and data so that other teams can more accurately assess, replicate and extend the original research, and pre-registration of research designs to increase the robustness of the inferences drawn from the statistical analyses applied to research results,” said Denny Borsboom, a team member from the University of Amsterdam who was also involved in creation of the Transparency and Openness Promotion (TOP) Guidelines published in Science in June.
Since the Reproducibility Project began in 2011, similar projects have emerged in other fields, such as the Reproducibility Project: Cancer Biology. A discipline of metascience is emerging: scientific research about scientific research.
These and the widespread efforts to improve research transparency and reproducibility are indications that, as suggested by Dr. Chartier, “the scientific community is taking seriously the type of self-examination and self-correction necessary to improve research quality and public confidence in published findings. My students and I plan to actively continue our support of and participation in these metascientific efforts here at Ashland University.”
Ashland University is a mid-sized, private university conveniently located a short distance from Akron, Cleveland and Columbus, Ohio. Ashland University (www.ashland.edu) values the individual student and offers a unique educational experience that combines the challenge of strong, applied academic programs with a faculty and staff who build nurturing relationships with their students. ###