
Do social science research findings published in Nature and Science replicate?

Replications of 21 high-profile social science findings demonstrate challenges for the reproducibility of research published in the most prestigious scientific journals. On average, the replication effect magnitudes were about 50% smaller than those of the original studies, and 8 of the 21 replications failed to find significant evidence for the original finding. Researchers betting in prediction markets were highly accurate in predicting which findings would replicate and which would not. The limited reproducibility highlights the importance of improving science policies and practices to strengthen research credibility. The study is published today in Nature Human Behaviour and is a large international collaboration involving researchers from, among other institutions, the Stockholm School of Economics.

Today, in Nature Human Behaviour, a collaborative team of five laboratories published the results of 21 high-powered replications of social science experiments originally published in Science and Nature, two of the most prestigious journals in science. The team found that 13 of the 21 replications (62%) showed significant evidence consistent with the original hypothesis. On average, the replication studies also showed effect sizes about 50% smaller than those of the original studies. Together, this suggests that reproducibility is imperfect even among studies published in the most prestigious journals in science.
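
As a quick check on the headline figures, the arithmetic behind them is a simple sketch; the per-study outcomes and the exact replication criterion are detailed in the paper and its supplementary materials, not reproduced here.

```python
# Minimal sketch of the headline arithmetic reported in the article.
# The aggregate counts come from the press release; per-study data and
# the exact replication criterion are in the paper itself.

replications_total = 21
replications_successful = 13  # significant evidence consistent with the original

replication_rate = replications_successful / replications_total
failed = replications_total - replications_successful

print(f"Replication rate: {replication_rate:.0%}")  # -> 62%
print(f"Failed to find significant evidence: {failed} of {replications_total}")  # -> 8 of 21
```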

“These results show that ‘statistically significant’ scientific findings need to be interpreted very cautiously until they have been replicated, even if published in the most prestigious journals,” said Magnus Johannesson of the Stockholm School of Economics, one of the project leaders.

Prior to conducting the replications, the team set up prediction markets in which other researchers could bet money on whether they thought each finding would replicate, earning (or losing) money according to the actual outcomes. The markets were highly accurate in predicting which studies would later succeed or fail to replicate.
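
To make the mechanism concrete, below is a minimal sketch of one common prediction-market design, a logarithmic market scoring rule (LMSR) automated market maker, in which the price of a “will replicate” share can be read as the market's consensus probability of replication. The LMSR design and the liquidity parameter are illustrative assumptions; the study's actual market implementation may have differed in its details.

```python
import math

# Sketch of a logarithmic market scoring rule (LMSR) automated market maker,
# one common prediction-market mechanism (an illustrative assumption here;
# the study's actual market design may have differed).

B = 100.0  # hypothetical liquidity parameter


def price(q_yes: float, q_no: float, b: float = B) -> float:
    """Price of a 'will replicate' share, readable as the market's
    consensus probability that the finding will replicate."""
    e_yes = math.exp(q_yes / b)
    e_no = math.exp(q_no / b)
    return e_yes / (e_yes + e_no)


def cost(q_yes: float, q_no: float, b: float = B) -> float:
    """LMSR cost function; a trade costs cost(after) - cost(before)."""
    return b * math.log(math.exp(q_yes / b) + math.exp(q_no / b))


# A trader who believes a finding will replicate buys 'yes' shares,
# which pushes the implied probability up.
q_yes, q_no = 0.0, 0.0
print(f"Implied probability before trading: {price(q_yes, q_no):.2f}")  # 0.50

shares = 50.0
trade_cost = cost(q_yes + shares, q_no) - cost(q_yes, q_no)
q_yes += shares
print(f"Cost of buying {shares:.0f} 'yes' shares: {trade_cost:.2f}")
print(f"Implied probability after trading: {price(q_yes, q_no):.2f}")  # ~0.62
```

In such a market, a trader who thinks the current price understates the true replication probability profits in expectation by buying “yes” shares, which is what makes the final price an aggregate forecast of the crowd's beliefs.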

Anna Dreber of the Stockholm School of Economics, another project leader, added: “Using prediction markets could be another way for the scientific community to use resources more efficiently and accelerate discovery.” 

Brian Nosek, executive director of the Center for Open Science and one of the co-authors, noted: “Someone observing these failures to replicate might conclude that science is going in the wrong direction. In fact, science’s greatest strength is its constant self-scrutiny to identify and correct problems and increase the pace of discovery.” This large-scale replication project is just one part of an ongoing reformation of research practices. Researchers, funders, journals, and societies are changing policies and practices to nudge the research culture toward greater openness, rigor, and reproducibility.

A comprehensive information page for the SSRP project, including contacts, relevant articles, links to the papers, and supplemental materials, can be accessed at https://cos.io/SSRP.

The following researchers from the Stockholm School of Economics participated in the study:

Anna Dreber, Professor, Department of Economics
Magnus Johannesson, Professor, Department of Economics
Adam Altmejd, PhD student, Department of Economics
Emma Heikensten, PhD student, Department of Economics
Siri Isaksson, PhD student, Department of Economics

For more information, contact:

Anna Dreber, anna.dreber@hhs.se, +46 8 7369646

Magnus Johannesson, magnus.johannesson@hhs.se, +46 8 7369443

About the Center for Open Science

The Center for Open Science (COS) is a non-profit technology and culture-change organization founded in 2013 with a mission to increase the openness, integrity, and reproducibility of scientific research. COS pursues this mission by building communities around open science practices, supporting metascience research, and developing and maintaining free, open-source software tools. The OSF is a web application that addresses the challenges facing researchers who want to pursue open science practices, enabling them to manage their work, collaborate with others, discover and be discovered, preregister their studies, and make their code, materials, and data openly accessible. Learn more at cos.io and osf.io.

Contacts for the Center for Open Science

Media: Rusty Speidel: rusty@cos.io | 434-284-3403

SSRP & General Commentary: Brian Nosek: nosek@cos.io
    (Executive Assistant Dawne Aycock, dawne@cos.io)

Policy Reform Commentary: David Mellor: david@cos.io

Web: https://cos.io/SSRP