
New research | Examining the replicability of online experiments selected by a decision market

A new large-scale replication project of online social science experiments showed an overall replication rate of 54%. Anna Dreber Almenberg and Magnus Johannesson, Professors at the Department of Economics at SSE, and co-authors have published a new article in Nature Human Behaviour.

Dreber and Johannesson led a large international collaborative project testing the feasibility of using decision markets to select studies for replication and evaluating the replicability of online experiments. A decision market, in which social scientists traded on the outcome of replications of 41 systematically selected MTurk social science experiments published in PNAS, was used to select the 12 studies with the lowest and the 12 with the highest market prices for replication (along with two randomly selected studies). The replication rate, defined as a statistically significant effect in the same direction as the original study, was 83% for the top-12 group, 33% for the bottom-12 group, and 54% overall. The replication effect sizes were on average 45% of the original effect sizes. The observed replicability of MTurk experiments is similar to that of previous systematic replication projects involving laboratory experiments.
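To make the selection rule and the replication-success criterion concrete, the following minimal Python sketch illustrates them. It is not the authors' code; the study identifiers, market prices and thresholds are hypothetical placeholders used only for illustration.

import random

def select_for_replication(prices, n_extreme=12, n_random=2, seed=0):
    # Pick the n_extreme studies with the lowest and the n_extreme with the
    # highest final market prices, plus n_random of the remaining studies.
    ranked = sorted(prices, key=prices.get)          # study ids ordered by final price
    bottom, top = ranked[:n_extreme], ranked[-n_extreme:]
    rest = ranked[n_extreme:-n_extreme]
    random.Random(seed).shuffle(rest)
    return bottom, top, rest[:n_random]

def replicated(original_effect, replication_effect, p_value, alpha=0.05):
    # Statistical-significance indicator: a significant replication effect
    # in the same direction as the original study's effect.
    same_direction = (original_effect > 0) == (replication_effect > 0)
    return p_value < alpha and same_direction

# Hypothetical example: 41 studies with final market prices between 0 and 1.
prices = {f"study_{i:02d}": random.Random(i).random() for i in range(41)}
bottom12, top12, extra = select_for_replication(prices)
print(len(bottom12), len(top12), len(extra))  # 12 12 2

The per-group replication rate is then simply the share of selected studies for which the indicator is true, which is how the 83%, 33% and 54% figures above are defined.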

Abstract

Here we test the feasibility of using decision markets to select studies for replication and provide evidence about the replicability of online experiments. Social scientists (n = 162) traded on the outcome of close replications of 41 systematically selected MTurk social science experiments published in PNAS 2015–2018, knowing that the 12 studies with the lowest and the 12 with the highest final market prices would be selected for replication, along with 2 randomly selected studies. The replication rate, based on the statistical significance indicator, was 83% for the top-12 and 33% for the bottom-12 group. Overall, 54% of the studies were successfully replicated, with replication effect size estimates averaging 45% of the original effect size estimates. The replication rate varied between 54% and 62% for alternative replication indicators. The observed replicability of MTurk experiments is comparable to that of previous systematic replication projects involving laboratory experiments.
