Journal article

Meta-studies for robust tests of theory

Beth Baribault, Chris Donkin, Daniel R Little, Jennifer Trueblood, Zita Oravecz, Don van Ravenzwaaij, Corey White, Paul De Boeck, Joachim Vandekerckhove

Center for Open Science

Abstract

We describe and demonstrate an empirical strategy useful for discovering and replicating empirical effects in psychological science. The method involves the design of a meta-study, in which many independent experimental variables, which may be moderators of an empirical effect, are indiscriminately randomized. Radical randomization yields rich data sets that can be used to test the robustness of an empirical claim against some of the vagaries and idiosyncrasies of experimental protocols, and it enhances the generalizability of such claims. The strategy is made feasible by advances in hierarchical Bayesian modeling that allow for the pooling of information across unlike experiments and designs, and is …
