The Center for Open Science and a team of collaborators are conducting a DARPA-funded project to assess and improve the reproducibility of research in the social and behavioral sciences. We created a large database of claims from papers across the social-behavioral sciences. A subset of these claims has been randomly selected for replication studies. We have completed sourcing labs to conduct new data collection replications for the first phase. We are now sourcing individuals and teams to help find existing datasets that can test prior claims using data different from the original studies'.
Contributors receive a spreadsheet of about 500 papers from many social-behavioral areas of research, including public administration, political science, economics, criminology, sociology, marketing, business, education, psychology, and more. Contributors search for potential datasets that could be used to replicate the original papers' claims. If they find promising datasets, they submit them for preliminary review. If approved in preliminary review, the data finders prepare the dataset for potential replication. The prepared dataset is then reviewed against project criteria and passed to a data analyst team, who will prepare a preregistration of the analysis plan and conduct the analysis. Data finders receive $2000 for completing the data preparation.
If you are interested in contributing as a data finder for the SCORE program, or want to hear about future opportunities to join this large collaboration, then please fill out this brief interest form. Project leads for the existing-data replications, Andrew Tyner (andrewtyner@cos.io) or Anna Abatayo (anna@cos.io), will then reach out and provide more information.
For more general inquiries about the project, contact Brian Nosek (PI: nosek@cos.io), Tim Errington (COS's Director of Research: tim@cos.io), or Bea Arendt (Program Manager: beatrix@cos.io).