
Search among 140 articles and 6 collections reporting 1,161 replications of 205 effects in the social and life sciences. For replication details, view the associated article or collection (if available), or see the Replications page (table view).


Our Approach

(See our white paper for full details of our approach.)

Science requires transparency. Different transparency standards, however, apply to different kinds of empirical research. Curate Science is a community platform to organize the transparency of published findings according to the relevant standards. It does so for the 3 most fundamental aspects of transparency:

Method/Data Transparency
Effect Replicability Transparency

Transparent new sample replications of published effects.
Analytic Reproducibility Transparency
Transparent analytic reproducibility and robustness re-analyses.
Analytic Reproducibility:
A study's primary result is reproducible by repeating the same statistical analyses (and data processing choices) on the data.
Analytic Robustness:
A study's primary result is robust across all justifiable (alternative) statistical analyses and data processing choices.
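As a toy illustration of the distinction between these two concepts (this is not platform code; the dataset, function names, and the "outlier" cutoff below are all invented for the example), consider a study whose primary result is the mean of a small sample:

```python
# Illustrative sketch: analytic reproducibility vs. analytic robustness
# on a toy dataset. All names and values here are hypothetical.

data = [2.1, 2.5, 1.9, 2.8, 3.0, 2.2, 9.5]  # last value is a potential outlier

def primary_analysis(values):
    """The original authors' analysis: the arithmetic mean of all values."""
    return sum(values) / len(values)

# Analytic reproducibility: re-running the *same* analysis (and the same
# data processing choices) on the same data yields the same primary result.
original_result = primary_analysis(data)
reproduced_result = primary_analysis(data)
assert reproduced_result == original_result

# Analytic robustness: the result should also hold across *justifiable
# alternative* choices, e.g. excluding extreme values, or summarizing
# with the median instead of the mean.
def exclude_outliers(values, cutoff=3.0):
    return [v for v in values if v <= cutoff]

def median(values):
    s = sorted(values)
    mid = len(s) // 2
    return s[mid] if len(s) % 2 else (s[mid - 1] + s[mid]) / 2

alternatives = [
    primary_analysis(exclude_outliers(data)),  # alternative data processing
    median(data),                              # alternative statistic
]

# The qualitative claim "the central tendency exceeds 2.0" is robust here,
# even though the exact mean is sensitive to the outlier.
robust = original_result > 2.0 and all(r > 2.0 for r in alternatives)
print(robust)  # → True
```

The point of the sketch: reproducibility asks whether the same pipeline yields the same number, while robustness asks whether the substantive conclusion survives defensible alternative pipelines.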

The platform allows researchers to label the transparency of their empirical research: think nutritional labels for scientific papers (but much more!). This not only publicly recognizes researchers who take the extra effort to report their research transparently, but also maximizes the re-use, value, and impact of research (for the full list of benefits to various research stakeholders, see Table 1 of our white paper).

Practicing what we preach, the platform is developed openly, with an open-source code base, an open license, and eventually an open API. For details on the theoretical framework that guides the platform's design, see our recently published paper in Advances in Methods and Practices in Psychological Science (LeBel, McCarthy, Earp, Elson, & Vanpaemel, in press).

Thanks to a 2-year grant from the European Commission (Marie Curie grant), we will soon be expanding the website to support curation at a larger scale.


Current Contributors
Current contributors help with the conceptual development of Curate Science, the writing and editing of related manuscripts, and/or curation.

Etienne P. LeBel
KU Leuven
Founder & Lead

Wolf Vanpaemel
KU Leuven

Touko Kuusi
University of Helsinki

Randy McCarthy
Northern Illinois University

Brian Earp
University of Oxford

Malte Elson
Ruhr University Bochum

Current Advisory Board (as of June 2017)
Advisory board members periodically provide feedback on grant proposals and related manuscripts, as well as general advice regarding Curate Science's current focus areas and future directions.

Susann Fiedler
Max Planck Institute - Bonn

Anna van't Veer
Leiden University

Julia Rohrer
Max Planck Institute - Berlin

Michèle Nuijten
Tilburg University

Dorothy Bishop
University of Oxford

Brent Roberts
University of Illinois - Urbana-Champaign

Hal Pashler
University of California - San Diego

Daniel Simons
University of Illinois - Urbana-Champaign

Alex Holcombe
University of Sydney

E-J Wagenmakers
University of Amsterdam

Katie Corker
Grand Valley State University

Simine Vazire
University of California – Davis

Richard Lucas
Michigan State University

Marco Perugini
University of Milan-Bicocca

Lorne Campbell
University of Western Ontario

Eric Eich
University of British Columbia

Mark Brandt
Tilburg University

Technical Advisors

Alex Kyllo

Mike Morrison

Frequently Asked Questions


What was the original inspiration for Curate Science?

The idea behind Curate Science originated in 2014 amidst the bustling early days of the "open science movement" in psychology. Several new transparency and replication initiatives were emerging. The idea was to try to organize all this information in one place, creating a kind of public commons for the research community (or "science-commons", which was our original name).

Who started Curate Science?

Curate Science was started by 2 academic researchers (Etienne LeBel and Christian Battista) and 2 volunteer Silicon Valley software developers.

Why has progress been so slow?

Progress has been slow for several reasons. From 2014 to 2017, Curate Science operated as a side project with very limited funding (and only occasional help from volunteer and paid freelance software developers, e.g., Alex Kyllo). Complexities also emerged from the different ways transparency and replications are reported across studies and articles. However, based on curating over 1,200 replications of 200+ effects reported in hundreds of articles in the social and life sciences (the largest known curation effort of its kind), we have now developed a highly flexible ontology that accommodates the curation of transparency and replications from heterogeneous kinds of studies and articles.

Present/Current Focus

What is our current focus/main activities?

We're currently focused on 2 main activities: (1) curating the transparency of empirical articles (with respect to 5 fundamental transparency categories; see above) and (2) tracking (new sample) replications of published effects. This primarily involves manual curation; however, we rely on several tools to increase curation efficiency and accuracy (e.g., various R scripts and Shiny apps, and article-metadata extraction tools built on the "scholar" and "rcrossref" R packages).
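The R tooling itself isn't shown on this page; as a language-agnostic illustration of the metadata-extraction step, the sketch below (in Python, with a fabricated record and a hypothetical `extract_metadata` helper) pulls the basic fields a curator needs from a record in the shape returned by the public Crossref REST API:

```python
# Illustrative sketch, not the project's actual tooling: extract the
# article metadata a curator needs from a Crossref-style record.
# The sample record below is fabricated for the example.

def extract_metadata(record):
    """Pull title, year, journal, and DOI from a Crossref 'message' dict."""
    msg = record["message"]
    return {
        "title": msg.get("title", [""])[0],
        "year": msg.get("issued", {}).get("date-parts", [[None]])[0][0],
        "journal": msg.get("container-title", [""])[0],
        "doi": msg.get("DOI", ""),
    }

# A minimal record in the shape the Crossref API returns (fields abbreviated;
# title, journal, year, and DOI here are all made up):
sample = {
    "message": {
        "title": ["Example Article"],
        "issued": {"date-parts": [[2017, 6, 1]]},
        "container-title": ["Example Journal"],
        "DOI": "10.0000/example.1234",
    }
}

print(extract_metadata(sample)["doi"])  # → 10.0000/example.1234
```

In practice, a record like `sample` would come from a lookup by DOI (which is what packages such as "rcrossref" wrap), after which the extracted fields can be stored alongside the curated transparency information.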

Who is currently working on Curate Science?

Etienne LeBel (project founder) is currently the main contributor. Wolf Vanpaemel contributes in major ways, both conceptually and in leveraging grant funding. Touko Kuusi is currently the main volunteer curator. Randy McCarthy, Brian Earp, and Malte Elson (and Vanpaemel) are substantial contributors to the overarching framework white paper that guides the design and development of the platform (recently published in Advances in Methods and Practices in Psychological Science; LeBel, McCarthy, Earp, Elson, & Vanpaemel, in press). Finally, our 17-person advisory board, composed of trailblazers in the transparency and replication movement, provides regular feedback on grant proposals and Curate Science activities.

Who is funding Curate Science?

We're currently funded by a 2-year grant from the European Commission (Marie Curie grant). We have also submitted two other large grant proposals, currently under review (totalling over CAD $1,000,000 across 4 years), to fund additional software developers and PhD student curators/editors (one to a not-for-profit funder, the other to a Belgian public granting agency).

Misconceptions/Public Relations

Is Curate Science a "debunking website"?

No. Curate Science is a platform that organizes and tracks transparency and replication information about empirical research as accurately and impartially as possible, allowing the community of researchers to interpret published findings carefully and in nuanced ways.

Is Curate Science a "central authority" that provides "official stamps of approval" of trustworthy research?

No. As with our approach to curating replications, our goal is to curate the transparency of empirical research as accurately as possible rather than to adjudicate the quality of research. This allows the community of researchers to more effectively scrutinize published findings. And given the crowdsourced nature of the platform (coming soon), we will be the opposite of a central authority: transparency and replications will be curated by the broadest, most inclusive group of researchers possible, maximizing theoretical and viewpoint diversity.

Curate Science seems to have good intentions, but isn't it going to "stigmatize" older research conducted according to different standards?

Kind of, but no. It is true that today's (much needed) higher transparency standards in some ways make older research seem less impressive. However, Curate Science is committed to rewarding positive scientific behaviors rather than punishing questionable ones. Indeed, we make it easy to earn credit for reporting one's research even a little more transparently, that is, as transparently as you currently have time for and/or are comfortable with. For example, if you're uncomfortable publicly posting the data for an article, you could still earn credit by publicly posting your code and linking to it on Curate Science (only one transparency component is required for an article to be eligible for inclusion in our database).

Future directions/Road Map

I want to add my transparently reported articles. When will I be able to do so?

We're currently finalizing the specifications for larger-scale crowdsourced curation, which will be implemented soon. We will then test the platform with a small group of beta testers in the Fall of 2018, and plan to open it up to a larger group of researchers in early 2019. Sign up for our newsletter to receive regular updates on our progress!

What is coming ahead?

Many new features are currently in development that will make it even easier to access and interact with publicly available study components. See this page for some of these new features.


Please sign up below to receive the Curate Science Newsletter and be notified about news and updates.
See past announcements.