Crowdsourcing the credibility of empirical research.

Scientists can rely on an empirical finding only if it is credible. In science, a credible finding is one that has (at minimum) survived scrutiny along three dimensions: (1) method/data transparency, (2) analytic reproducibility/robustness, and (3) effect replicability. Curate Science is a platform to crowdsource the credibility of empirical research by curating its transparency, reproducibility/robustness, and replicability.

UPDATE (April 19, 2018): New unified curation framework released (version 5.2.0) and an important 2-year grant secured to scale up the platform (see here for details).

Curated List of Large-Scale Replication Efforts

Searchable table of N=1,058 replications of 168 effects from the published cognitive and social psychology literature.

Examples: "RPP" for Reproducibility Project: Psychology replications; "ML1" or "ML3" for Many Labs 1 or 3 replications; "RRR" for Registered Replication Reports; "SP: Spec" for Social Psychology's Special Issue replications. For topical searches, try "priming", "anchoring", "gambler's fallacy", "love", "moral" (for morality), or "power posing."

Icon legend (icons shown on the site): data; study materials; pre-registered protocol; link to a replication's associated evidence collection. To sort replications, click on the column headers. Reveal overflow text (...) by hovering over a cell. For details about replication outcome values, hover over a cell and see the About section. For additional replication study characteristics (and to see hidden imprecise large-scale-effort replications), see our public gSheet (see also our GitHub repo).