2017
DOI: 10.1038/s41562-016-0021

A manifesto for reproducible science

Abstract: Improving the reliability and efficiency of scientific research will increase the credibility of the published scientific literature and accelerate discovery. Here we argue for the adoption of measures to optimize key elements of the scientific process: methods, reporting and dissemination, reproducibility, evaluation and incentives. There is some evidence from both simulations and empirical studies supporting the likely effectiveness of these measures, but their broad adoption by researchers, institutions, fu…

Cited by 2,538 publications (2,378 citation statements). References 100 publications.

“…Insufficient data documentation has often been discussed as a major obstacle for replication studies (Ioannidis et al., 2009; Peters et al., 2016; Munafò et al., 2017). A way to increase the reproducibility of research could, therefore, be the implementation and adoption of data documentation and availability policies by scientific journals and funding agencies.…”
Section: Discussion and Policy Implications
Citation type: mentioning
confidence: 99%
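
These citing authors frame data documentation as a policy problem, but the documentation itself can be made concrete and machine-readable. The sketch below is a minimal, hypothetical data availability record in Python; the field names, example values, and placeholder identifier are illustrative assumptions, not a schema required by any journal or funder.

import json

# A minimal, hypothetical data availability record for a deposited
# dataset. Field names are illustrative; real policies may mandate a
# specific repository schema instead.
record = {
    "title": "Example survey dataset",      # human-readable dataset name
    "repository": "OSF",                    # where the data are deposited
    "identifier": "doi:10.0000/example",    # placeholder persistent identifier
    "license": "CC-BY-4.0",                 # reuse terms
    "access": "open",                       # open / restricted / on request
    "variables": [                          # minimal data dictionary
        {"name": "age", "unit": "years", "description": "participant age"},
        {"name": "score", "unit": "points", "description": "task accuracy"},
    ],
}

print(json.dumps(record, indent=2, ensure_ascii=False))

A record like this, deposited alongside the data, gives a replication team the variable-level detail that the quoted statement identifies as so often missing.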
“…Our story is only one potential path because there are many ways to upgrade scientific practices, whether collaborating only with your ‘future self’ or as a team, and they depend on the shared commitment of individuals, institutions and publishers 6,16,17. We do not review the important, ongoing work regarding data management architecture and archiving 8,18, workflows 11,19–21, sharing and publishing data 22–25 and code 25–27, or how to tackle reproducibility and openness in science 28–32. Instead, we focus on our experience, because it required changing the way we had always worked, which was extraordinarily intimidating.…”
Citation type: mentioning
confidence: 99%
“…The “Principle of Maximum Bootstrapping” outlined by Kelty et al. (2008) is highly congruent with this social ideal for peer review, where new systems are based on existing communities of expertise, quality norms, and mechanisms for review. Diversifying peer review in such a manner is an intrinsic part of a system of reproducible research (Munafò et al., 2017). Making use of persistent identifiers such as DataCite, CrossRef, and ORCID will be essential in binding the social and technical aspects of this to an interoperable, sustainable and open scholarly infrastructure (Dappert et al., 2017).…”
Section: Results
Citation type: mentioning
confidence: 99%
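
The persistent identifiers named in this statement are machine-resolvable, which is what makes them usable as infrastructure. As one illustration, the sketch below queries the public CrossRef REST API (the documented https://api.crossref.org/works/<doi> endpoint, assuming the third-party requests library is installed) for the metadata record behind this paper's own DOI.

import requests

# Resolve a DOI to its CrossRef metadata record via the public REST API.
DOI = "10.1038/s41562-016-0021"  # the manifesto's DOI, from the header above

resp = requests.get(f"https://api.crossref.org/works/{DOI}", timeout=10)
resp.raise_for_status()
msg = resp.json()["message"]

# "title" is a list in CrossRef records; "is-referenced-by-count" is
# CrossRef's own citation tally, which may differ from the count above.
print(msg["title"][0])
print("CrossRef citation count:", msg.get("is-referenced-by-count"))

The same pattern extends to ORCID and DataCite, each of which exposes its own public API for resolving identifiers to records.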