2019
DOI: 10.1038/sdata.2019.30

Assessing data availability and research reproducibility in hydrology and water resources

Abstract: There is broad interest in improving the reproducibility of published research. We developed a survey tool to assess the availability of digital research artifacts published alongside peer-reviewed journal articles (e.g., data, models, code, directions for use) and the reproducibility of article results. We used the tool to assess 360 of the 1,989 articles published by six hydrology and water resources journals in 2017. Like studies from other fields, we reproduced results for only a small fraction of articles (1.6% …

Cited by 89 publications (92 citation statements)
References 50 publications
“…Comparing our estimate of 30% empirical reproducibility to similar findings of 38% (Vanpaemel et al., 2015), 32% (Hardwicke and Ioannidis, 2018), 26% (Wicherts et al., 2006), and 19% (Vines et al., 2014), extending data lifetime appears consistently difficult to achieve across disciplines. Our estimate of 80% analytical reproducibility is difficult to quantitatively compare against similar audits, which have found reproducibility of results anywhere from 83% (Andrew et al., 2015) and 70% (Gilbert et al., 2012) down to 1.1% (Stagge et al., 2019) of surveyed publications. This is likely because criteria for defining a successful reproduction effort, given materials, are currently ambiguous.…”
Section: Implications for Evolutionary Anthropology and Ecology
Confidence: 64%
“…This person could be a new student who needs to get up to speed on the methods or someone else interested in the study or results. • Ask this colleague, student, or other person to use a reproducibility survey tool (e.g., Stagge et al., 2019) to provide feedback on the repository, directions, and results that they reproduced. • If the person can reproduce results, acknowledge their effort in the manuscript's data availability or results reproducibility section.…”
Section: Verify Your Results Are Reproducible
Confidence: 99%
“…Follow Good Examples 1. Adopt the practices of the six articles that Stagge et al. (2019) awarded badges for fully and partially reproducible results. For example, these papers • Provided all models and code in a GitHub (Buscombe 2017; Neuwirth 2017; Xu et al. 2017), institutional (Yu et al. 2017), or HydroShare (Horsburgh et al. 2017) repository.…”
Section: Verify Your Results Are Reproducible
Confidence: 99%
“…To minimize the effect of this loss and meet FAIR standards, it is critical to also include detailed information about the anonymization procedure via metadata and shared code, ideally using open-source tools integrating version control for transparency, to allow for interoperability and usability by other researchers (Bakker, 2019; Lowndes et al., 2017; Stagge et al., 2019). When possible, researchers should leave jurisdiction of sensitive data to the agencies responsible for collecting and warehousing these data; where there is no such organization, they should provide synthetic examples of the data so that others can understand and replicate the anonymization procedure.…”
Section: Sharing Private Data
Confidence: 99%