2020
DOI: 10.31223/x5zk5v
Preprint
Reproducible Research and GIScience: an evaluation using GIScience conference papers

Abstract: GIScience conference authors and researchers face the same computational reproducibility challenges as authors and researchers from other disciplines who use computers to analyse data. Here, to assess the reproducibility of GIScience research, we apply a rubric for assessing reproducibility to 75 conference papers published in the GIScience conference series in the years 2012-2018. Since the rubric and process were previously applied to the publications of the AGILE conference series, this paper itself is …

Cited by 4 publications (4 citation statements)
References 22 publications
“…Nevertheless, structured metadata for processes, managed together with dataset metadata is often lacking. This insufficient documentation of processes and datasets was identified by researchers with particular interest in the reproducibility of data creation processes (Baker, 2016; Lemos et al., 2012; Ostermann et al., 2021). Moreover, researchers lack support in generating and updating provenance information directly from the analysis workflows, which hampers efforts to produce detailed provenance information.…”
Section: Approaches and Pilots To Integrate Core Software Components ...
Citation type: mentioning
Confidence: 99%
“…For example, Ostermann and Granell (2017) use a literature review of volunteered geographic information research publications to assess computational reproducibility based on availability of original data, metadata, source code, or pseudocode. Researchers taking part in an ongoing reproducible research initiative of the Association of Geographic Information Laboratories (AGILE) in Europe have reviewed the computational reproducibility of 31 research papers submitted to the annual conference for the past three years (Nüst et al 2020, 2021, 2022) and 75 papers from the GIScience conference series (Ostermann et al 2021). In addition to assessing the availability of data, methods (code), and results, the researchers also attempted to independently re‐execute the analyses and share their findings as short reproducibility reports.…”
Section: The Reproduction Of Geographic Research
Citation type: mentioning
Confidence: 99%
“…In the validation phase, researchers analyse, visualise, interpret, and validate results while sharing preliminary findings in working papers and conferences. Surveys of publications presented in the AGILE (Nüst et al, 2018) and GIScience (Ostermann et al, 2021) conferences found the majority of papers irreproducible due to missing metadata, data, and procedures. At this phase, the overall project and any public project component can be registered and assigned a persistent link like the DOI through digital repositories like Open Science Framework (OSF) or figshare.…”
Section: Validation
Citation type: mentioning
Confidence: 99%