2023
DOI: 10.1177/09567976221140828
What’s in a Badge? A Computational Reproducibility Investigation of the Open Data Badge Policy in One Issue of Psychological Science

Abstract: In April 2019, Psychological Science published its first issue in which all Research Articles received the Open Data badge. We used that issue to investigate the effectiveness of this badge, focusing on the adherence to its aim at Psychological Science: sharing both data and code to ensure reproducibility of results. Twelve researchers of varying experience levels attempted to reproduce the results of the empirical articles in the target issue (at least three researchers per article). We found that all 14 arti…


Cited by 23 publications (17 citation statements)
References 44 publications
“…The findings of our study are consistent with previous research indicating that open data badges or mandatory open data policies do not necessarily guarantee actual data availability or independent computational reproducibility (Crüwell et al, 2023; Hardwicke et al, 2018, 2021). In the current study, we did not contact any of the original authors if the data were unavailable or were not accompanied by sufficient information to be reusable, but we assume the number of reproducibility checks and MI tests we could perform would have been higher if we had done so.…”
Section: Discussion (supporting)
confidence: 91%
“…(e.g., Foster & Deardorff, 2017; McKiernan et al, 2016; Open Science Collaboration, 2015; Pineau et al, 2021) in order to better support reproducibility and replicability. These principles also extend to computational reproducibility, which focuses on aspects such as the documentation of code, the software environment, and version control that may also impact the replicability and reproducibility of findings (e.g., Crüwell et al, 2023; Stodden et al, 2018). In the context of personality research, encouraging researchers to adopt Open Science practices, and to prioritize the creation, testing, and reporting of reproducible solutions could therefore revolutionize the way personality is assessed and understood, irrespective of the discipline it is reported in.…”
Section: Discussion (mentioning)
confidence: 99%
“…Open Science practices are also increasingly being required as part of journals' submission policies; for example, see Crüwell et al (2023) and Brysbaert et al (2021). Thus, researchers would be well-advised not only to adopt these practices to improve reproducibility and replicability, but also to ensure that their work aligns with changes in journal/conference policies which will likely impact all disciplines and outlets eventually.…”
(mentioning)
confidence: 99%
“…Crüwell et al [15] evaluate the computational reproducibility of 14 articles published in Psychological Science. Among these articles, the paper by Hilgard et al [16] has been rated as having “package dependency issues”.…”
Section: Case Studies (mentioning)
confidence: 99%