2021
DOI: 10.1098/rsos.211037
Comparing dream to reality: an assessment of adherence of the first generation of preregistered studies

Abstract: Preregistration is a method to increase research transparency by documenting research decisions on a public, third-party repository prior to any influence by data. It is becoming increasingly popular in all subfields of psychology and beyond. Adherence to the preregistration plan may not always be feasible, and is not even necessarily desirable, but without disclosure of deviations, readers who do not carefully consult the preregistration plan might get the incorrect impression that the study was exactly conducted…

Cited by 69 publications (82 citation statements)
References 28 publications (33 reference statements)
“…For example, Bakker et al (2018) note that even determining the exact number of hypotheses in a preregistration was so difficult that interrater agreement on this variable was as low as 14%. Similarly, Claesen et al (2021) state that '[a]ssessing the adherence of the published studies to the preregistration plans proved to be a far from trivial task' because 'neither the preregistration plans nor the published studies were written in sufficient detail for a fair comparison' (p. 4), and van den Akker (2021) remarks that '(r)esearchers are very bad at clearly laying out hypotheses in preregistrations (and in papers)' (p. 31).…”
Section: Repercussions For Scientific Reform (mentioning)
confidence: 98%
“…All items in this protocol correspond with the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols Statement (PRISMA-P; Moher et al., 2016). Corresponding to the PRISMA guidelines (Shamseer et al., 2015; Moher et al., 2016), our review protocol was registered with the Open Science Framework on 9 March 2021 (Claesen et al., 2021). Registration number and link: 10.17605/OSF.IO/RQ82B.…”
Section: Methods (mentioning)
confidence: 99%
“…For example, Bakker et al (2018) noted that even determining the exact number of hypotheses in a given preregistration had been so difficult that inter-rater agreement on this variable was as low as 14%. Similarly, Claesen, Gomes, Tuerlinckx, and Vanpaemel (2021) stated that '[a]ssessing the adherence of the published studies to the preregistration plans proved to be a far from trivial task' because 'neither the preregistration plans nor the published studies were written in sufficient detail for a fair comparison' (p. 4), and van den Akker (2021) remarked that '(r)esearchers are very bad at clearly laying out hypotheses in preregistrations (and in papers)' (p. 31).…”
Section: Repercussions For Scientific Reform (mentioning)
confidence: 98%