Preregistration, which involves documentation of hypotheses, methods, and plans for data analysis prior to data collection or analysis, has been lauded as one potential solution to the replication crisis in psychological science. Yet many researchers have been slow to adopt preregistration, and the next generation of researchers is offered little formalized instruction in creating comprehensive preregistrations. In this article, we describe a collaborative workshop-based preregistration course designed and taught by Jennifer L. Tackett. We provide a brief overview of preregistration, including available resources, common concerns with preregistration, and responses to these concerns. We then describe the goals, structure, and evolution of our preregistration course and provide examples of enrolled students' final research products. We conclude with reflections on the strengths of, and opportunities for growth in, the first iteration of this course, along with suggestions for others interested in implementing similar open science-focused courses in their training programs.
Public Significance Statement: Preregistration, or the public posting of plans for a study prior to its completion, is a tool that shows great promise for increasing scientific rigor. However, this practice has not yet been adopted by the majority of psychology researchers. In this article, the authors detail their approach to creating a workshop-based class on preregistration that makes preregistration accessible to multiple areas of psychology.
The preregistration, protocol, measure, analytic code, and supplemental results have been archived on the Open Science Framework at https://osf.io/gpjr9/.