Background: Given the central role of radiology in patient care, it is important that radiological research is grounded in reproducible science. It is unclear whether radiologic research suffers from a lack of reproducibility or transparency. Purpose: To analyze the published radiology literature for the presence or absence of key indicators of reproducibility. Methods: This cross-sectional, retrospective study was performed by searching the National Library of Medicine (NLM) for publications in radiology journals. Our inclusion criteria were publications that were MEDLINE indexed, written in English, and published from January 1, 2014, to December 31, 2018. We randomly sampled 300 publications for this study. A pilot-tested Google form was used to record information from the publications regarding indicators of reproducibility. Following peer review, we extracted data from an additional 200 publications, drawn from the same initially randomized list, in an attempt to reproduce our initial results. Results: Our initial search returned 295,543 records, from which 300 were randomly selected for analysis. Of these 300 records, 294 met inclusion criteria and 6 did not. Among the empirical publications, 5.6% (11/195; 95% CI: 3.0-8.3) contained a data availability statement, 0.51% (1/195) provided clearly documented raw data, 12.0% (23/191; 95% CI: 8.4-15.7) provided a materials availability statement, 0% provided analysis scripts, 4.1% (8/195; 95% CI: 1.9-6.3) provided a preregistration statement, 2.1% (4/195; 95% CI: 0.4-3.7) provided a protocol statement, and 3.6% (7/195; 95% CI: 1.5-5.7) were preregistered.
The validation study of the 5 key indicators of reproducibility (availability of data, materials, protocols, analysis scripts, and preregistration) resulted in 2 indicators (availability of protocols and analysis scripts) being reproduced, as they fell within the 95% confidence intervals for the proportions from the original sample. However, the materials availability and preregistration proportions from the validation sample were lower than those found in the original sample. Conclusion: Our findings demonstrate that key indicators of reproducibility are missing in the field of radiology. Thus, the ability to reproduce studies contained in radiology publications may be problematic and may have potential clinical implications.
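The validation step above hinges on checking whether each proportion from the validation sample falls inside the 95% confidence interval computed from the original sample. The abstract does not state which interval method was used, so as an illustration the sketch below uses the standard normal-approximation (Wald) interval; the resulting bounds may differ slightly from the intervals reported above.

```python
import math

def wald_ci(successes: int, n: int, z: float = 1.96):
    """Normal-approximation (Wald) 95% CI for a proportion.

    Illustrative only: the paper's exact interval method is not stated.
    """
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)

def reproduced(validation_p: float, orig_successes: int, orig_n: int) -> bool:
    """An indicator is 'reproduced' if the validation proportion
    falls within the CI from the original sample."""
    lo, hi = wald_ci(orig_successes, orig_n)
    return lo <= validation_p <= hi

# Example: data availability statements, 11/195 in the original sample.
lo, hi = wald_ci(11, 195)
print(f"{11/195:.1%} (95% CI: {lo:.1%}-{hi:.1%})")
```

Note that Wald intervals behave poorly for proportions near 0% (such as the 0.51% raw-data figure); Wilson or Clopper-Pearson intervals are more reliable in that range.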