Objectives: As much as 50%–90% of research is estimated to be irreproducible, costing upwards of $28 billion in the USA alone. Reproducible research practices are essential to improving the reproducibility and transparency of biomedical research, including preregistering studies, publishing a protocol, making research data and metadata publicly available, and publishing in open access journals. Here we report an investigation of key reproducible or transparent research practices in the published oncology literature. Design: We performed a cross-sectional analysis of a random sample of 300 oncology publications published from 2014 to 2018. Key reproducibility and transparency characteristics were extracted in duplicate by blinded investigators using a pilot-tested Google Form. Primary outcome measures: The primary outcome of this investigation is the frequency of key reproducible or transparent research practices in the published biomedical and clinical oncology literature. Results: Of the 300 publications randomly sampled, 296 were analysed for reproducibility characteristics. Of these 296 publications, 194 contained empirical data that could be analysed for reproducible and transparent research practices. Raw data were available for 9 studies (4.6%). Five publications (2.6%) provided a protocol. Although our sample included 15 clinical trials and 7 systematic reviews/meta-analyses, only 7 publications included a preregistration statement. About a third of publications (65/194; 33.5%) provided an author conflict of interest statement. Conclusion: Key reproducibility and transparency characteristics were largely absent from a random sample of published oncology publications. We recommend required preregistration for all eligible trials and systematic reviews, published protocols for all manuscripts, and deposition of raw data and metadata in public repositories.
Background: Given the central role of radiology in patient care, it is important that radiological research is grounded in reproducible science. It is unclear whether there is a lack of reproducibility or transparency in radiologic research. Purpose: To analyze published radiology literature for the presence or absence of key indicators of reproducibility. Methods: This cross-sectional retrospective study was performed by searching the National Library of Medicine (NLM) catalog for publications contained within journals in the field of radiology. Our inclusion criteria were being MEDLINE indexed, written in English, and published from January 1, 2014, to December 31, 2018. We randomly sampled 300 publications for this study. A pilot-tested Google Form was used to record information from the publications regarding indicators of reproducibility. Following peer review, we extracted data from an additional 200 publications, selected from the list of initially randomized publications, in an attempt to reproduce our initial results. Results: Our initial search returned 295,543 records, from which 300 were randomly selected for analysis. Of these 300 records, 294 met inclusion criteria and 6 did not. Among the empirical publications, 5.6% (11/195; 95% CI 3.0–8.3) contained a data availability statement, 0.51% (1/195) provided clearly documented raw data, 12.0% (23/191; 95% CI 8.4–15.7) provided a materials availability statement, 0% provided analysis scripts, 4.1% (8/195; 95% CI 1.9–6.3) provided a preregistration statement, 2.1% (4/195; 95% CI 0.4–3.7) provided a protocol statement, and 3.6% (7/195; 95% CI 1.5–5.7) were preregistered.
The validation study of the 5 key indicators of reproducibility (availability of data, materials, protocols, and analysis scripts, plus preregistration) resulted in 2 indicators (availability of protocols and analysis scripts) being reproduced, as they fell within the 95% confidence intervals for the proportions from the original sample. However, the materials availability and preregistration proportions from the validation sample were lower than those found in the original sample. Conclusion: Our findings demonstrate that key indicators of reproducibility are missing in the field of radiology. Thus, the ability to reproduce studies contained in radiology publications may be problematic and may have potential clinical implications.
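The validation criterion above (an indicator counts as reproduced when the validation sample's proportion falls inside the original sample's 95% confidence interval) can be sketched numerically. This is a minimal illustration using the normal-approximation (Wald) interval; the study does not state which interval method it used, and the function names here are ours, not the authors'.

```python
import math

def wald_ci(successes, n, z=1.96):
    """Approximate 95% CI for a proportion (normal/Wald method)."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

def falls_within_original_ci(orig_k, orig_n, new_k, new_n):
    """Criterion sketched in the abstract: the validation sample's
    proportion lies inside the original sample's 95% CI."""
    _, lo, hi = wald_ci(orig_k, orig_n)
    return lo <= new_k / new_n <= hi

# Example with the original data-availability count (11/195):
p, lo, hi = wald_ci(11, 195)
```

With 11/195 the Wald interval is roughly 2.4%–8.9%; the interval reported in the abstract (3.0–8.3) differs slightly, consistent with a different interval method (e.g. Wilson's) having been used.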
Introduction: Reproducibility is critical to diagnostic accuracy and treatment implementation. Alongside clinical reproducibility, research reproducibility establishes whether replication efforts using identical study materials and methodologies permit researchers to arrive at similar results and conclusions. In this study, we evaluate nephrology literature for common indicators of transparent and reproducible research. Methods: We searched the National Library of Medicine catalog to identify 36 MEDLINE-indexed, English-language nephrology journals. We randomly sampled 300 publications published between January 1, 2014, and December 31, 2018. Results: Our search yielded 28,835 publications, of which we randomly sampled 300. Of the 300 publications, 152 (50.7%) were publicly available, whereas 143 (47.7%) were restricted behind a paywall and 5 (1.7%) were inaccessible. Of the remaining 295 publications, 123 were excluded because they lacked the empirical data necessary for reproducibility analysis. Of the 172 publications with empirical data, 43 (25%) reported data availability statements and 4 (2.3%) provided analysis scripts. Of the 71 publications analyzed for preregistration and protocol availability, 0 (0.0%) provided links to a protocol and 8 (11.3%) were preregistered. Conclusion: Our study found that reproducible and transparent research practices are infrequently used by the nephrology research community. Greater efforts should be made by both funders and journals to promote these practices. In doing so, an open science culture may eventually become the norm rather than the exception.
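Each of the studies above draws a simple random sample of 300 publications from a larger set of search records. A seeded draw like the sketch below makes that sampling step itself reproducible; the seed value and function name are our illustration, not anything the studies report.

```python
import random

def sample_publications(record_ids, k=300, seed=42):
    """Draw a reproducible simple random sample of k records.
    Fixing the seed lets other researchers regenerate the same sample."""
    rng = random.Random(seed)
    return sorted(rng.sample(list(record_ids), k))

# e.g. sampling 300 of the 28,835 nephrology search records:
sample = sample_publications(range(28835))
```

Publishing the seed and sampling script alongside the record list would let readers verify exactly which publications were analyzed.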