Introduction: The Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) statement and the Prediction model Risk Of Bias ASsessment Tool (PROBAST) were both published to improve the reporting and critical appraisal of prediction model studies for diagnosis and prognosis. This paper describes the processes and methods that will be used to develop an extension to the TRIPOD statement (TRIPOD-Artificial Intelligence, TRIPOD-AI) and the PROBAST tool (PROBAST-AI) for prediction model studies that apply machine learning techniques. Methods and analysis: TRIPOD-AI and PROBAST-AI will be developed following published guidance from the EQUATOR Network and will comprise five stages. Stage 1 will involve two systematic reviews (across all medical fields and specifically in oncology) examining the quality of reporting in published machine-learning-based prediction model studies. In stage 2, we will consult a diverse group of key stakeholders using a Delphi process to identify items to be considered for inclusion in TRIPOD-AI and PROBAST-AI. Stage 3 will consist of virtual consensus meetings to consolidate and prioritise the key items to be included in TRIPOD-AI and PROBAST-AI. Stage 4 will involve developing the TRIPOD-AI checklist and the PROBAST-AI tool, and writing the accompanying explanation and elaboration papers. In the final stage, stage 5, we will disseminate TRIPOD-AI and PROBAST-AI via journals, conferences, blogs, websites (including the TRIPOD, PROBAST and EQUATOR Network websites) and social media. TRIPOD-AI will provide researchers working on machine-learning-based prediction model studies with a reporting guideline that helps them report the key details readers need to evaluate study quality and interpret findings, potentially reducing research waste.
We anticipate that PROBAST-AI will help researchers, clinicians, systematic reviewers and policymakers critically appraise the design, conduct and analysis of machine-learning-based prediction model studies, providing a robust, standardised tool for bias evaluation. Ethics and dissemination: Ethical approval was granted by the Central University Research Ethics Committee, University of Oxford, on 10 December 2020 (R73034/RE001). Findings from this study will be disseminated through peer-reviewed publications. PROSPERO registration numbers: CRD42019140361 and CRD42019161764.
CONTEXT AND OBJECTIVE: The success of vaccination campaigns depends on the degree of adherence to immunization initiatives and schedules. Risk factors associated with children's failure to receive the measles vaccine at the correct age were studied in the city of São Paulo, Brazil. DESIGN AND SETTING: Case-control and exploratory study in the metropolitan area of São Paulo. METHODS: The caregivers of 122 children were interviewed regarding their perceptions and understanding of the measles vaccination and the disease. RESULTS: Age, region of residence, marital status and education level were unrelated to receiving the measles vaccine at the appropriate time. Most individuals remembered being informed about the last annual vaccination campaign by television, but no communication channel was significantly associated with vaccination status. Answers to questions about knowledge of the disease or the vaccine, when analyzed alone, were not associated with receiving the measles vaccination at the time indicated by health agencies. When parents felt sorry for their children who were going to receive shots, they delayed the vaccination. Most of the children did not receive the measles vaccination on the exact recommended date, but delayed or anticipated the shots. CONCLUSION: There is no compliance with the government's recommended measles vaccination schedule (i.e. first dose at nine and second dose at 15 months of age, as recommended in 1999 and 2000). Feeling sorry for children receiving shots can delay vaccination.
Background Structured, systematic methods to formulate consensus recommendations, such as the Delphi process or nominal group technique, among others, provide the opportunity to harness the knowledge of experts to support clinical decision making in areas of uncertainty. They are widely used in biomedical research, in particular where disease characteristics or resource limitations mean that high-quality evidence generation is difficult. However, poor reporting of methods used to reach a consensus – for example, not clearly explaining the definition of consensus, or not stating how consensus group panellists were selected – can potentially undermine confidence in this type of research and hinder reproducibility. Our objective is therefore to systematically develop a reporting guideline to help the biomedical research and clinical practice community describe the methods or techniques used to reach consensus in a complete, transparent, and consistent manner. Methods The ACCORD (ACcurate COnsensus Reporting Document) project will take place in five stages and follow the EQUATOR Network guidance for the development of reporting guidelines. In Stage 1, a multidisciplinary Steering Committee has been established to lead and coordinate the guideline development process. In Stage 2, a systematic literature review will identify evidence on the quality of the reporting of consensus methodology, to obtain potential items for a reporting checklist. In Stage 3, Delphi methodology will be used to reach consensus regarding the checklist items, first among the Steering Committee, and then among a broader Delphi panel comprising participants with a range of expertise, including patient representatives. In Stage 4, the reporting guideline will be finalised in a consensus meeting, along with the production of an Explanation and Elaboration (E&E) document. 
In Stage 5, we plan to publish the reporting guideline and E&E document in open-access journals, supported by presentations at appropriate events. Dissemination of the reporting guideline, including a website linked to social media channels, is crucial for the document to be implemented in practice. Discussion The ACCORD reporting guideline will provide a set of minimum items that should be reported about methods used to achieve consensus, including approaches ranging from simple unstructured opinion gatherings to highly structured processes.
Background: The EQUATOR Network improves the quality and transparency of health research, primarily by promoting awareness and use of reporting guidelines. In 2018, the UK EQUATOR Centre launched GoodReports.org, a website that helps authors find and use reporting guidelines. This paper describes the tool’s development so far. To inform future development, we evaluated user experience and behaviour when the website was used as part of manuscript submission to a journal. Methods: We conducted a survey to collect data on users’ experience of the GoodReports website during manuscript submission. We assessed the tool’s reliability by checking our agreement with the tool’s checklist recommendation on a random sample of manuscripts submitted to a partner journal. We compared the proportion of authors submitting a reporting checklist alongside their manuscripts between groups exposed or not exposed to the GoodReports tool. We compared the text of manuscripts before an author received a reporting guideline recommendation with the text subsequently submitted to the partner journal. Results: Seventy percent (423/599) of survey respondents rated GoodReports 8 or more out of 10 for usefulness, and 74% (198/267) said they had made changes to their manuscript after using the website. We agreed with the GoodReports reporting guideline recommendation in 84% (72/86) of cases. Of authors who completed the guideline finder questionnaire, 14% (10/69) failed to submit a completed checklist, compared with 30% (41/136) of authors who did not use the tool. Of the 69 authors who received a GoodReports reporting guideline recommendation, 20 manuscript pairs were included in a before-and-after study. Five included more information in their methods section after exposure to GoodReports. On average, authors reported 57% of necessary reporting items before completing a checklist on GoodReports.org and 60% after.
Conclusion: The data provide encouraging signs that GoodReports could increase the use of reporting guidelines. They also underline the need for reporting guidance to be introduced early in the writing process. We are developing GoodReports by adding more reporting guidelines to the database, and by developing the functionality to integrate reporting items into Word article templates. We will test whether GoodReports users write more complete study reports in a randomised trial.
Background: A considerable number of randomized controlled trials (RCTs) have been published on statins and/or fibrates for diabetic retinopathy, a clinical condition associated with a high social and economic burden. Adherence to the CONSORT statement items is imperative to ensure transparency and reproducibility in clinical research. The aim of this study is to assess the reporting quality and adherence to CONSORT of RCTs assessing statins and/or fibrates for diabetic retinopathy. Methods: We conducted a critical appraisal study at the Discipline of Evidence-based Medicine, Escola Paulista de Medicina, Universidade Federal de São Paulo (Unifesp). A sensitive literature search was performed to identify all relevant RCTs, with no time or language limits. Two authors independently evaluated the reporting quality of the selected RCTs using the CONSORT statement as a standard. Results: Thirteen reports of RCTs were included in this study. The adherence of the reports to CONSORT items ranged from 24% to 68%. The median score was 11 (interquartile range (IQR) 8 to 13). When analyzed separately, the methods sections of the reports had a median of three items (IQR 2 to 4) judged adherent to the methods items of CONSORT (items 3 to 12). The most underreported items were those related to trial design, title and abstract, allocation concealment, implementation of the randomization sequence, and blinding. Other important items, such as the one related to the description of the inclusion criteria, also had low adherence. Conclusions: The overall adherence to the CONSORT checklist items was poor, especially for items related to the methods section. RCT reports on statins and/or fibrates for diabetic retinopathy must be optimized to avoid reporting biases and to improve transparency and reproducibility.
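To illustrate how summary statistics like these are derived, the sketch below computes per-trial adherence percentages and a median with interquartile range. The individual trial scores are invented (the abstract reports only aggregates), and the number of applicable CONSORT items and the simple index-based quartile positions are assumptions; other quantile conventions would give slightly different IQRs.

```python
# Hypothetical per-trial CONSORT scores: number of checklist items judged
# adequately reported, out of an assumed 25 applicable items per trial.
# These 13 values are illustrative only, chosen to mirror the aggregates
# reported in the abstract; they are not the study's actual data.
from statistics import median

def adherence_pct(items_met, total_items=25):
    """Percentage of CONSORT items adequately reported in one trial."""
    return round(100 * items_met / total_items)

def iqr(values):
    """Median plus lower/upper quartiles via simple index positions."""
    s = sorted(values)
    n = len(s)
    return median(s), s[n // 4], s[(3 * n) // 4]

scores = [6, 7, 8, 8, 9, 10, 11, 12, 13, 13, 14, 16, 17]

pcts = [adherence_pct(s) for s in scores]
print(f"Adherence range: {min(pcts)}% to {max(pcts)}%")

med, q1, q3 = iqr(scores)
print(f"Median score: {med} (IQR {q1} to {q3})")
```

With these invented scores the output reproduces the abstract's aggregates (range 24% to 68%; median 11, IQR 8 to 13), showing that the reported range and quartiles are consistent with a plausible underlying distribution.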
Objectives: To systematically assess the kinds of placebos used in investigator-initiated randomized controlled trials (RCTs), where they are obtained, and the hurdles that exist in obtaining them. Study Design and Setting: PubMed was searched for recently published noncommercial, placebo-controlled randomized drug trials. Corresponding authors were invited to participate in an online survey. Results: Of 423 eligible articles, 109 (26%) corresponding authors (partially) participated. Twenty-one of 102 (21%) authors reported that the placebos used were not matching (correctly labeled in only one publication). The main sources of placebos were hospital pharmacies (32 of 107; 30%) and the manufacturer of the study drug (28 of 107; 26%). RCTs with a hypothesis in the interest of the manufacturer of the study drug were more likely to have obtained placebos from the drug manufacturer (18 of 49; 37% vs. 5 of 29; 17%). Median costs for placebos and packaging were US$58,286 (IQR US$2,428 to US$160,770; n = 24), accounting for a median of 10.3% of the overall trial budget. Conclusion: Although using matching placebos is widely accepted as a basic practice in RCTs, there seems to be no standard source from which to acquire them. Obtaining placebos requires substantial resources, and using nonmatching placebos is common.