2017
DOI: 10.21037/tp.2016.12.01
A pilot validation study of crowdsourcing systematic reviews: update of a searchable database of pediatric clinical trials of high-dose vitamin D

Abstract: This study demonstrates the accuracy of crowdsourcing for systematic review citation screening, with retention of all eligible articles and a significant reduction in the work required from the investigative team. Together, these two findings suggest that crowdsourcing could represent a significant advance in systematic review methodology. Future directions include further study to assess validity across medical fields and to determine the capacity of a non-medical crowd.

Cited by 22 publications (19 citation statements)
References 52 publications
“…The primary outcome for the feasibility component was the number of citations that achieved the target number of independent assessments. Consistent with our initial pilot study, feasibility success was a priori defined as achieving a minimum of 4 independent assessments per citation [23]. The primary outcome for the validation component was the ability of the crowd to identify and retain eligible studies at the abstract level (sensitivity).…”
Section: Methods
confidence: 99%
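The two primary outcomes quoted above — feasibility (a minimum of 4 independent assessments per citation) and sensitivity (the crowd's retention of eligible studies at the abstract level) — can be sketched as follows. This is a minimal illustration with hypothetical citation IDs and counts; the function names and data are assumptions, not the study's actual implementation.

```python
# Illustrative sketch of the two primary outcomes (hypothetical data).

MIN_ASSESSMENTS = 4  # a priori feasibility target from the pilot study

def is_feasible(assessment_counts):
    """True when every citation reached the target number of
    independent assessments."""
    return all(n >= MIN_ASSESSMENTS for n in assessment_counts.values())

def sensitivity(crowd_included, expert_eligible):
    """Fraction of expert-eligible citations the crowd retained."""
    return len(crowd_included & expert_eligible) / len(expert_eligible)

# Hypothetical screening records
counts = {"c1": 5, "c2": 4, "c3": 6}
crowd = {"c1", "c2", "c3", "c9"}   # over-inclusion at abstract level is acceptable
eligible = {"c1", "c2", "c3"}

print(is_feasible(counts), sensitivity(crowd, eligible))
```

Sensitivity is the critical metric here because a missed eligible study at abstract screening is unrecoverable, whereas over-included citations are simply filtered out at full-text review.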
“…In addition, we hand-searched: (1) references of included studies and relevant review articles; (2) conference abstracts for major gastroenterology meetings (Digestive Diseases Week, American College of Gastroenterology, and United European Gastroenterology Week) and major thoracic meetings (American Thoracic Society Meeting, European Respiratory Society, and American Academy of Chest Physicians Conference) from 2013 to 2017. The review of abstracts and articles identified for full-text review was conducted via crowd sourcing using CrowdScreen SR 34 . Crowdsourcing has previously been shown to increase the efficiency of the systematic review process, while maintaining high accuracy during the review process 34 , 35 .…”
Section: Methods
confidence: 99%
“…The review of abstracts and articles identified for full-text review was conducted via crowd sourcing using CrowdScreen SR 34 . Crowdsourcing has previously been shown to increase the efficiency of the systematic review process, while maintaining high accuracy during the review process 34 , 35 . Prior to screening, each member of the CrowdScreen SR Review team was asked to review a test set of 15 abstracts identified by the principal investigator (MEK).…”
Section: Methods
confidence: 99%
“…Studies were excluded if they (1) were case reports or case series (fewer than ten patients), (2) performed vitamin D measurement after death (i.e., study on sudden infant death), (3) included adults but did not report study findings separately for children, and (4) were focused on specific diseases or interventions (i.e., cardiac surgery, prematurity or very low birth weight, acute lower respiratory tract infection, hematopoietic stem cell transplant). Interventional studies were not considered, because a recent scoping review of pediatric clinical trials (with an online searchable database) did not identify any potentially relevant publications [ 45 , 46 ].…”
Section: Methods
confidence: 99%
“…Study eligibility was determined through two screening levels (Additional file 3 ). Title and abstract screening was performed by four authors (JDM, NN, KO, KI), followed by full-text review of potentially relevant citations by two independent authors (JDM, NN) using an online platform as previously described [ 45 ]. Disagreements between reviewers were resolved by consensus.…”
Section: Methods
confidence: 99%
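The two-level workflow described in the last statement — independent full-text review by two authors, with disagreements resolved by consensus — can be sketched as below. The data and helper name are hypothetical, used only to illustrate how disagreements are flagged for discussion.

```python
# Minimal sketch (hypothetical data) of flagging reviewer disagreements
# for consensus resolution during full-text review.

def needs_consensus(decisions):
    """True when independent reviewers did not reach the same decision."""
    return len(set(decisions)) > 1

# Hypothetical full-text decisions from two independent reviewers.
full_text_votes = {
    "c1": ("include", "include"),
    "c2": ("include", "exclude"),  # sent to consensus discussion
    "c3": ("exclude", "exclude"),
}

flagged = [cid for cid, votes in full_text_votes.items() if needs_consensus(votes)]
print(flagged)
```

Only citations where the independent decisions diverge are escalated; unanimous decisions stand as recorded.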