2018
DOI: 10.1016/j.jclinepi.2018.07.015

Crowdsourcing critical appraisal of research evidence (CrowdCARE) was found to be a valid approach to assessing clinical research quality

Cited by 17 publications (12 citation statements)
References 25 publications
“…Harnessing the use of crowdsourcing, a free online platform, CrowdCARE (http://crowdcare.unimelb.edu.au), has also been recently developed to provide clinicians with access to pre‐appraised research evidence . Moving forward, the availability of improved clinical guidance, to inform clinical care, would be predicted to provide a mechanism for supporting more consistent, and evidence‐based care provision.…”
Section: Discussion
Citation type: mentioning
confidence: 99%
“…Harnessing the use of crowdsourcing, a free online platform, CrowdCARE (crowdcare.unimelb.edu.au), 46 has also been recently developed to provide clinicians with access to pre-appraised research evidence. 47 Moving forward, the availability of improved clinical guidance, 48 to inform clinical care, would be predicted to provide a mechanism for supporting more consistent, and evidence-based care provision. Remaining challenges include ensuring that such guidelines are updated to be consistent with a rapidly evolving evidence base, and the dissemination and implementation of the resource(s) to clinicians.…”
Section: Discussion
Citation type: mentioning
confidence: 99%
“…This platform is unique in that it goes beyond the use of crowdsourced judgments by article type (e.g. RCTs), as in CochraneCrowd to an in-depth assessment of the methodological rigour of the articles [72].…”
Section: Health Data Systems
Citation type: mentioning
confidence: 99%
“…Such an approach is desirable across the sciences for validation, accuracy, and in reducing bias. The critical research appraisal tool CrowdCARE, for instance, has shown that novices can be trained to appraise the rigor of published systematic reviews and, on average, achieve a high degree of accuracy relative to the experts [72].…”
Section: Challenges and Limitations
Citation type: mentioning
confidence: 99%
“…Scientists would nevertheless benefit from a basic understanding of the methodology, strengths and weaknesses of systematic review and meta-analysis. Meta-analyses are often performed in collaborations, and a recent feasibility study using crowd-sourcing for clinical study quality assessment suggests that this could be a way forward, since experts and novices obtained the same results (Pianta et al 2018). Combined with recently developed and highly promising machine learning algorithms (Bannach-Brown et al 2019), collaborative efforts could increase the pace and reduce human error in systematic reviews and meta-analysis.…”
Section: Working Together To Improve Nonclinical Data Reliability
Citation type: mentioning
confidence: 99%