2016
DOI: 10.1016/j.eururo.2015.11.028
Measuring to Improve: Peer and Crowd-sourced Assessments of Technical Skill with Robot-assisted Radical Prostatectomy

Cited by 108 publications (76 citation statements)
References: 10 publications
“…In addition, there may be unmeasured patient factors that we have been unable to assess. Finally, while crowd-sourced GEARS assessment is a valid method of assessing global robotic surgical technical skill in urology, 8,25 in our study, crowd-sourced reviewers could not differentiate between UIS and control videos. While platforms like C-SATS have a role in generating high-volume assessment and feedback, crowd-sourcing has yet to demonstrate the ability to discern between clinically relevant patient outcomes, unlike peer-review.…”
Section: Discussion
confidence: 88%
“…These assessments have been demonstrated to correlate with the assessments of content experts. 8 Crowd workers are paid a nominal fee for each video clip they rate. The raters enlisted by C-SATS are trained in use of the Global Evaluative Assessment of Robotic Skill (GEARS) assessment tool, which consists of Likert scale scoring across six domains of robotic skill.…”
Section: Crowd-sourcing
confidence: 99%
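The GEARS scoring mentioned in the excerpt above can be made concrete with a short illustrative sketch. The Python below is not C-SATS code: the six domain names and the 1-5 Likert range follow the published GEARS instrument, while the function and variable names are hypothetical. It simply shows one plausible way to aggregate several crowd raters' per-domain scores for a single video clip into domain means and a 6-30 total.

# Illustrative sketch (not C-SATS code): aggregating crowd-sourced GEARS ratings.
# GEARS uses a 1-5 Likert scale across six domains; each crowd worker rates a
# video clip on every domain, and a clip's score is summarized across raters.
from statistics import mean

GEARS_DOMAINS = [
    "depth_perception",
    "bimanual_dexterity",
    "efficiency",
    "force_sensitivity",
    "autonomy",
    "robotic_control",
]

def clip_gears_score(ratings: list[dict[str, int]]) -> dict[str, float]:
    """Return the mean score per domain and the mean total (6-30) for one clip.

    `ratings` is a list of per-rater dicts mapping each GEARS domain to a
    1-5 Likert value; raters missing a domain are skipped for that domain.
    """
    per_domain = {
        d: mean(r[d] for r in ratings if d in r) for d in GEARS_DOMAINS
    }
    totals = [sum(r[d] for d in GEARS_DOMAINS) for r in ratings
              if all(d in r for d in GEARS_DOMAINS)]
    return {**per_domain, "total": mean(totals)}

# Example: two hypothetical crowd raters scoring the same 60-second clip.
raters = [
    {d: 4 for d in GEARS_DOMAINS},
    {d: 3 for d in GEARS_DOMAINS},
]
print(clip_gears_score(raters))  # total = 21.0, each domain = 3.5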
“…A recent study showed that medically trained reviewers were able to identify surgeons with higher complication rates by watching videos of their technique in the context of laparoscopic bariatric surgery [14]. Surprisingly, healthcare consumers (via crowdsourcing among the general population) were also able to identify surgeons with higher complication rates by watching videos of their operative technique in the context of robotic radical prostatectomy [15]. Based on this data, patients may be justified in choosing surgeons who post videos online, since they appear to be able to discern good from bad surgeons by viewing examples of their best work.…”
Section: Discussion
confidence: 99%
“…The authors compared assessments by expert raters (robotic and open surgeons) and those from lay people from the Crowd-Sourced Assessment of Technical Skills (C-SATS) 1,2 group with regard to experienced surgeons' technical ability during a robotic-assisted radical cystectomy. The authors prepared short video segments (60 seconds) showing mobilization of the ureter, ureteral preparation for the anastomosis, and the ureteral-ileal anastomosis from nine (out of a potential 102) cases that resulted in clinically significant postoperative uretero-ileal strictures (UIS) (10 strictures in total).…”
confidence: 99%