Introduction: We created a suturing skills assessment tool that comprehensively defines criteria around relevant subskills of suturing and confirmed its validity.

Methods: Five expert surgeons and an educational psychologist participated in a cognitive task analysis to deconstruct robotic suturing into an exhaustive list of technical skill domains and subskill descriptions. Using the Delphi methodology, each cognitive task analysis element was systematically reviewed by a multi-institutional panel of 16 surgical educators and implemented in the final product when the content validity index reached ≥0.80. In the subsequent validation phase, 3 blinded reviewers independently scored 8 training videos and 39 vesicourethral anastomoses using EASE (End-to-End Assessment of Suturing Expertise); 10 vesicourethral anastomoses were also scored using RACE (Robotic Anastomosis Competency Evaluation), a previously validated but simplified suturing assessment tool. Inter-rater reliability was measured with the intra-class correlation coefficient for normally distributed values and prevalence-adjusted bias-adjusted kappa for skewed distributions. EASE scores of experts (≥100 prior robotic cases) and trainees (<100 cases) on the non-training cases were compared using a generalized linear mixed model.
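As an illustrative sketch (not from the study itself), the prevalence-adjusted bias-adjusted kappa used for skewed score distributions reduces, for two raters, to PABAK = 2·Po − 1, where Po is the observed proportion of agreement. The function and the example ratings below are hypothetical, intended only to show how the statistic is computed:

```python
def pabak(ratings_a, ratings_b):
    """Prevalence-adjusted bias-adjusted kappa (PABAK) for two raters.

    For paired categorical ratings, PABAK = 2 * Po - 1, where Po is the
    observed proportion of agreement between the two raters.
    """
    if len(ratings_a) != len(ratings_b) or not ratings_a:
        raise ValueError("need two equal-length, non-empty rating lists")
    agreements = sum(a == b for a, b in zip(ratings_a, ratings_b))
    p_o = agreements / len(ratings_a)
    return 2 * p_o - 1

# Hypothetical example: 8 paired binary scores, raters agree on 6 of 8,
# so Po = 0.75 and PABAK = 2 * 0.75 - 1 = 0.5.
print(pabak([1, 1, 0, 1, 0, 1, 1, 0], [1, 0, 0, 1, 1, 1, 1, 0]))  # → 0.5
```

Unlike Cohen's kappa, PABAK does not depend on the marginal prevalence of each category, which is why it behaves better when the score distribution is heavily skewed.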