Aims Having already demonstrated that the Paediatric Carers of Children Feedback (PaedCCF) tool is reliable and valid in a national pilot study with consultant paediatricians,1 the aim of this second pilot was to determine whether the tool is also useful with SSASG doctors not on the specialist register.

Methods Self-selecting participants were each sent 35 forms to be distributed locally to carers of children for completion following outpatient consultations. RCPCH-analysed feedback returned to doctors included self-assessment scores, carer scores, and overall cohort scores for each question. Qualitative comments were also included. Paediatricians' views on feasibility were sought by online survey before and after they received feedback.

Results 58 consultants returned 1512 forms (mean 26 per doctor). All doctors scored highly (mean ratings >4.1 out of 5). The aggregate whole-instrument rating was 4.42 (SD = 0.63). Self-assessment scores (mean 3.71, SD 0.45) were lower than carer scores (mean 4.42, SD 0.63), p<0.001. White doctors scored more highly than non-white doctors (p<0.001), but the difference in means was small. Carers who had seen the doctor more than 10 times rated doctors higher than those who had seen the doctor 1-4 times (p<0.001). Scores across individual questions were highly correlated (reliability of 0.97 using Cronbach's alpha), which justifies the use of an aggregate score. Fewer than 25 consultation ratings were needed for good reliability (D-study). 93.4% of consultants found the tool acceptable as evidence for revalidation. Use of pre-paid envelopes assisted some doctors, but concerns continue to be expressed about the need for administrative support for a paper-based feedback tool.

Conclusion This second pilot extends the range of use of PaedCCF as evidence to support revalidation. Data from this pilot have also allowed the number of forms required for reliability to be reduced to 20.
The Royal College of Paediatrics and Child Health (RCPCH) developed a new end-of-training assessment, held for the first time in 2012, known as START: the Specialty Trainee Assessment of Readiness for Tenure as a consultant. It is a novel, formative, multi-scenario, OSCE-style, out-of-workplace assessment using unseen scenarios with generic, external assessors, undertaken in trainees' penultimate training year. This paper describes the introduction and structure of this formative assessment. While many other colleges have summative exit exams, this assessment was designed from its inception to be formative, providing feedback on consultant-readiness skills rather than acting as a high-stakes hurdle towards the end of training. It was developed from the College's examinations question-setting group and, following two pilots in 2009 and 2010, the assessment evolved and the first live diet was held in November 2012.