2012
DOI: 10.1007/s10459-012-9372-1

Seeing the same thing differently

Abstract: Assessors' scores in performance assessments are known to be highly variable. Attempted improvements through training or rating format have achieved minimal gains. The mechanisms that contribute to variability in assessors' scoring remain unclear. This study investigated these mechanisms. We used a qualitative approach to study assessors' judgements whilst they observed common simulated videoed performances of junior doctors obtaining clinical histories. Assessors commented concurrently and retrospectively on …

Cited by 147 publications (85 citation statements)
References 49 publications
“…Thus we cannot stipulate the magnitude of the effect our intervention may have had on the participants. However, our findings are in line with findings in recent studies (Govaerts et al. 2011, 2013; Kogan et al. 2011; Yeates et al. 2013). …”
Section: Discussion (supporting)
confidence: 94%
“…Internal sources of information, which are unique to each individual rater, are also essential to the ability to form assessments. These internal sources are in a constant state of flux as the rater acquires additional experiences with trainees and with other evaluators (a finding that is consistent with the observations from Kogan et al. (2011) and Yeates et al. (2013)).…”
Section: Discussion (supporting)
confidence: 71%
“…Whether this challenge is addressed from a psychometric perspective, for example generalizability theory [1], or from social cognition frameworks [25], student grades should depend as little as possible on who examines them.…”
Section: Introduction (mentioning)
confidence: 99%
“…Several studies have shown that clinician examiners manifest different levels of severity, and that this has a significant impact on examinee grades and assessment decisions across a range of clinical skills assessments. These include work-based assessments, oral examinations and OSCEs [58]. Although extreme differences in examiner severity are probably relatively uncommon [9], modest differences in examiner severity may make important differences to student grades and pass-fail decisions.…”
Section: Introduction (mentioning)
confidence: 99%
“…Although a variety of approaches to assessment exist, the implementation of competency‐based education highlights the role of performance‐based assessment, particularly rater‐based assessment approaches that recognise the developmental trajectories of learners. In order to ensure the efficacy and defensibility of assessment, and to test the alignment between the intended uses and practices of assessment, recent work has shed light on the cognitive underpinnings and processes at play in rater‐based assessment. The majority of the broader literature on raters in health professions education (HPE) has focused on cognitive or contextual factors that influence rater judgements.…”
Section: Introduction (mentioning)
confidence: 99%