2021
DOI: 10.1111/medu.14654
Evaluation in health professions education—Is measuring outcomes enough?

Abstract: Introduction: In an effort to increase the rigour of evaluation in health professions education (HPE), a range of evaluation approaches are used. These largely focus on outcome evaluation as opposed to programme evaluation. We aim to review and critique the use of outcome evaluation models, using the Kirkpatrick Model as an example given its wide acceptance and use, and advocate for the use of programme evaluation models that help us understand how and why outcomes are occurring. Methods: We systematically searc…

Cited by 56 publications (66 citation statements) | References 69 publications
“…We followed Stufflebeam’s Context-Input-Process-Product model [15], a program development technique that allows for curriculum evaluation at each stage of the process with a greater focus on “how and why the programme worked and what else happened” [16]. Recognizing that this would be a small, rapid, and dynamic initiative with limited opportunity for outcomes-based program evaluation and subsequent revision, this technique was expected to better inform design and gauge impact [17].…”
Section: Methods (mentioning)
confidence: 99%
“…5,6 Educators, in turn, must contemplate the role context plays in enabling or preventing them from judging competence and the reasons why different curricular activities may or may not be effective at particular moments or in particular settings.7,8 Many medical educators have embraced the real world as part of their teaching, particularly when striving to help students transition from the classroom into their clinical years.10 However, we still tend to overlook that the quality and consistency of real-world contexts are difficult to control and that such variation can impact on students' learning experience and progression.…”
Section: Connections (mentioning)
confidence: 99%
“…Whether addressing clinical reasoning, learner assessment, programme evaluation or learning environments, these articles each portray how, as the level of contextual dependence rises under different circumstances, so does the challenge it poses, and by extension the degree of expertise it necessitates, for its constructive, intentional and responsible use, measure and study.2,6–11 Such complexity becomes evident even when defining context. In a conceptual scoping review, Bates and Ellaway present one definition as a 'dynamic and ever-changing system that emerges from underlying patterns of patients, locations, practice, education and society, and from the unpredictable interactions between these patterns'.…”
Section: Connections (mentioning)
confidence: 99%
“…Moving from the individual to the programme, Allen et al.8 present a comprehensive review of the literature suggesting that while the proliferation of the Kirkpatrick Model as a framework for educational interventions has improved the rigour of outcome evaluations, we need to pay greater attention to the 'how and why' underpinning educational interventions. That is, they posit that we should move beyond outcome evaluation to programme evaluation and that the Kirkpatrick Model, focusing as it does exclusively on outcomes, is limited in shaping our understanding of a particular outcome's root cause and in explaining unintended consequences.”
Section: not stated (mentioning)
confidence: 99%