2021
DOI: 10.1097/acm.0000000000004090

Validity Evidence for Assessing Entrustable Professional Activities During Undergraduate Medical Education

Abstract: Purpose: To explore validity evidence for the use of entrustable professional activities (EPAs) as an assessment framework in medical education. Method: Formative assessments on the 13 Core EPAs for entering residency were collected for 4 cohorts of students over a 9- to 12-month longitudinal integrated clerkship as part of the Education in Pediatrics Across the Continuum pilot at the University of Minnesota Medical School. The students requested assessme…

Cited by 17 publications (27 citation statements)
References 27 publications
“…Further, supervision requirements significantly decreased with increased level and duration of training [5]. In addition, a follow-up study of the undergraduate medical education EPA pilot program resulted in substantive evidence that EPAs provide valid information on progression of trainees towards level of competence [16]. Others have also found supporting evidence to validate EPAs as a method of assessment [17, 18].…”
Section: Discussion (mentioning, confidence: 98%)

“…A recent paper described that validity evidence for an assessment system can be supported with demonstration that growth in performance over time follows a theoretically predictable pattern (such as learning curves) (9). Figure 3 shows such patterns for all domains of competence that were assessed in learners at OSU-CVM using the CBVE.…”
Section: Discussion (mentioning, confidence: 99%)

“…What is presented in this study are the preliminary building blocks for a programmatic approach (8) where assessment data are collected longitudinally and periodically reviewed by an oversight clinical educators' committee that renders high-stakes progress decisions regarding learners (5). Here, we report on the use of two assessment methods that incorporate scales which have been reported to result in more reliable and valid scores over time (9). The intent of using these data is to provide students and stakeholders with feedback regarding competency progression through clinical rotations.…”
Section: Introduction (mentioning, confidence: 99%)

“…One plausible explanation for this is that the EPA framework gives evaluators a shared mental model of standardized student skills and thus facilitates assessments of students even if faculty only work with a student during a single clinic day. The results from this same medical school’s EPA assessment program also demonstrated external validity [41], demonstrating the potential for EPAs to improve both feedback quantity and reliability.…”
Section: How EPA Frameworks Offer Solutions for Medical Schools (mentioning, confidence: 99%)