2017
DOI: 10.4300/jgme-d-17-00086.1
Nuance and Noise: Lessons Learned From Longitudinal Aggregated Assessment Data

Abstract: Our model suggests that residents begin at different points and progress at different rates. Meta-raters such as program directors and Clinical Competency Committee members should bear in mind that progression may take time and learning trajectories will be nuanced. Individuals involved in ratings should be aware of sources of noise in the system, including the raters themselves.

Cited by 25 publications (45 citation statements)
References 32 publications (33 reference statements)
“…However, the systematic application of learning analytics to interpret these data is sporadic. Some groups have started using learning analytics to gather information and gain insights about learner‐ or system‐level performance (e.g., the McMaster Modular Assessment Program for emergency medicine, online modules that use learning analytics methods to teach x‐ray interpretation, and an internal medicine program's analytics dashboards). The vast majority of residency programs, however, are attempting to execute programmatic assessment (i.e., the integrated system of multiple, longitudinal observations from multiple observers, aggregated into summary performance scores for group adjudication of global judgment) without optimized data collection (e.g., valid testing/simulations, timely and accurate workplace‐based assessments), modern analytic techniques, or appropriate data representation.…”
Section: The (Brief) History of Learning Analytics in Medical Education
confidence: 99%
“…Similarly, when we make decisions about a trainee based on our data analyses, we must also bear in mind the consequences of our decisions affecting their later behaviors and performances, both good and bad. While early identification of trainees at risk may be valuable so programs can offer earlier personalized remediation plans, the very act of labeling a “trainee‐at‐risk” may have consequences toward their self‐perceptions and future performance . At the same time, the implications of not collecting and interpreting data may also be problematic.…”
Section: With Great Data Comes Great Responsibility: The Consequences
confidence: 99%
“…With this transition, scholarship is essential to inform the rollout and refinement of a completely different curriculum. The new emphasis on multiple, low‐stakes, workplace‐based assessments is supported by emerging literature [12][13][14] that asks more questions than it provides answers.…”
Section: The Future: New Challenges and Opportunities
confidence: 99%