2016
DOI: 10.4300/jgme-d-15-00550.1

Learnings From the Pilot Implementation of Mobile Medical Milestones Application

Abstract: Background: Implementation of the educational milestones benefits from mobile technology that facilitates ready assessments in the clinical environment. We developed a point-of-care resident evaluation tool, the Mobile Medical Milestones Application (M3App), and piloted it in 8 North Carolina family medicine residency programs.

Cited by 10 publications (13 citation statements). References 6 publications.

Citation statements (ordered by relevance):
“…However, given the evolution of clinical care from paper-based to electronic platforms, it makes intuitive sense that the recording, completion and submission of direct observations may be facilitated by using handheld devices or other electronic platforms. The few studies done in this realm have documented the feasibility of and user satisfaction with an electronic approach, but more research is necessary to understand how to optimize electronic platforms both to promote the development of shared goals, support observation quality and collect and synthesize observations [159–162].…”
Section: Results (citation type: mentioning)
Confidence: 99%
“…One common barrier has been the lack of time and competing demands such as clinical workload that interfere with the ability of faculty to complete these assessments [12, 13]. In order to facilitate more efficient capture, delivery, and aggregation of assessment data, mobile applications have been developed and tested in multiple specialties (e.g., pediatrics, surgical specialties, internal medicine) and with multiple frameworks (milestones, competencies, and entrustment scales) [14–21]. A second important barrier has been challenges with the assessment frameworks; the competencies and milestones used on workplace-based assessments are viewed by some as too numerous, too granular, and/or too abstract for educators to use [22].…”
Section: Introduction (citation type: mentioning)
Confidence: 99%
“…To date, most studies of assessment apps have examined apps that use frameworks other than EPAs and have focused on outcomes such as end-user satisfaction via surveys (e.g., attitudes), the quality of the feedback (e.g., specificity), and feasibility (e.g., time to complete) [15, 18–20, 30–36]. A few of these studies have identified barriers (e.g., competing demands on faculty time and lack of a physician champion) and enablers (perceived value) to implementation [15, 17, 19]. No study to date has used implementation science frameworks to focus on the implementation process itself.…”
Section: Introduction (citation type: mentioning)
Confidence: 99%
“…The research team consisted of two family medicine faculty and a research associate, all of whom have experience in the Delphi technique, and have previously been involved in, and published on, research related to giving and receiving feedback in graduate medical education [10–12]. As a team, the researchers reviewed…”
Section: Delphi Survey Process (citation type: mentioning)
Confidence: 99%
“…Using case study methodology similar to Bodenheimer et al, 9 we gleaned learnings from over 50 graduate medical education programs across the United States, most of which are family medicine residencies, to implement and sustain direct observation feedback tools. [10–12] Through sustained work with 27 residency teaching practices in a regional primary care residency collaborative active in three states, Donahue, Newton, Page and colleagues collected data showing wide variation in clinical quality and utilization. 13,14 Through this combined work, Page et al identified variation in program needs and general readiness to adopt feedback practices, and also discerned a possible association between a culture of continuous quality improvement and programs with enhanced elements of feedback practices.…”
Citation type: mentioning
Confidence: 99%