Background: Teaching complex motor skills at a high level remains a challenge in medical education. Established methods often demand large amounts of teaching time and material, and the implementation of standardized videos might help save these resources. In this study, video-based versions of Peyton's '4-step Approach' and Halsted's 'See One, Do One' approach were compared. We hypothesized that the video-based '4-step Approach' would be more effective for learning procedural skills than the 'See One, Do One' approach. Methods: One hundred two naïve students were trained to perform a structured facial examination and a Bellocq's tamponade with either Halsted's (n = 57) or Peyton's (n = 45) method within a curricular course. Step 1 (Halsted) and steps 1-3 (Peyton) were replaced by standardized teaching videos. Performance was measured directly after the intervention (T1) and 8 weeks later (T2) by blinded examiners using structured checklists; an item analysis was also carried out. Results: At T1, performance scores differed significantly in favor of the video-based '4-step Approach' (p < 0.01) for both skills. No differences were found at T2 (p = 0.362). The item analysis revealed that Peyton's method was significantly more effective for the complex subparts of both skills. Conclusions: The modified video-based version of Peyton's '4-step Approach' is the preferred method for teaching particularly complex motor skills on a large curricular scale. Furthermore, an effective way to utilize Peyton's method in a group setting could be demonstrated. Further studies should investigate the long-term learning retention of this method in a formative setting.
Background: Ensuring that all medical students achieve adequate clinical skills remains a challenge, yet the correct performance of clinical skills is critical for all fields of medicine. This study analyzes the influence of feedback from teaching associates on achieving and maintaining expertise in complex head and skull examination. Methods: All third-year students at a German university who completed the obligatory surgical skills lab training and surgical clerkship participated in this study. The students were randomized into two groups. Control group: lessons by an instructor and peer-based practical skills training. Intervention group: training by teaching associates who served as the simulated patients being examined and provided direct feedback on student performance. Short- and long-term competence in head and skull examination was measured directly after the intervention and 4 months after the training. Statistical analyses were performed using SPSS Statistics version 19 (IBM, Armonk, USA), applying parametric and non-parametric tests. As measures of correlation, Pearson's correlation and Kendall's tau-b were calculated, along with Cohen's d as an effect size. Results: A total of 181 students were included (90 intervention, 91 control). Of these, 81 agreed to be videotaped (32 in the control group and 49 in the teaching-associate group) and examined at time point 1. At both time points, the intervention group performed the examination significantly better than the control group (time point 1, p < .001; time point 2, rater 1 p = .009, rater 2 p = .015). The effect size (Cohen's d) was up to 1.422. Conclusions: The use of teaching associates for teaching complex practical skills is effective for short- and long-term retention.
We anticipate that the method could easily be translated to nearly every patient-based clinical skill, particularly with regard to a competence-based education of future doctors.
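The statistical workflow reported above (a parametric group comparison, inter-rater correlation via Pearson's r and Kendall's tau-b, and Cohen's d as an effect size) can be illustrated with a minimal Python sketch. The score arrays below are hypothetical stand-ins, since the study's raw data are not part of the abstract, and the sketch uses SciPy rather than the SPSS package named in the Methods.

```python
import numpy as np
from scipy import stats

# Hypothetical checklist scores (not the study's data): two independent
# groups, plus two raters scoring the same videotaped performances.
intervention = np.array([88.0, 92, 85, 90, 95, 87, 91, 89])
control      = np.array([72.0, 78, 70, 75, 80, 74, 77, 73])
rater1 = np.array([88.0, 92, 85, 90, 95, 87, 91, 89])
rater2 = np.array([86.0, 93, 84, 91, 94, 88, 90, 88])

def cohens_d(a, b):
    """Cohen's d for two independent samples, pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

d = cohens_d(intervention, control)
t, p = stats.ttest_ind(intervention, control)  # parametric group comparison
r, _ = stats.pearsonr(rater1, rater2)          # inter-rater correlation (Pearson)
tau, _ = stats.kendalltau(rater1, rater2)      # Kendall's tau-b (default variant)

print(f"Cohen's d = {d:.2f}, t-test p = {p:.4g}, Pearson r = {r:.2f}, tau-b = {tau:.2f}")
```

With clearly separated groups as above, d lands well past the conventional "large effect" threshold of 0.8; the abstract's reported d of up to 1.422 sits in the same range.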
Introduction: According to German regulations on licensing to practice medicine, the aim of undergraduate medical training is to produce a scientifically and practically trained physician who is able to work independently. More precisely, medical training has to impart the required knowledge and skills in diagnostics, therapy, health promotion, prevention, and rehabilitation. From the young residents' point of view, this aim is not achieved, and they do not feel prepared to be a doctor. However, the literature on this subject relies mostly on survey data, and deep analysis of the specific details of the topic is lacking. The aim of this study was to analyze in depth how junior doctors in their first and second years felt their undergraduate training had prepared them for clinical practice as a doctor, as well as which teaching formats and factors influence their preparedness. Methods: This semi-qualitative study is based on recorded interviews conducted using a structured interview manual, which serves to limit the subject matter of the interview and to target the topics. The study participants were 35 residents of general and visceral surgery, trauma surgery, and urology in their first and second years of medical specialty training. The number of participants was determined by the concept of content saturation. Basic data regarding age and the location and length of study were collected using a questionnaire. The audio recordings were transcribed verbatim and analyzed with structured qualitative content analysis techniques. Results: Only 43% (n = 15) of the 35 participating residents stated that undergraduate medical training had sufficiently prepared them to be a doctor, and 22.9% (8/35) stated that they were not prepared for their work as a resident.
However, 34.3% of the residents stated that undergraduate medical training prepared them for some of the tasks they were expected to master in daily clinical practice, but not for others. Most of the participants described their first weeks as doctors as particularly stressful and exhausting. As major hurdles in their daily clinical work, participants described knowledge gaps regarding organizational and administrative pathways (71%), deficits in linking knowledge to clinical reasoning (71%), decision making (54%), and therapy planning (51%). Most participants stated that the practical placements during the semester, the clinical clerkships, and the final-year internship were the most effective preparation for clinical residency. To be better prepared for clinical practice, participants suggested providing a clearer structure and relating the course subjects more closely to one another. Nearly all participants proposed increasing patient encounters from the very beginning of medical training as a longitudinal approach. Discussion: Even though we were able to demonstrate an increase in residents' preparedness, 57% of the study participants still felt unprepared for their job to some extent. One might argue that starting a new profession will always result in a feeling of being unprepared to some extent. However, this unpreparedness can increase risks to patients' well-being due to medical errors, which reportedly represent the third leading cause of death in the US after malignant tumors and cardiovascular diseases. Structured on-the-job adjustment, structured qualification training, and guided professional training are becoming increasingly important to future doctors as selection criteria for career and employer choice. Thus, surgical disciplines that are struggling to recruit young residents have to improve their concepts.
Purpose: In daily clinical practice, sterile working conditions, as well as patient safety and self-protection, are essential. Thus, these skills should be taught appropriately during undergraduate training. Receiving constructive feedback can significantly improve future performance, and reviewing one's performance using video tools is a useful approach. This study investigates the impact of different modes of video feedback on the acquisition of practical surgical skills, including wound management and a bedside test. Methods: Third-year medical students completed a structured training of practical skills as part of their mandatory surgery rotation. All students received the same practical skills training in performing wound management and a bedside test. However, for feedback regarding their performance, students were assigned to one of four study groups: expert video feedback (feedback from an expert after reviewing the recorded performance), peer video feedback (feedback from a fellow student after reviewing the recorded performance), standard video (giving feedback on a standardized video of the skill), or oral feedback (feedback from an expert without a video record). Afterwards, students completed two 5-minute OSCE stations in which their acquired competencies were assessed. Effects on long-term retention were measured at two further measurement points. Results: A total of 199 students were included in the study (48 for expert video feedback, 49 for peer video feedback, 52 for standard video feedback, and the remaining students for oral feedback).
Background: Feedback is an essential element of learning. Despite this, students complain about receiving too little feedback in medical examinations, e.g., in an objective structured clinical examination (OSCE). This study aims to implement a written structured feedback tool for use in OSCEs and to analyse the attitudes of students and examiners towards this kind of feedback. Methods: The participants were OSCE examiners and third-year medical students. This prospective study used a multistage design. In the first step, an unstructured survey of the examiners formed the basis for developing a feedback tool, which was evaluated and then adopted in the subsequent steps. Results: In total, 351 students and 51 examiners participated in this study. A baseline form was created for each category of OSCE station and supplemented with station-specific items, each rated on a three-point scale. In addition to the preformulated answer options, each domain had space for individual comments. A total of 87.5% of the students and 91.6% of the examiners agreed or somewhat agreed that written feedback should continue to be used in upcoming OSCEs. Conclusion: The implementation of structured, written feedback in a curricular, summative examination is possible, and examiners and students would like this feedback to remain a permanent feature.