Surveys are ubiquitous in medical education. They can be valuable for assessment across a wide range of applications and are frequently used in medical education research. This Educator's Blueprint paper reviews the best practices in survey design with a focus on survey development. Key components of the survey design process include determining whether a survey is the right tool, using an intentional approach to content development, and following best practices in item writing and formatting. These processes are meant to help educators and researchers design better surveys for making better decisions.
Individualized interactive instruction provides an opportunity for significant innovation and advances in curriculum design. We describe the development and implementation of virtual small group exercises into the curriculum of an emergency medicine residency training program using a free social media and communication platform (Slack). Two virtual small group exercises, one case-based and one open-ended, were trialed during the 2016 to 2017 academic year. We found that the exercises were feasible to implement in a learner group in which 66% (41/62) had little or no prior experience with Slack. There was a trend toward a more favorable rating of the quality of the dialogue and of the task-technology fit for the case-based format as opposed to the open-ended educational activity.
Surveys are descriptive assessment tools. Like other assessment tools, the validity and reliability of the data obtained from surveys depend, in large part, on the rigor of the development process. Without validity evidence, data from surveys may lack meaning, leading to uncertainty as to how well the survey truly measures the intended constructs. In documenting the evidence for the validity of survey results and their intended use, it is incumbent on the survey creator to have a firm understanding of validity frameworks. Having an understanding of validity evidence and how each step in the survey development process can support the validity argument makes it easier for the researcher to develop, implement, and publish a high‐quality survey.
In this study we explore the biographical disruption resulting from a diagnosis of an abnormal Pap smear and the consequent process of biographical reconstruction. This is a qualitative study of thirteen women between the ages of 19 and 54 years who were diagnosed with an abnormal Pap smear and underwent colposcopy treatment. Data collection was through individual in-depth interviews, which were transcribed and analyzed by a team of researchers for important themes. An opportunistic sampling strategy was used. The inherent ambiguity in the diagnosis, its treatment strategies, the prognosis of their condition, and patients' fear of cancer all made the process of biographical reconstruction more problematic. By putting their faith in medicine and mobilizing their personal resources, women attempted to reestablish a positive personal and social self. Further exploration of the long-term psychosocial impact of this diagnosis is warranted so that women's emotional as well as medical needs are adequately addressed.
Background: Delivering quality lectures is a critical skill for residents seeking careers in academia, yet no validated tools for assessing resident lecture skills exist. Objectives: The authors sought to develop and validate a lecture assessment tool. Methods: Using a nominal group technique, the authors derived a behaviorally anchored assessment tool. Baseline characteristics of resident lecturers, including prior lecturing experience and perceived comfort with lecturing, were collected. Faculty and senior residents used the tool to assess lecturer performance at weekly conference. A postintervention survey assessed the usability of the form and the quantity and quality of the feedback. Analysis of variance was used to identify relationships between performance within individual domains and baseline data. Generalizability coefficients and scatterplots with jitter were used to assess inter-rater reliability. Results: Of 64 residents assessed, most (68.8%) had previous lecturing experience and 6.3% had experience as a regional/national speaker. There was a significant difference in performance within the domains of Content Expertise (p < 0.001), Presentation Design/Structure (p = 0.014), and Lecture Presence (p = 0.001) for first-year versus fourth-year residents. Residents who had higher perceived comfort with lecturing performed better in the domains of Content Expertise (p = 0.035), Presentation Design/Structure (p = 0.037), and Lecture Presence (p < 0.001). We found fair agreement between raters in all domains except Goals and Objectives. Both lecturers and evaluators perceived the feedback delivered as specific and of adequate quantity and quality. Evaluators described the form as highly usable. Conclusions: The derived behaviorally anchored assessment tool is a sufficiently valid instrument for the assessment of resident-delivered lectures.