Context: Physicians depend on the medical literature to keep current with clinical information. Little is known about residents' ability to understand statistical methods or how to appropriately interpret research outcomes.
Objective: To evaluate residents' understanding of biostatistics and interpretation of research results.
Design, Setting, and Participants: Multiprogram cross-sectional survey of internal medicine residents.
Main Outcome Measure: Percentage of questions correct on a biostatistics/study design multiple-choice knowledge test.
Results: The survey was completed by 277 of 367 residents (75.5%) in 11 residency programs. The overall mean percentage correct on statistical knowledge and interpretation of results was 41.4% (95% confidence interval [CI], 39.7%-43.3%) for residents vs 71.5% (95% CI, 57.5%-85.5%) for fellows and general medicine faculty with research training (P<.001). Higher scores in residents were associated with additional advanced degrees (50.0% [95% CI, 44.5%-55.5%] vs 40.1% [95% CI, 38.3%-42.0%]; P<.001); prior biostatistics training (45.2% [95% CI, 42.7%-47.8%] vs 37.9% [95% CI, 35.4%-40.3%]; P=.001); enrollment in a university-based training program (43.0% [95% CI, 41.0%-45.1%] vs 36.3% [95% CI, 32.6%-40.0%]; P=.002); and male sex (44.0% [95% CI, 41.4%-46.7%] vs 38.8% [95% CI, 36.4%-41.1%]; P=.004). On individual knowledge questions, 81.6% correctly interpreted a relative risk. Residents were less likely to know how to interpret an adjusted odds ratio from a multivariate regression analysis (37.4%) or the results of a Kaplan-Meier analysis (10.5%). Seventy-five percent indicated they did not understand all of the statistics they encountered in journal articles, but 95% felt it was important to understand these concepts to be an intelligent reader of the literature.
Conclusions: Most residents in this study lacked the knowledge in biostatistics needed to interpret many of the results in published clinical research. Residency programs should include more effective biostatistics training in their curricula to prepare residents for this important lifelong learning skill.
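The abstract reports that most residents could interpret a relative risk but far fewer could interpret an odds ratio. The distinction is worth making concrete: a relative risk compares probabilities, while an odds ratio compares odds, and the two diverge as the outcome becomes common. A minimal sketch, using an invented 2×2 table (the counts are illustrative, not from the study):

```python
# Hypothetical 2x2 table: rows = exposed/unexposed, columns = outcome/no outcome.
a, b = 30, 70   # exposed:   30 with the outcome, 70 without
c, d = 10, 90   # unexposed: 10 with the outcome, 90 without

risk_exposed = a / (a + b)      # probability of outcome if exposed   -> 0.30
risk_unexposed = c / (c + d)    # probability of outcome if unexposed -> 0.10
relative_risk = risk_exposed / risk_unexposed   # 3.0: outcome 3x as probable

odds_exposed = a / b            # odds of outcome if exposed   -> 30/70
odds_unexposed = c / d          # odds of outcome if unexposed -> 10/90
odds_ratio = odds_exposed / odds_unexposed      # ~3.86: ratio of odds, not risks

print(relative_risk, odds_ratio)
```

Note that the odds ratio (≈3.86) overstates the relative risk (3.0) here because the outcome is not rare; reading an adjusted odds ratio as if it were a relative risk is exactly the kind of misinterpretation the survey probed.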
Background The Accreditation Council for Graduate Medical Education (ACGME) Outcome Project requires that residency program directors objectively document that their residents achieve competence in 6 general dimensions of practice.
Although transfers of care are increasingly frequent, few internal medicine residency programs have comprehensive transfer of care systems in place, and most do not provide formal training in sign-out skills to all residents.
Residents face several EBM barriers, some of which are unique to their status as trainees. While increased informatics training and reliable, rapid, point-of-care access to electronic information resources remain necessary, they are not sufficient to help residents practice EBM. Educators must also attend to residents' attitudes toward learning and to the influence of programmatic and institutional cultures.
We developed an oral sign-out curriculum that was brief, structured, and well received by participants. Further study is necessary to determine the long-term impact of the curriculum.
OBJECTIVE:To develop and implement an evidence-based medicine (EBM) curriculum and determine its effectiveness in improving residents' EBM behaviors and skills.
DESIGN:Description of the curriculum and a multifaceted evaluation, including a pretest-posttest controlled trial.
SETTING:University-based primary care internal medicine residency program.
PARTICIPANTS: Second- and third-year internal medicine residents (N = 34).
INTERVENTIONS:A 7-week EBM curriculum in which residents work through the steps of evidence-based decisions for their own patients. Based on adult learning theory, the educational strategy included a resident-directed tutorial format, use of real clinical encounters, and specific EBM facilitating techniques for faculty.
MEASUREMENTS AND MAIN RESULTS: Behaviors and self-assessed competencies in EBM were measured with questionnaires. Evidence-based medicine skills were assessed with a 17-point test, which required free-text responses to questions based on a clinical vignette and a test article. After the intervention, residents participating in the curriculum (case subjects) increased their use of original studies to answer clinical questions, their examination of the methods and results sections of articles, and their self-assessed EBM competence in three of five domains of EBM, while the control subjects did not. The case subjects significantly improved their scores on the EBM skills test (8.5 to 11.0, p = .001), while the control subjects did not (8.5 to 7.1, p = .09). The difference in the posttest scores of the two groups was 3.9 points (p = .001; 95% confidence interval, 1.9 to 5.9).
CONCLUSIONS: An EBM curriculum based on adult learning theory improves residents' EBM skills and certain EBM behaviors. The description and multifaceted evaluation can guide medical educators involved in EBM training.
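The reported between-group difference (3.9 points, 95% CI 1.9 to 5.9) and its p-value are internally consistent, which a reader can verify with back-of-envelope arithmetic. A minimal sketch, assuming a normal approximation (our assumption, not stated in the abstract): the standard error can be recovered from the CI width, and a z statistic and two-sided p-value follow.

```python
from math import erf, sqrt

# Figures from the abstract: difference 3.9 points, 95% CI 1.9 to 5.9.
diff, lower, upper = 3.9, 1.9, 5.9

# Under a normal approximation, a 95% CI is diff +/- 1.96 * SE,
# so the SE can be recovered from the interval width.
se = (upper - lower) / (2 * 1.96)   # ~1.02 points

# Two-sided p-value for the corresponding z statistic,
# using the standard normal CDF Phi(z) = 0.5 * (1 + erf(z / sqrt(2))).
z = diff / se                       # ~3.82
p = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))

print(round(se, 2), round(z, 2), p)
```

The resulting p-value falls well below .01, in line with the reported p = .001; the same check applied to any abstract quickly flags CIs and p-values that cannot both be right.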
These reports provide useful guides for medical educators, but many suffered from incomplete descriptions and inadequate evaluations of their curricula. The curricula themselves often focused on critical appraisal to the exclusion of other EBM skills and had limited effectiveness. In addition to increased methodologic rigor, future studies should focus on more meaningful outcome evaluations. Curricula should use residents' actual clinical experiences and teach EBM skills in real time in existing clinical and educational venues.