Purpose
Attainment of postgraduate year 1 (PGY1) residency positions has become increasingly competitive. Inclusion of clinical knowledge and problem-solving assessments in onsite interviews has increased in recent years. Characterization of these assessments is necessary for applicants to best prepare for interviews and for mentors to provide guidance.
Methods
An online survey was emailed to program directors of PGY1 pharmacy residency programs accredited by the American Society of Health-System Pharmacists (ASHP). Data were analyzed using descriptive statistics. Chi-square and Fisher’s exact tests were used to compare categorical data. The Mann-Whitney U test was used to analyze nonparametric continuous data.
Results
Of the 221 respondents, most identified their programs as based at community (48%) or academic (39%) medical centers. Ninety percent of programs reported inclusion of clinical knowledge and problem-solving assessments in the onsite interview process. The most common assessments included asking clinical questions (70%), development of a SOAP (subjective, objective, assessment, plan) note or care plan (42%), and formal presentations that applicants prepared prior to arrival (39%). Most programs (71%) reported incorporating multiple assessments, with 2 assessments being the most common number (43%). Clinical assessment performance accounted for 10% to 25% of the overall interview score in approximately half of programs.
Conclusion
During onsite PGY1 residency interviews, applicants must be prepared to participate in at least 1 clinical knowledge and problem-solving assessment, including answering clinical questions, developing a SOAP note or care plan, and/or delivering a presentation. Applicants should expect that these assessments will account for a substantial portion of the interview evaluation.
Purpose
Team-based learning (TBL) is widely used in pharmacy education. There is debate regarding the necessity of graded readiness assurance tests (RATs) as an incentive to complete pre-class preparation. The purpose of this study was to determine the effect of graded vs ungraded RATs on exam performance in an Ambulatory Care elective course for third-year student pharmacists.
Methods
For the course offered in Spring 2020 and 2021, the standard TBL framework was employed. RATs were graded in 2020 (graded RAT cohort) but did not contribute to the overall course grade in 2021 (ungraded RAT cohort). An anonymous online survey of students assessing class preparation and perceived team accountability was administered at course completion in the ungraded RAT cohort.
Results
There was no significant difference between the graded RAT (n=47) and ungraded RAT (n=36) cohorts in overall mean percentage score on individual RATs (76% vs 74%) or individual exams (82% vs 80%). Most students (69%-91%) in the ungraded RAT cohort reported completing pre-class preparation assignments. Ninety-four percent agreed or strongly agreed that they contributed to team members' learning, and 86% agreed or strongly agreed that they were proud of their ability to assist in the team's learning.
Conclusion
Ungraded RATs did not significantly impact student exam performance in an elective course. Removal of this assessment, which promotes the performance approach to learning, may have contributed to a shift in motivation toward the mastery approach in the context of pre-class preparation. This challenges a widely held belief that grades are necessary incentives for pre-class preparation within TBL.