Globally, the advent and rapid spread of COVID-19 have significantly disrupted health professions education and practice, and consequently interprofessional education, leading to a model of learning and practicing in which much is unknown. Key questions for this ongoing evolution emerge in the global context, prompting reflection on future directions for the interprofessional education field and its role in shaping future practice models. Health professions programs around the world have made a dramatic shift to virtual learning platforms in response to closures of academic institutions and restrictions on learners' access to practice settings. Telemedicine, slow to become established in many countries to date, has also revolutionized practice in the current environment. Within this state of disruption and rapid change lies a silver lining: an opportunity for future growth. Key topics explored in this commentary include reflection on the application of existing competency frameworks, consideration of the typology of team structures, reconsideration of theoretical underpinnings, revisiting of core dimensions of education, adaptation of interprofessional education activities, and the role of interprofessional education in future pandemic planning. As an international community of educators and researchers, the authors consider current observations relevant to interprofessional education and practice contexts and suggest a response from scholarly voices across the globe. The current pandemic offers a unique opportunity for educators, practitioners, and researchers to retain what has served interprofessional education and practice well in the past, break from what has not worked as well, and begin to imagine the new.
Objectives
An interprofessional group of faculty from health colleges created and piloted the Barriers to Error Disclosure Assessment (BEDA) tool, an instrument to measure barriers to medical error disclosure among health care providers.
Methods
A review of the literature guided the creation of items describing influences on the decision to disclose a medical error. Local and national experts in error disclosure used a modified Delphi process to reach consensus on the items included in the pilot. After receiving university Institutional Review Board (IRB) approval, researchers distributed the tool to a convenience sample of physicians (n = 19), pharmacists (n = 20), and nurses (n = 20) from an academic medical center. Means and standard deviations were used to describe the sample. Intra-class correlations (ICCs) were used to examine test-retest correspondence between the continuous items on the scale. Factor analysis with Varimax rotation was used to determine factor loadings and examine internal consistency reliability. Cronbach alpha coefficients were calculated during the initial and subsequent administrations to assess test-retest reliability.
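The two reliability statistics named above (Cronbach's alpha for internal consistency; an ICC for test-retest correspondence) can be sketched in Python. This is a minimal illustration on synthetic arrays, not the study's data or analysis code; the function names are our own, and a production analysis would typically use a vetted statistics package rather than hand-rolled formulas.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    n, k = items.shape
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

def icc_3_1(t1: np.ndarray, t2: np.ndarray) -> float:
    """ICC(3,1): two-way mixed-effects, single-measures consistency
    between two administrations (one common test-retest formulation)."""
    scores = np.column_stack([t1, t2])          # (n_subjects, 2 occasions)
    n, k = scores.shape
    grand = scores.mean()
    ss_subj = k * ((scores.mean(axis=1) - grand) ** 2).sum()
    ss_occ = n * ((scores.mean(axis=0) - grand) ** 2).sum()
    ss_total = ((scores - grand) ** 2).sum()
    ms_subj = ss_subj / (n - 1)
    ms_err = (ss_total - ss_subj - ss_occ) / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

# Synthetic sanity checks: perfectly consistent items and identical
# administrations should both yield a coefficient of 1.0.
demo = np.array([[1, 1, 1], [2, 2, 2], [3, 3, 3], [5, 5, 5]], dtype=float)
print(round(cronbach_alpha(demo), 3))   # 1.0
t1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
print(round(icc_3_1(t1, t1), 3))        # 1.0
```

Benchmarks such as the "fair to good" labels applied to the ICC range reported in the Results follow published interpretive guidelines rather than anything intrinsic to the formula.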
Results
After omitting two items with intra-class correlations < 0.40, ICCs ranged from 0.43 to 0.70, indicating fair to good test-retest correspondence between the continuous items on the final draft. Factor analysis revealed the following factors during the initial administration: confidence and knowledge barriers, institutional barriers, psychological barriers, and financial concern barriers to medical error disclosure. Alpha coefficients of 0.85–0.93 at time 1 and 0.82–0.95 at time 2 supported test-retest reliability.
Conclusions
The final version of the 31-item tool can be used to measure providers' perceptions of their ability to disclose, their impressions of institutional policies and climate, and the specific barriers that inhibit disclosure. Preliminary evidence supports the tool's validity and reliability for measuring disclosure variables.
Forty faculty members from eight schools participated in a year-long National Faculty Development Program (NFDP), conducted in 2012–2013, aimed at developing faculty knowledge and skills for interprofessional education (IPE). The NFDP included two live conferences. Between conferences, faculty teams implemented self-selected IPE projects at their home institutions and participated in coaching and peer-support conference calls. This paper describes program outcomes. A mixed methods approach was adopted. Data were gathered through online surveys and semi-structured interviews. The study explored whether faculty were satisfied with the program, believed the program was effective in developing knowledge and skills in designing, implementing, and evaluating IPE, and planned to continue newly implemented IPE and faculty development. Peer support and networking were two of the greatest perceived benefits. Further, this multi-institutional program appears to have facilitated early organizational change by bringing greater contextual understanding to assumptions made at the local level, which in turn could influence hidden curricula and networking. These findings may guide program planning for future faculty development to support IPE.