The purpose of this study was to obtain evidence regarding the validity and reliability of an instrument to measure self-reported competencies for interprofessional care in interprofessional education programs. Five hundred and eighty-four students and clinicians in Canada and New Zealand, registered in 15 interprofessional education undergraduate, postgraduate, and continuing professional development programs, completed the Interprofessional Collaborative Competency Attainment Survey (ICCAS) using a retrospective pre-test/post-test design. Factor analyses identified two factors in the pre-program items and one factor in the post-program items. The tests conducted supported the validity and reliability of the ICCAS as a self-assessment instrument for interprofessional collaborative practice. Internal consistency was high for items loading on factor 1 (α = 0.96) and factor 2 (α = 0.94) in the pre-program assessment and for the items in the post-program assessment (α = 0.98). The transition from a two-factor solution to a single-factor structure suggests that interventions influence learners' understanding of interprofessional care by promoting recognition of the high degree of interrelation among interprofessional care competencies. Scores on the ICCAS are reliable and predict meaningful outcomes with regard to attitudes toward interprofessional competency attainment.
This study replicates a validation of the Interprofessional Collaboration Competency Attainment Survey (ICCAS), a 20-item self-report instrument designed to assess behaviours associated with patient-centred, team-based, collaborative care. We appraised the content validity of the ICCAS for a foundation course in interprofessional collaboration, investigated its internal (factor) structure and concurrent validity, and compared results with those obtained previously by the ICCAS authors. Self-assessed competency ratings were obtained from a broad spectrum of pre-licensure health professions students (n = 785) using a retrospective pre-/post-design. Moderate to large effect sizes emerged for 16 of the 20 items. The largest effects (1.01, 0.94) were for competencies emphasized in the course; the smallest effect (0.35) was for an area not directly taught. Positive correlations were seen between all individual item change scores and a separate item assessing overall change, and item-total correlations were moderate to strong. Exploratory factor analysis was used to understand the interrelationship of ICCAS items. Principal component analysis identified a single factor (Cronbach's alpha = 0.96) accounting for 85% of the total variance, slightly higher than the 73% reported previously. The findings suggest strong overlap among the proposed constructs being assessed; use of a total average score is justifiable for assessment and evaluation.
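As an illustrative aside, the reliability and factor-structure statistics reported in these abstracts (Cronbach's alpha and the proportion of variance captured by a single principal component) can be sketched in a few lines. The simulated item scores below are invented purely for demonstration, assuming a single latent factor; they do not reproduce the studies' data.

```python
# Illustrative sketch (not the authors' code): Cronbach's alpha and the
# variance explained by the first principal component for Likert-scale items.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def first_pc_variance_ratio(items: np.ndarray) -> float:
    """Proportion of total variance captured by the first principal component."""
    cov = np.cov(items, rowvar=False)
    eigvals = np.linalg.eigvalsh(cov)  # ascending order
    return eigvals[-1] / eigvals.sum()

# Simulate 100 respondents answering 20 items driven by one latent factor,
# on a 1-7 scale (purely synthetic demonstration data).
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 1))
scores = np.clip(np.rint(4 + 1.5 * latent + 0.5 * rng.normal(size=(100, 20))), 1, 7)

alpha = cronbach_alpha(scores)
ratio = first_pc_variance_ratio(scores)
print(f"alpha = {alpha:.2f}, first-PC variance = {ratio:.0%}")
```

With a single strong latent factor, alpha is high and the first component dominates the variance, mirroring the single-factor, high-alpha pattern the abstracts describe.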
Despite the success that instructors and learners often enjoy with online university courses, learners have also reported that they miss face-to-face contact when learning online. The purpose of this inquiry was to identify learners' perceptions of what is missing from online learning and to provide recommendations for how we can continue to innovate and improve the online learning experience. The inquiry was qualitative in nature and conducted from a constructivist perspective. Ten learners who had indicated that they missed and/or would have liked more face-to-face contact following their participation in an online course were interviewed to elicit responses that would provide insights into what they miss about face-to-face contact when learning online. Five themes emerged: robustness of online dialogue, spontaneity and improvisation, perceiving and being perceived by the other, getting to know others, and learning to be an online learner. Garrison and colleagues' (Garrison, Anderson, & Archer, 2000) community of inquiry framework was used to interpret the findings.
Background: Recognizing the growing demand from medical students and residents for more comprehensive global health training, and the paucity of explicit curricula on such issues, global health and curriculum experts from the six Ontario Family Medicine Residency Programs worked together to design a framework for global health curricula in family medicine training programs.
Methods: A working group comprising global health educators from Ontario's six medical schools conducted a scoping review of global health curricula, competencies, and pedagogical approaches. The working group then hosted a full-day meeting, inviting experts in education, clinical care, family medicine, and public health, and developed a consensus process and draft framework for designing global health curricula. Through a series of weekly teleconferences over the next six months, the framework was revised and used to guide the identification of enabling global health competencies (behaviours, skills, and attitudes) for Canadian Family Medicine training.
Results: The main outcome was an evidence-informed interactive framework http://globalhealth.ennovativesolution.com/ to provide a shared foundation to guide the design, delivery, and evaluation of global health education programs for Ontario's family medicine residency programs. The curriculum framework blended a definition and mission for global health training, core values and principles, global health competencies aligned with the Canadian Medical Education Directives for Specialists (CanMEDS) competencies, and key learning approaches. The framework guided the development of subsequent enabling competencies.
Conclusions: The shared curriculum framework can support the design, delivery, and evaluation of global health curricula in Canada and around the world, lay the foundation for research and development, provide consistency across programmes, and support the creation of learning and evaluation tools aligned with the framework. The process used to develop this framework can be applied to other aspects of residency curriculum development.
Background: As Family Medicine programs across Canada transition to a competency-based curriculum, medical students and clinical teachers are increasingly incorporating tablet computers into their work and educational activities. The purpose of this pilot study was to identify how preceptors and residents use tablet computers to implement and adopt a new family medicine curriculum and to evaluate how they access applications (apps) through their tablets, in an effort to support and enhance effective teaching and learning.
Methods: Residents and preceptors (n = 25) from the Family Medicine program working at the Pembroke Regional Hospital in Ontario, Canada, were given iPads and training on how to use the device in clinical teaching and learning activities and how to access the online curriculum. Data regarding the use and perceived contribution of the iPads were collected through surveys and focus groups. This mixed-methods research used analysis of survey responses to support the selection of questions for the focus groups.
Results: Reported results were categorized into: curriculum and assessment; ease of use; portability; apps and resources; and perceptions about the use of the iPad in teaching/learning settings. Most participants agreed on the importance of accessing curriculum resources through the iPad but recognized that these required enhancements to facilitate use. The iPad was considered more useful for activities involving output of information than for input. Participants' responses regarding the ease of use of mobile technology were heterogeneous owing to the diversity of computer proficiency across users. Residents had a slightly more favorable opinion than preceptors regarding the iPad's contribution to teaching/learning.
Conclusions: The iPad's interface should be further enhanced to allow easy access to the online curriculum and its built-in resources. The differences in computer proficiency among users should be reduced by sharing knowledge through workshops led by more skillful iPad users. To facilitate collection of information through the iPad, the design of electronic data-input forms should take into account participants' reported negative perceptions of typing data on mobile devices. Technology deployment projects should gather sufficient evidence from pilot studies to guide efforts to adapt resources and infrastructure to the relevant needs of Family Medicine teachers and learners.
In many universities there seems to be an "eLearning Contradiction" between the expressed need to integrate technology into the teaching-learning process and what is actually occurring in the majority of classrooms. In this paper we describe the collaborative process we used to design an online Conceptual Framework Learning Object (C-FLO). The object can be viewed at http://innovation.dc-uoit.ca/cloe/lo/cf/ This account is grounded in practical experience and supported by the research literature. First, we offer a rationale for the development of C-FLO. We then illustrate how an interdisciplinary collaborative perspective enhanced both the process and the learning outcomes. The impact of this learning object from both the learners' and the professors' perspectives is detailed. Collaborative projects such as C-FLO, in which professors share resources and expertise to improve student learning, could be a first step toward addressing the eLearning Contradiction.
This article addresses one of the most important unresolved issues of interprofessional education (IPE): assessment. Here we describe our process and experiences designing and operationalizing a toolkit of qualitative and quantitative IPE assessment instruments for online and face-to-face education programs, developed concurrently in English and French. The toolkit includes (a) the quantitative W(e)Learn program evaluation survey, which aligns with the W(e)Learn framework; (b) the quantitative Interprofessional Collaborative Competencies Attainment Survey (ICCAS), for self-assessing competency development in collaborative practice using a retrospective pre-/post design; and (c) qualitative team and learner contracts, with explanatory exemplars, that serve as both learning and assessment tools. These instruments are currently undergoing validation in hopes of (a) increasing the likelihood that IPE experiences are planned and delivered effectively and (b) strengthening the justification and accountability of IPE experiences and practical outcomes. Although this validation process will continue for some time, the development of the IPE assessment tools merits particular attention in order to guide further work in this field. French and English copies of the toolkit assessments can be downloaded from http://ennovativesolution.com/WeLearn/IPE-Instruments.html. Although these instruments were designed with interprofessional healthcare teams in mind, we feel they could readily be transferred to a variety of interdisciplinary tasks and settings, such as social work and human services education.
Keywords: Interprofessional education; Healthcare; Toolkit; Survey; Learner contract
Introduction: Interprofessional education (IPE) entails engaging professionals to learn with, from, and about each other in order to work more effectively in teams.
Although this article addresses interprofessional healthcare education, we feel the processes and products described can be applied to a variety of interdisciplinary tasks and settings, such as social work and human services education. Education and training can teach methods and approaches to increase clinical capacity for interprofessional care (IPC), optimize the use of staff expertise and skills, improve communication among healthcare professionals, and increase the efficiency of case management [1,2]. Researchers have argued that "by learning and working together in educational settings, healthcare professionals will be able to work more effectively with one another in occupational settings" [3]. Barr [4] proposed that IPE is fundamental to a more efficient and effective healthcare system and, ultimately, better patient care.