Evidence from various domains underlines the key role that human factors, and especially trust, play in practitioners' adoption of technology. In the case of Artificial Intelligence (AI) driven learning analytics tools, the issue is even more complex due to practitioners' AI-specific misconceptions, myths, and fears (e.g., mass unemployment and ethical concerns). In recent years, artificial intelligence has been introduced increasingly into K-12 education. However, little research has been conducted on the trust and attitudes of K-12 teachers regarding the use and adoption of AI-based Educational Technology (EdTech). The present study introduces a new instrument to measure teachers' trust in AI-based EdTech, provides evidence of its internal structure validity, and uses it to portray secondary-level school teachers' attitudes toward AI. First, we explain the process of creating the instrument items, based on our preliminary research and a review of existing tools in other domains. Second, we analyze the responses of 132 teachers using Exploratory Factor Analysis. The results reveal eight factors influencing teachers' trust in adopting AI-based EdTech: Perceived Benefits of AI-based EdTech, AI-based EdTech's Lack of Human Characteristics, AI-based EdTech's Perceived Lack of Transparency, Anxieties Related to Using AI-based EdTech, Self-efficacy in Using AI-based EdTech, Required Shift in Pedagogy to Adopt AI-based EdTech, Preferred Means to Increase Trust in AI-based EdTech, and AI-based EdTech vs Human Advice/Recommendation. Finally, we use the instrument to discuss the 132 high-school Biology teachers' responses to the survey items and the extent to which they align with findings from the literature in relevant domains. The contribution of this research is twofold. First, it introduces a reliable instrument to investigate the role of teachers' trust in AI-based EdTech and the factors influencing it.
Second, the findings from the teachers' survey can guide creators of teacher professional development courses and policymakers on improving teachers' trust in, and in turn their willingness to adopt, AI-based EdTech in K-12 education.
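The factor-extraction step described in this abstract can be illustrated with a short sketch: an Exploratory Factor Analysis with varimax rotation applied to a Likert-scale response matrix. The data here is entirely synthetic; only the sample size (132 teachers) and the number of extracted factors (8) mirror the study. Item counts, loadings, and thresholds are illustrative assumptions, not the study's actual analysis.

```python
# Sketch: exploratory factor analysis of Likert-scale survey responses,
# assuming a response matrix of shape (n_teachers, n_items).
# All data below is synthetic; n_teachers=132 and n_factors=8 mirror the
# abstract, while n_items=40 and the 0.4 loading cutoff are assumptions.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_teachers, n_items, n_factors = 132, 40, 8

# Synthetic responses: latent factors + noise, discretized to a 1-5 scale.
latent = rng.normal(size=(n_teachers, n_factors))
loadings = rng.normal(size=(n_factors, n_items))
raw = latent @ loadings + rng.normal(scale=0.5, size=(n_teachers, n_items))
responses = np.clip(np.round(2.5 + raw), 1, 5)

fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
fa.fit(responses)

# List items loading strongly (|loading| > 0.4) on each rotated factor.
for k, row in enumerate(fa.components_):
    items = np.flatnonzero(np.abs(row) > 0.4)
    print(f"Factor {k + 1}: items {items.tolist()}")
```

In practice, an analyst would inspect the rotated loading matrix to name each factor (as the study does with labels such as "Perceived Benefits of AI-based EdTech") and check indicators such as scree plots and internal consistency before settling on the number of factors.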
Evidence from various domains underlines the critical role that human factors, and especially trust, play in practitioners' adoption of technology. In the case of Artificial Intelligence (AI) powered tools, the issue is even more complex due to practitioners' AI-specific misconceptions, myths, and fears (e.g., mass unemployment and privacy violations). In recent years, AI has been incorporated increasingly into K-12 education. However, little research has been conducted on the trust and attitudes of K-12 teachers towards the use and adoption of AI-powered Educational Technology (AI-EdTech). This paper sheds light on teachers' trust in AI-EdTech and presents effective professional development strategies to increase teachers' trust in, and willingness to apply, AI-EdTech in their classrooms. Our experiments with K-12 science teachers centered on their interactions with a specific AI-powered assessment tool (termed AI-Grader), using both synthetic and real data. The results indicate that presenting teachers with explanations of (i) how AI makes decisions, particularly compared to human experts, and (ii) how AI can complement and strengthen teachers rather than replace them, can reduce teachers' concerns and improve their trust in AI-EdTech. The contribution of this research is threefold. First, it emphasizes the importance of increasing teachers' theoretical and practical knowledge about AI in educational settings to gain their trust in AI-EdTech in K-12 education. Second, it presents a teacher professional development program (PDP), as well as a discourse analysis of the teachers who completed it. Third, based on the results observed, it presents clear suggestions for future PDPs aiming to improve teachers' trust in AI-EdTech.

What is already known about this topic
Human factors, and especially trust, play a critical role in practitioners' adoption of technology.
In recent years, AI has been incorporated increasingly into K-12 education.
Little research has been conducted on the trust and attitudes of K-12 teachers towards the use and adoption of AI-powered Educational Technology.

What this paper adds
This research emphasizes the importance of increasing teachers' theoretical and practical knowledge about AI in educational settings to gain their trust in AI-EdTech in K-12 education.
It presents a teacher professional development program (PDP) to increase teachers' trust in AI-EdTech, as well as a discourse analysis of the teachers who completed it.
It presents clear suggestions for future PDPs aiming to improve teachers' trust in AI-EdTech.

Implications for practice and/or policy
Pre- and in-service teacher education programs that aim to increase teachers' trust in AI-EdTech should include a section providing teachers with a basic understanding of AI.
PDPs aiming to increase teachers' trust in AI-EdTech should focus on concrete pedagogical tasks and specific AI-powered tools that teachers consider helpful and worth the effort to learn.
AI-EdTech should not restr...
Evidence from various domains underlines the key role that human factors, and especially trust, play in professionals' adoption of AI-based technology. As AI-based educational technology increasingly enters K-12 education, issues of trust are expected to influence its acceptance by educators as well, but little is known about this matter. In this work, we present the opinions and attitudes of science teachers who interacted with several types of AI-based technology for K-12. Among other things, our findings indicate that teachers are reluctant to accept AI-based recommendations when they contradict their prior knowledge about their students, and that they expect AI to be absolutely correct even in situations where absolute truth may not exist (e.g., grading open-ended questions). The purpose of this paper is to provide initial findings and start mapping the terrain of this aspect of teacher-AI interaction, which is critical for the wide and effective deployment of AIED technologies in K-12 education.
AI-based educational technology designed to support teachers in providing personalized instruction can enhance their ability to address the needs of individual students, hopefully leading to better learning gains. This paper presents results from participatory research aimed at co-designing, with science teachers, a learning analytics tool that will assist them in implementing a personalized pedagogy in blended learning contexts. The development process included three stages. In the first, we interviewed a group of teachers to identify where and how personalized instruction may be integrated into their teaching practices. This yielded a clustering-based personalization strategy. Next, we designed a mock-up of an AI-based tool that supports this strategy and worked with another group of teachers to define an 'explainable learning analytics' scheme that explains each cluster in a way that is both pedagogically meaningful and can be generated automatically. Third, we developed an AI algorithm that supports this 'explainable clusters' pedagogy and conducted a controlled experiment that evaluated its contribution to teachers' ability to plan personalized learning sequences. The planned sequences were evaluated in a blinded fashion by an expert, and the results demonstrated that the experimental group (teachers who received the clusters with the explanations) designed sequences that addressed the difficulties exhibited by different groups of students better than those designed by teachers who received the clusters without explanations. The main contribution of this study is twofold. First, it presents an effective personalization approach that fits blended learning in the science classroom, which combines a real-time clustering algorithm with an explainable-AI scheme that can automatically build pedagogically meaningful explanations from item-level metadata (Q-matrix).
Second, it demonstrates how such an end-to-end learning analytics solution can be built with teachers through a co-design process and highlights the types of knowledge that teachers add to system-provided analytics in order to apply them to their local context. As a practical contribution, this process informed the design of a new learning analytics tool that was integrated into a free online learning platform that is being used by more than 1000 science teachers.
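The 'explainable clusters' idea described above can be sketched in miniature: cluster students on their item-level scores, then translate each cluster's item performance into skill-level statements via an item-skill Q-matrix. Everything in this sketch is an illustrative assumption under that general scheme (synthetic data, k-means as the clustering algorithm, a success-rate heuristic for skill attribution); the abstract does not disclose the study's actual algorithm.

```python
# Sketch: clustering students by item-level correctness, then describing
# each cluster in skill terms via a Q-matrix (items x skills).
# All data and names are synthetic; k-means and the skill-success heuristic
# are assumptions, not the study's published method.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_students, n_items, n_skills = 90, 12, 4

scores = rng.integers(0, 2, size=(n_students, n_items))   # 0/1 correctness
q_matrix = rng.integers(0, 2, size=(n_items, n_skills))   # item -> skills map
q_matrix[np.arange(n_skills), np.arange(n_skills)] = 1    # every skill covered

clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scores)

skill_names = [f"skill_{s}" for s in range(n_skills)]
for c in range(3):
    members = scores[clusters.labels_ == c]
    # Average success on each skill = mean success on the items requiring it.
    item_success = members.mean(axis=0)
    skill_success = (item_success @ q_matrix) / q_matrix.sum(axis=0)
    weakest = skill_names[int(np.argmin(skill_success))]
    print(f"Cluster {c}: weakest skill = {weakest}")
```

A per-cluster summary like "weakest skill" is one candidate for the kind of automatically generated, pedagogically meaningful explanation the abstract describes; the co-design process would determine which such statements teachers actually find actionable.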