An e-assessment strategy consisting of multiple-choice questions (MCQ) within a Moodle learning management system (LMS) was implemented in a higher education institution to assist teachers with the assessment process. Because the new procedure introduced major changes to the existing assessment, affecting students, teachers, and staff, this paper analyses teachers' perceptions through a qualitative study, using a focus group of senior lecturers who had used MCQ tests in their courses. The study identifies advantages and disadvantages, elicits improvements, and explores whether MCQ develop and evaluate the same knowledge and skills as existing assessment solutions. In sum, the study's main result was the confirmation that e-assessment with MCQ remains reliable in any degree program course, although it is preferably combined with another kind of assessment, such as a problem-based group project, which is effective in developing competencies and skills that MCQ are not.
Abstract—ISCAP's Information Systems Department comprises about twenty teachers who have, for several years, been using an e-learning environment (Moodle) combined with traditional assessment. A new e-assessment strategy was recently implemented to evaluate a practical topic: the use of spreadsheets to solve management problems. This topic is common to several courses across different undergraduate degree programs. While e-assessment is already a demanding task for theoretical topics, it becomes even more challenging when the topics under evaluation are practical. To understand the implications of this new type of assessment from the students' viewpoint, questionnaires and interviews were undertaken. In this paper, the analysis of the questionnaires is presented and discussed.
Aim/Purpose: The aim of this study is to understand students' opinions and perceptions of e-assessment when the assessment process was changed from the traditional computer-assisted method to a multiple-choice Moodle-based method.

Background: In order to implement continuous assessment for a large number of students, several shifts are necessary, which implies as many different tests as the number of shifts required. Consequently, it is difficult to ensure homogeneity across the different tests, and a huge amount of grading time is needed. These problems with traditional assessment based on computer-assisted tests led to a re-design of the assessment, resulting in the use of multiple-choice Moodle tests.

Methodology: A longitudinal, concurrent, mixed-method study was implemented over a five-year period. A survey was developed and completed by 815 undergraduate students who experienced the electronic multiple-choice question (eMCQ) assessment in the courses of the IS department. Qualitative analyses included open-ended survey responses and interviews with repeating first-year students.

Contribution: This study provides a reflection tool on how to incorporate frequent moments of assessment in courses with a high number of students without overloading teachers with a huge workload. The research analysed the efficiency of assessing non-theoretical topics using eMCQ while ensuring the homogeneity of assessment tests, which need to be complemented with other assessment methods in order to ensure that students develop and acquire the expected skills and competencies.

Findings: The students involved in the study appreciate the online multiple-choice quiz assessment method and perceive it as fair, but their preference for the assessment method varied over the years. These changes in perception may be related to the improvement of the question bank and the categorisation of questions by difficulty level, which led to the elimination of the 'luck factor'. Another major finding is that although online multiple-choice quizzes are used successfully in the assessment of theoretical topics, the same is not evident for practical topics. Therefore, this assessment needs to be complemented with other methods in order to achieve the expected learning outcomes.

Recommendations for Practitioners: In order to evaluate the same expected learning outcomes in practical topics, particularly in technology and information systems subjects, the evaluator should complement the online multiple-choice quiz assessment with other approaches, such as a PBL method, homework assignments, and/or other tasks performed during the semester.

Recommendation for Researchers: This study explores e-assessment with online multiple-choice quizzes in higher education. It provides a survey that can be applied in other institutions that also use online multiple-choice quizzes to assess non-theoretical topics. To better understand students' opinions on the development of skills and competencies with online multiple-choice quizzes on the one hand and with classical computer-assisted assessment on the other, it would be necessary to add questions concerning these aspects. It would then be interesting to compare the findings of this study with results from other institutions.

Impact on Society: The increasing number of students in higher education has led to greater use of e-assessment activities, since these can provide a fast and efficient way to assess a high number of students. Therefore, this research provides meaningful insight into stakeholders' perceptions of online multiple-choice quizzes for practical topics.

Future Research: An interesting future study would be to obtain the opinions of a particular set of students on two tests, one using online multiple-choice quizzes and the other using a classical computer-assisted assessment method. A natural extension of the present study is a comparative analysis of the grades obtained by students who performed one or the other type of assessment (online multiple-choice quizzes vs. classical computer-assisted assessment).