Proceedings of the 2007 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists (SAICSIT 2007)
DOI: 10.1145/1292491.1292507
A comparative study of two usability evaluation methods using a web-based e-learning application

Abstract: Usability evaluation of e-learning applications is a maturing area that addresses interfaces, usability, and interaction from the perspective of human-computer interaction (HCI), and pedagogy and learning from the perspective of education. The selection of usability evaluation methods (UEMs) to determine usability problems is influenced by time, cost, efficiency, effectiveness, and ease of application. Heuristic evaluation (HE) involves evaluation by experts with expertise in the domain area and/or HCI. This comparative evaluation study investigat…

Cited by 90 publications (72 citation statements). References 30 publications (37 reference statements).
“…The Co-discovery Learning method was found to be slightly more effective than the others. − Ssemugabi and De Villiers [2007] reported a case study whose aim was to investigate the extent to which Heuristic Evaluation identifies usability problems in a Web-based learning application by comparing the results with those of Survey Evaluations among end-users. The Heuristic Evaluation performed by four expert evaluators proved to be an appropriate and effective usability evaluation method for e-learning applications.…”
Section: Empirical Studies Involving Usability Inspection Methods
confidence: 99%
“…− HE covers a broader range of usability aspects than other inspection methods such as Cognitive Walkthroughs, whose usability definition is more focused on ease of navigation. − HE has provided useful results when used to conduct Web usability evaluations [Sutcliffe 2002; Allen et al 2006; Ssemugabi and De Villiers 2007]. − HE has often been used for comparison with other inspection methods [Costabile and Matera 2001; Chattratichart and Brodie 2004; Conte et al 2009].…”
Section: Methods Evaluated
confidence: 99%
“…Existing questionnaire methods relating to the usability of E-learning applications cover several areas. For example, the researchers in [22] divided their questioning into three categories: general interface usability criteria, website-specific criteria for educational websites, and learner-centred instructional design that is grounded in learning theory and aims to provide effective learning. Additionally, the author in [6] used a questionnaire approach to assess ten areas: general system performance, software installation, manuals and online help, online tutorials, multimedia quality, information presentation, navigation, terminology and error messages, learnability, and overall system evaluation.…”
Section: The Use of Questionnaires in Usability Assessment
confidence: 99%
“…Academic groups used forms of collaborative crowdsourced learning to study the effectiveness of students' assessment and peer instruction [7][10][13][14][16]. de Alfaro [7] created a crowdsourced grading tool, CrowdGrader, that allows students to grade and review their peers' homework submissions.…”
Section: Introduction
confidence: 99%
“…Ghauth concluded that with this recommendation system and rating system, 12.16% of users showed a significant improvement in understanding topics. Ssemugabi [16] compared two usability evaluation methods for a web-based learning application, using 4 expert evaluators and 61 learners. Ssemugabi discovered that the expert evaluators identified 91% more problems with the web-based e-learning application than the learners.…”
Section: Introduction
confidence: 99%