2012
DOI: 10.1002/meet.14504901100

Crowdsourcing for usability testing

Abstract: While usability evaluation is critical to designing usable websites, traditional usability testing can be both expensive and time consuming. The advent of crowdsourcing platforms such as Amazon Mechanical Turk and CrowdFlower offers an intriguing new avenue for performing remote usability testing with potentially many users, quick turn-around, and significant cost savings. To investigate the potential of such crowdsourced usability testing, we conducted a usability study which evaluated a graduate school's webs…

Cited by 67 publications (40 citation statements)
References 15 publications
“…We carefully designed our study to remove privacy biases by making tasks uniform for all MTurkers and assessing only usability. Moreover, while the quality of results from tests on MTurk is not as good as that of lab-based testing, Liu et al [35] found that MTurk was a viable platform for usability testing. A general concern with using platforms such as MTurk is gaming by the workers.…”
Section: Limitations
confidence: 99%
“…By contrast, the intent of our approach is to solve a newly encountered problem in crowdsourced testing. Liu et al [36] investigated both methodological differences and empirical contrasts between crowdsourced usability testing and traditional face-to-face usability testing. To solve the oracle problem, Pastore et al [37] applied the crowdsourcing technique to generate test inputs depending on a test oracle that required human input in one form or another.…”
Section: Crowdsourced Software Testing
confidence: 99%
“…They use Amazon's Mechanical Turk (MTurk) as a tool that allows them to easily manage crowdsourced studies, perform prerequisite qualification tests for filtering participants, ensure privacy, manage payments, and collect results. The authors in [16] used crowdsourcing and the MTurk platform to evaluate the usability of a school website. The advantages are claimed to be greater participant involvement, low cost, high speed, and diverse user backgrounds, while the disadvantages include lower-quality feedback, less interaction, more spammers, and less focused user groups.…”
Section: Related Work
confidence: 99%
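The qualification-based filtering, payment, and result-collection workflow described in the citation above maps onto MTurk's requester API. The following is a minimal sketch, not taken from the cited papers, of how a crowdsourced usability task might be posted and its submissions retrieved with boto3; the task URL, reward, participant count, and qualification threshold are illustrative placeholders.

```python
# Sketch only: posting a usability-test HIT on MTurk with a worker-quality filter.
# Values (URL, reward, counts) are hypothetical, not from the cited studies.
import boto3

# Sandbox endpoint avoids paying real workers while piloting the study design.
mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# Pre-filter participants: require an assignment-approval rate of at least 95%.
qualification_requirements = [
    {
        "QualificationTypeId": "000000000000000000L0",  # built-in approval-rate qualification
        "Comparator": "GreaterThanOrEqualTo",
        "IntegerValues": [95],
        "ActionsGuarded": "Accept",
    }
]

# ExternalQuestion pointing at a (hypothetical) hosted usability-test page.
external_question = """
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.org/usability-test</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>
"""

hit = mturk.create_hit(
    Title="Website usability test",
    Description="Complete a short task on a website and answer a few questions.",
    Keywords="usability, website, survey",
    Reward="0.50",                     # payment per participant, in USD
    MaxAssignments=30,                 # number of participants
    LifetimeInSeconds=3 * 24 * 3600,   # how long the HIT stays available
    AssignmentDurationInSeconds=1800,  # time allowed per participant
    QualificationRequirements=qualification_requirements,
    Question=external_question,
)
print("HIT created:", hit["HIT"]["HITId"])

# Later, collect submitted assignments for review, approval, and payment.
assignments = mturk.list_assignments_for_hit(
    HITId=hit["HIT"]["HITId"],
    AssignmentStatuses=["Submitted"],
)
for a in assignments["Assignments"]:
    print(a["WorkerId"], a["AssignmentStatus"])
```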