Proceedings of the SIGCHI Conference on Human Factors in Computing Systems 2007
DOI: 10.1145/1240624.1240838

What happened to remote usability testing?

Cited by 128 publications (89 citation statements)
References 24 publications

Citation statements (ordered by relevance)

“…These results are consistent with those of Tullis et al. [39]. In a recent study [41], three methods for remote usability testing were compared with a traditional laboratory-based think-aloud method. The three remote methods were a remote synchronous condition, where testing was conducted in real time but the usability evaluator was separated spatially from the test participants, and two remote asynchronous conditions, where the usability evaluator and the test subjects were separated both spatially and temporally.…”
Section: Remote Usability Evaluation
Citation type: supporting
Confidence: 89%

“…In cooperative usability testing [26], the user is invited to review the task-solving process upon its completion and to reflect on incidents and potential usability problems. In asynchronous remote usability testing, the user may be required to self-report incidents or problems, as a substitute for having these identified on the basis of interaction data [27].…”
Section: UEMs for Users' Design Feedback
Citation type: mentioning
Confidence: 99%

“…This is seen, in particular, in the five studies of usability testing with self-reports, where the users' design feedback consisted mainly of reports of problems or incidents [27, 44, 46-48]. Here, the users were to run the usability test and report the usability problems independently of the test administrator, potentially saving evaluation costs.…”
Section: The Budget Approach
Citation type: mentioning
Confidence: 99%

“…This has driven businesses and academics to consider remote usability evaluation as an "increasingly important alternative to conventional usability evaluation" [3].…”
Section: Introduction
Citation type: mentioning
Confidence: 99%