Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems 2017
DOI: 10.1145/3025453.3025820
Differences in Crowdsourced vs. Lab-based Mobile and Desktop Input Performance Data

Cited by 35 publications (31 citation statements)
References 24 publications
“…In all, despite some statistically significant differences between the effects of novel and repeat primes, the effects are very small. This may, in part, reflect a general tendency for less-controlled participants to perform tasks faster with less accuracy than lab participants [27], although our visibility task results indicate that some pre-existing response behaviours (e.g. pressing the right-hand button) may have also influenced the experiment.…”
Section: Table 14 Results Summary
Mentioning confidence: 74%
“…Along the same lines, the authors of [28] showed that rewarding workers when they quit their participation in a batch of HITs makes it possible to filter out low-quality workers early, thus retaining only highly accurate workers. Recently, Findlater et al. showed that results of online HCI experiments are similar to those achieved in the lab for desktop interactions, but less so in the case of mobile devices [20].…”
Section: Worker Differences and Participation Bias
Mentioning confidence: 94%
“…Studies show that mobile Web users perform different tasks than desktop Web users [9,27]. Findlater et al. show that user interactions differ between lab and crowdsourced studies, and depend on whether the study is done on mobile or desktop devices [15]. XDBrowser [33] provides a new cross-device Web browser that can automatically translate a page design from one device type to another.…”
Section: Mobile Versus Non-mobile Quantification
Mentioning confidence: 99%