Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing 2015
DOI: 10.1145/2675133.2675246
LabintheWild: Conducting Large-Scale Online Experiments with Uncompensated Samples

Abstract: Web-based experimentation with uncompensated and unsupervised samples has the potential to support the replication, verification, extension and generation of new results with larger and more diverse sample populations than previously seen. We introduce the experimental online platform LabintheWild, which provides participants with personalized feedback in exchange for participation in behavioral studies. In comparison to conventional in-lab studies, LabintheWild enables the recruitment of participants at large…

Cited by 97 publications (36 citation statements)
References 37 publications
“…Instead of offering monetary compensation, these platforms motivate participation by promising participants that they will receive their results immediately at the end of the study and will be able to compare their own performance to others. Several validation studies demonstrated that data collected on those platforms are not statistically different from the data collected in conventional laboratory settings [18,19,29]. Despite the fact that participants self-select to take part in studies on these platforms, the samples appear more diverse (and thus more representative of the general population) than the samples that participate in studies in conventional academic laboratories or populations recruited via MTurk [29].…”
Section: Conducting Behavioral Research With Unpaid Online Volunteers (mentioning)
confidence: 99%
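The quoted statement above refers to validation studies comparing data from unpaid online volunteers with data collected in the lab. Those studies' actual analyses are not reproduced on this page; the snippet below is only a minimal, hypothetical sketch in Python, with made-up reaction-time numbers, of the kind of two-sample check that sits behind a "not statistically different" claim.

# Minimal sketch, not the cited studies' actual analysis: compare a
# performance measure from a (hypothetical) online volunteer sample
# against a (hypothetical) in-lab sample using Welch's t-test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated reaction times in milliseconds, for illustration only.
online_sample = rng.normal(loc=520, scale=60, size=400)  # large online sample
inlab_sample = rng.normal(loc=515, scale=55, size=40)    # small in-lab sample

t_stat, p_value = stats.ttest_ind(online_sample, inlab_sample, equal_var=False)
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.3f}")
# A non-significant p-value alone does not establish equivalence; validation
# studies typically also report effect sizes or dedicated equivalence tests.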
“…The success of previous initiatives aimed at recruiting people to experiments using data as a reward suggested that this would be possible (Reinecke et al., 2015); however, the current study shows that this is also possible even without a carefully designed "fun"-sounding experiment. This suggests that QS-style recruitment might be possible for a wide range of experiments, not just those that will provide users with particularly novel data about themselves.…”
Section: Participation Levels (mentioning)
confidence: 71%
“…The Test My Brain site specifically states that "the study should be fun for our participants" if any researchers wish to submit their own experiments. An investigation into participant motivation shows that participants take part because they find the studies fun and enjoy comparing their results to others (Reinecke & Gajos, 2015). It appears that enjoyment is key for participation in online experiments such as these.…”
Section: Quantified Self (mentioning)
confidence: 99%
“…The number of available workers with VR devices continues to grow, and will allow the scale of such experiments to expand as well. Of course, expanding the experiments beyond AMT and even beyond crowdsourcing platforms using other online experiment recruiting methods could also be feasible and scalable [40].…”
Section: Discussion (mentioning)
confidence: 99%
“…Online experiments are not limited, of course, to crowdsourced environments, and have long been an acceptable tool in behavioral research [24,44]. Some recent innovations, however, involve new mechanisms of recruitment, distribution and data collection, for example by volunteers (in return for feedback on performance) [40]. While such mechanisms will be increasingly available for research in VR, our focus remains on online VR experiments via crowdsourcing on AMT.…”
Section: Running Online Experiments (mentioning)
confidence: 99%