2014
DOI: 10.1609/aimag.v35i2.2537
Workshops Held at the First AAAI Conference on Human Computation and Crowdsourcing: A Report

Abstract: The first AAAI Conference on Human Computation and Crowdsourcing (HCOMP-2013) was held November 6-9, 2013, in Palm Springs, California. Three workshops took place on Saturday, November 9th: Crowdsourcing at Scale (full day), Human and Machine Learning in Games (full day), and Scaling Speech, Language Understanding and Dialogue through Crowdsourcing (half day). This report summarizes the activities of those three events.

Cited by 15 publications (9 citation statements)
References 0 publications
“…The sole reliance on paid-human labor for collecting street-level accessibility data can be insufficiently scalable and it remains expensive for creating a large dataset [97].…”
Section: Semi-automated Methods To Collect Accessibility Data
confidence: 99%
“…However, even paid micro task crowdsourcing can be insufficiently scalable, and it remains expensive for creating a large dataset [97].…”
Section: Introduction
confidence: 99%
“…In recent years, several evaluation activities have focused on crowdsourcing for ground-truth creation, as witnessed by the TREC Crowdsourcing track series from 2011 to 2013 [Smucker et al., 2013, 2014], the MediaEval Crowdsourcing tracks in 2013 and 2014 [Loni et al., 2013; Yadati et al., 2014], or the CrowdScale 2013 Shared Task Challenge [Josephy et al., 2014]. There is also a growing interest and attention about how crowdsourcing affects the repeatability and reproducibility of IR experiments [Blanco et al., 2011; Ferro, 2017; Ferro et al., 2016a].…”
Section: Crowdsourcing For Ground-truth Creation
confidence: 99%
“…We will update the existing website for the previous workshops: http://www.worklearn.org/. The website as it stands provides information on the 2014 Workshop [6].…”
Section: Website
confidence: 99%