2017
DOI: 10.1109/tsp.2016.2630038
Multi-Object Classification via Crowdsourcing With a Reject Option

Abstract: Consider designing an effective crowdsourcing system for an M-ary classification task. Crowd workers complete simple binary microtasks whose results are aggregated to give the final result. We consider the novel scenario where workers have a reject option, so they may skip microtasks when they are unable to respond or choose not to. For example, in mismatched speech transcription, workers who do not know the language may not be able to respond to microtasks focused on phonological dimensions outside thei…
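The setup in the abstract — binary microtasks with a reject option, aggregated into an M-ary decision — can be sketched minimally as follows. This is an illustrative baseline (per-microtask majority voting over non-skipping workers, with the microtask answers read as the bits of the class label), not the paper's actual fusion rule; the function name and encoding are assumptions.

```python
from collections import Counter

def aggregate_microtasks(answers):
    """Majority-vote each binary microtask, ignoring skipped answers.

    answers: list of lists; answers[w][m] is worker w's answer to
    microtask m -- 0, 1, or None if the worker used the reject option.
    Returns a class index, treating the microtask results as the bits
    of the class label (most significant bit first).
    """
    num_tasks = len(answers[0])
    bits = []
    for m in range(num_tasks):
        votes = Counter(a[m] for a in answers if a[m] is not None)
        # Default to 0 when every worker skipped this microtask
        bits.append(votes.most_common(1)[0][0] if votes else 0)
    # Decode the bit string into one of M = 2**num_tasks classes
    return int("".join(str(b) for b in bits), 2)

# Three workers, two microtasks (M = 4 classes); worker 2 skips task 0
answers = [[1, 0], [1, 1], [None, 0]]
print(aggregate_microtasks(answers))  # -> 2 (bits "10")
```

The reject option matters here because a skipped answer is simply excluded from the vote, rather than being forced into a possibly random 0/1 guess that would dilute the reliable responses.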


Cited by 18 publications (29 citation statements)
References 37 publications (42 reference statements)
“…In this paper, we extend our work [18,19] by further taking the spammers' effect on the system into consideration. We study the scenario where spammers also exist in the crowd, who participate in the task only to earn some free money without regard to the quality of their answers.…”
Section: Introduction
confidence: 94%
“…In our previous work [18,19], we proposed a novel weighted majority voting method for crowdsourced classification, which was derived by solving the following optimization problem…”
Section: Crowdsourcing With a Reject Option
confidence: 99%
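The weighted majority voting mentioned in the citation above can be illustrated with a generic sketch. The weights below are hypothetical placeholders standing in for worker-reliability values; the cited papers derive their weights by solving an optimization problem whose form is truncated in the quote, so this is not the authors' specific rule.

```python
def weighted_majority_vote(answers, weights):
    """Fuse one binary microtask's answers by a weighted vote.

    answers: per-worker answers, 0 or 1, with None for a skipped
             (rejected) microtask.
    weights: per-worker reliability weights (hypothetical values here).
    Returns the fused binary decision.
    """
    # Map answers {0, 1} -> {-1, +1}, skip rejected answers, and take
    # the sign of the weighted sum
    score = sum(w * (2 * a - 1)
                for a, w in zip(answers, weights) if a is not None)
    return 1 if score > 0 else 0

# A reliable worker (weight 2.0) outvotes two weaker ones; the fourth
# worker skipped and contributes nothing
print(weighted_majority_vote([1, 0, 0, None], [2.0, 0.8, 0.8, 1.0]))  # -> 1
```

With uniform weights this reduces to plain majority voting over the non-skipping workers, which is why weight design is where such methods differ.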
“…In recent work on classification in crowdsourcing systems, complex questions are often replaced by a set of simpler binary questions (microtasks) to enhance classification performance [1]- [4]. This is especially helpful in situations where crowd workers lack expertise for responding to complex questions directly.…”
Section: Introduction
confidence: 99%
“…These binary questions can be posted as "microtasks" on crowdsourcing platforms like Amazon Mechanical Turk [5]. To improve classification performance in crowdsourcing systems, most of the works in the literature focus on enhancing the quality of individual tests, by designing fusion rules to combine decisions from heterogeneous workers [1]- [4], [6], [7], and by investigating the assignment of different tests to different workers depending upon their skill level [8], [9]. These problems have also been extended to budget-constrained environments to improve classification performance [10]- [12].…”
Section: Introduction
confidence: 99%