Proceedings of the 24th International Conference on Intelligent User Interfaces 2019
DOI: 10.1145/3301275.3302276
Explainable modeling of annotations in crowdsourcing

Abstract: Aggregation models for improving the quality of annotations collected via crowdsourcing have been widely studied, but far less has been done to explain why annotators make the mistakes that they do. To this end, we propose a joint aggregation and worker clustering model that detects patterns underlying crowd worker labels to characterize varieties of labeling errors. We evaluate our approach on a Named Entity Recognition dataset labeled by Mechanical Turk workers in both a retrospective experiment and a small …
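
The abstract names the key ingredients of the model (aggregate noisy crowd labels; cluster workers by the error patterns behind their labels) but the excerpt is truncated before any details. The sketch below is therefore only a rough illustration of that general pipeline, with majority voting and k-means standing in for the paper's joint model; the function name, array layout, and smoothing choice are all assumptions made for the example.

```python
# Hedged sketch, not the authors' actual model: majority voting plus k-means
# stands in for the paper's joint aggregation-and-clustering approach.
import numpy as np
from sklearn.cluster import KMeans

def aggregate_and_cluster(labels, n_classes, n_clusters=3):
    """labels: (n_workers, n_items) int array of class ids; -1 = no label."""
    n_workers, n_items = labels.shape

    # Step 1: consensus labels by per-item majority vote over observed labels.
    consensus = np.zeros(n_items, dtype=int)
    for j in range(n_items):
        obs = labels[:, j][labels[:, j] >= 0]
        if obs.size:
            consensus[j] = np.bincount(obs, minlength=n_classes).argmax()

    # Step 2: describe each worker by a smoothed, row-normalized confusion
    # matrix (consensus class -> worker's label), flattened into a feature vector.
    features = np.zeros((n_workers, n_classes * n_classes))
    for w in range(n_workers):
        conf = np.ones((n_classes, n_classes))  # add-one smoothing
        for j in range(n_items):
            if labels[w, j] >= 0:
                conf[consensus[j], labels[w, j]] += 1
        features[w] = (conf / conf.sum(axis=1, keepdims=True)).ravel()

    # Step 3: cluster workers by error pattern; each cluster loosely
    # corresponds to one "variety" of labeling error.
    worker_cluster = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(features)
    return consensus, worker_cluster
```

Unlike this two-pass sketch, the paper describes a joint model, so in the actual approach the aggregation and the worker clustering would inform each other rather than run in sequence.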

Cited by 8 publications (2 citation statements). References 13 publications (8 reference statements).
“…• Algorithms based on machine learning and pattern recognition: these are among the main answer-aggregation algorithms in the field of annotation. For example, [21] found the final answer through clustering, and [22] detects spam text messages using SVM and Bayesian filters.…”
Section: Steps of a Crowd-Sourcing System (mentioning, confidence: 99%)
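
As a concrete instance of the second example in this statement, here is a minimal sketch of SVM-based spam text filtering with scikit-learn; the four training messages, the TF-IDF features, and the linear SVM are illustrative assumptions rather than details taken from the cited work [22].

```python
# Hedged sketch of SVM-based spam text filtering, in the spirit of [22].
# The tiny dataset below is invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

texts = [
    "WIN a free prize now, click here",     # spam
    "URGENT: claim your cash reward",       # spam
    "Meeting moved to 3pm tomorrow",        # ham
    "Can you review my annotation batch?",  # ham
]
is_spam = [1, 1, 0, 0]

# TF-IDF bag-of-words features feeding a linear SVM classifier.
model = make_pipeline(TfidfVectorizer(), LinearSVC())
model.fit(texts, is_spam)

print(model.predict(["free cash prize, click now"]))  # expected: [1] (spam)
```

A Bayesian filter, the other technique the statement mentions, would drop into the same pipeline by replacing `LinearSVC()` with `sklearn.naive_bayes.MultinomialNB()`.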
“…Therefore, except for people who are very interested in competition, others will not have the motivation to participate [25]. Another method is to pay a fixed amount to all users [21, 26], which reduces the number of calculations needed to find the payment amount and gives everyone an incentive to participate in the task. This method also does not incentivize people to cheat for more money.…”
Section: Steps of a Crowd-Sourcing System (mentioning, confidence: 99%)
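
To make the computational contrast in this statement concrete, here is a minimal hedged sketch: a fixed per-task rate needs only one multiplication per worker, while a performance-based bonus first requires scoring each worker's quality. The rates, field names, and worker records are all invented for illustration.

```python
# Hedged sketch contrasting the two payment schemes discussed above;
# every number and field name here is an illustrative assumption.
workers = [
    {"id": "w1", "tasks_done": 40, "accuracy": 0.95},
    {"id": "w2", "tasks_done": 40, "accuracy": 0.60},
]

FIXED_RATE = 0.05  # dollars per completed task, identical for everyone

def fixed_payment(worker):
    # One multiplication per worker: cheap to compute, rewards participation,
    # and offers no extra money for gaming a quality metric.
    return worker["tasks_done"] * FIXED_RATE

def performance_payment(worker, bonus_rate=0.05):
    # Requires estimating each worker's quality first, which is the extra
    # calculation the fixed scheme avoids.
    return worker["tasks_done"] * (FIXED_RATE + bonus_rate * worker["accuracy"])

for w in workers:
    print(w["id"], round(fixed_payment(w), 2), round(performance_payment(w), 2))
```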