2017
DOI: 10.1016/j.procs.2017.10.102
Altruistic Crowdsourcing for Arabic Speech Corpus Annotation

Cited by 7 publications (6 citation statements)
References 6 publications
“…We can find several ways to achieve annotation: annotation by 2-5 persons having some specified skills (Alotaibi and Anderson, 2016; Pustejovsky and Stubbs, 2012), crowdsourcing, where the annotation is done by a large number of annotators without specific skills (Bougrine et al., 2017), or annotation based on rating systems offered by opinion sites (Rushdi-Saleh et al., 2011).…”
Section: Matter Approach (mentioning)
confidence: 99%
“…While online crowdsourcing markets make it convenient to pay for workers willing to solve a range of different tasks, they suffer from limitations such as not attracting enough workers with desired background or skills [4,21,54]. For example, it can be a challenge to recruit workers for a task that requires workers who speak a specific language or who live in a certain city [10,42]. Situated crowdsourcing can help fill in the gaps in these scenarios where the crowd needs to be associated with some context.…”
Section: Related Work 2.1 Situated Crowdsourcing (mentioning)
confidence: 99%
“…As another example, altruistic crowdsourcing refers to cases where unpaid tasks are carried out by a large number of volunteer contributors [10]. This form of crowdsourcing often utilizes members of the same community to complete collective tasks or to obtain better quality and more trustworthy information [10,27].…”
Section: Community-situated Crowdsourcing: Generic Platforms Versus Targeted Online Communities (mentioning)
confidence: 99%
“…At document level [32], at sentence level [10], or at word level, also known as Part Of Speech (POS) tagging [22,34]. We can find several ways to achieve annotation: annotation by 2 to 5 persons having some specified skills [5,29], crowdsourcing, where the annotation is done by a large number of annotators without specific skills [9], or annotation based on rating systems offered by opinion sites [32]. The final version of the annotated data, called the gold standard, is the corpus to be used in the classification step [29].…”
Section: Matter Approach (mentioning)
confidence: 99%