2012
DOI: 10.1007/978-3-642-33876-2_3
Nichesourcing: Harnessing the Power of Crowds of Experts

Abstract: In this position paper we identify nichesourcing, a specific form of human-based computation that harnesses the computational efforts of niche groups rather than the "faceless crowd". We claim that nichesourcing combines the strengths of the crowd with those of professionals, optimizing the result of human-based computation for certain tasks. We illustrate our claim using scenarios in two domains: cultural heritage and regreening in Africa. The contribution of this paper is to provide a definition o…

Cited by 34 publications (21 citation statements)
References 11 publications (11 reference statements)
“…Nevertheless, it may be worthwhile to look for ways to surpass these disadvantages, because the application of social tagging may engage audiences and augment awareness of heritage collections (Springer et al, ), create different access points (Lu et al, , p. 764; Thøgersen, ) that help increase indexer‐searcher consistency, and may complement automatic annotations (Freiburg et al, ). One initiative to improve tag quality is nichesourcing, a form of human computation that takes advantage of social tagging but involves experts, as opposed to crowdsourcing, in which the taggers are the general public with no specific knowledge of a given domain (De Boer et al, ).…”
Section: Related Workmentioning
confidence: 99%
“…However, there is another type of crowdsourcing that focuses on experts with domain knowledge of the subject matter, also called nichesourcing. De Boer et al (2012) compare crowdsourcing to nichesourcing on three parameters: (1) crowdsourcing tasks are simple, whereas nichesourcing tasks are knowledge-intensive; (2) crowdsourcing products are determined by quantity, whereas nichesourcing products are determined by quality.…”
Section: Crowdsourcingmentioning
confidence: 99%
“…(2) Crowdsourcing products are determined by quantity, whereas nichesourcing products are determined by quality; (3) the crowdsourcing resource pool is a large, anonymous and heterogeneous crowd, whereas the nichesourcing resource pool is a community of interest or practice of people who have a certain skill or expertise (de Boer et al, 2012).…”
Section: Crowdsourcingmentioning
confidence: 99%
“…[21] show that tagging flowers with their botanical names could not be performed by a crowd of lay people. In such cases, nichesourcing [10], i.e. employing crowds of experts to perform the annotations, can combine the advantages of using a crowd with the domain knowledge of experts. We define a generalizable methodology for crowdsourcing that we can then use to run nichesourcing experiments.…”
Section: State Of the Artmentioning
confidence: 99%