Toward a real-time and budget-aware task package allocation in spatial crowdsourcing (2018)
DOI: 10.1016/j.dss.2018.03.010

Cited by 44 publications (30 citation statements)
References: 55 publications

“…In both phases, we measured the performance of our proposed Bayesian Network-based (BN) task matching against three other approaches. The approaches are based on three baseline algorithms: the Greedy algorithm [3], [10], [22], [23]; the kNN and time-weighted kNN algorithms [21]; and a Genetic Algorithm [12]. In the current experiment, both synthetic and real-world datasets were used to simulate a realistic SC scenario and demonstrate the feasibility of the proposed approach.…”
Section: Methods (mentioning, confidence: 99%)
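
A common form of a Greedy matching baseline in spatial crowdsourcing is nearest-available-worker assignment. The sketch below illustrates that idea under this assumption only; the data model and function names are hypothetical and are not the exact formulations from [3], [10], [22], [23].

```python
import math

# Illustrative sketch only: a greedy, distance-based task-to-worker matching,
# in the spirit of the Greedy baselines named above (not the exact algorithms
# from the cited papers). All names and data here are hypothetical.

def euclidean(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def greedy_match(tasks, workers):
    """Assign each task to the nearest still-available worker."""
    available = dict(workers)      # worker_id -> (x, y)
    assignment = {}                # task_id -> worker_id
    for task_id, task_loc in tasks.items():
        if not available:
            break
        best = min(available, key=lambda w: euclidean(task_loc, available[w]))
        assignment[task_id] = best
        del available[best]        # each worker handles at most one task here
    return assignment

if __name__ == "__main__":
    tasks = {"t1": (0.0, 0.0), "t2": (5.0, 5.0)}
    workers = {"w1": (1.0, 1.0), "w2": (4.0, 6.0), "w3": (10.0, 10.0)}
    print(greedy_match(tasks, workers))   # e.g. {'t1': 'w1', 't2': 'w2'}
```
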
“…The aim of the framework was to maximize the number of assigned tasks under budget constraints. [22] designed a real-time and budget-aware task allocation mechanism to maximize the number of assigned tasks and to improve the expected quality of completed tasks under a limited budget. The mechanism also considers the distance of each worker from the tasks and the worker's performance on previously assigned tasks.…”
Section: Related Work (mentioning, confidence: 99%)
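
To make the idea in this excerpt concrete, here is a minimal sketch of a budget-aware greedy allocation that scores workers by distance and historical performance. The scoring function, cost model, and all identifiers are illustrative assumptions, not the mechanism actually defined in [22].

```python
from dataclasses import dataclass
import math

# Minimal sketch, assuming: each task pays a fixed reward, each worker has a
# location and a historical quality score, and tasks are assigned greedily to
# the best-scoring free worker until the budget runs out. Illustration only,
# not the allocation mechanism from the cited paper.

@dataclass
class Worker:
    wid: str
    x: float
    y: float
    quality: float          # past-performance score in [0, 1]

@dataclass
class Task:
    tid: str
    x: float
    y: float
    reward: float           # amount paid if the task is assigned

def score(worker: Worker, task: Task) -> float:
    """Higher is better: favour good past performance and short distance."""
    dist = math.hypot(worker.x - task.x, worker.y - task.y)
    return worker.quality / (1.0 + dist)

def budget_aware_allocate(tasks, workers, budget):
    assigned = {}                        # tid -> wid
    free = {w.wid: w for w in workers}
    # Cheaper tasks first, so the budget covers as many assignments as possible.
    for task in sorted(tasks, key=lambda t: t.reward):
        if task.reward > budget or not free:
            continue
        best = max(free.values(), key=lambda w: score(w, task))
        assigned[task.tid] = best.wid
        budget -= task.reward
        del free[best.wid]               # one task per worker in this sketch
    return assigned, budget

if __name__ == "__main__":
    workers = [Worker("w1", 0, 0, 0.9), Worker("w2", 3, 4, 0.6)]
    tasks = [Task("t1", 1, 1, 5.0), Task("t2", 3, 3, 8.0)]
    print(budget_aware_allocate(tasks, workers, budget=10.0))
```
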
“…20 In crowdsourcing contests, participation history factors, including participation recency and frequency, winning recency and frequency, and the competitors' tenure and last performance, are also derived in a study. 21 One of the authors suggests a "personalized task recommendation" approach, which aims to match the worker's interests with the appropriate task and thus makes the …

The rest of the excerpt reproduces the survey's scoring table; each entry lists the study, its focus, four criterion scores, and their total:

[10] Task and participant matching: 0.5, 0.5, 0, 0.5 (total 1.5)
Nassar and Karray [3] Overview of the crowdsourcing process: 0.5, 1, 0, 1 (total 2.5)
Javadi Khasraghi and Aghaie [21] Crowdsourcing contests: 1, 1, 0.5, 1 (total 3.5)
Zheng et al [58] Task design in crowdsourcing: 0.5, 0.5, 0.5, 0.5 (total 2)
Sales Fonteles et al [59] Trajectory recommendation of tasks: 1, 0.5, 0.5, 0.5 (total 2.5)
Burnap et al [30] Identifying experts in the crowd for evaluation of engineering designs: 0.5, 0.5, 0.5, 0.5 (total 2)
Ali-Hassan and Allam [60] Comparing crowdsourcing initiatives toward a typology development: 0, 0, 0, 1 (total 1)
Wang et al [61] Mobile crowdsourcing framework, challenges, and solutions: 0, 0, 0, 1 (total 1)
Ghezzi et al [62] Crowdsourcing review: 0, 0.5, 0, 1 (total 1.5)
Xintong et al [63] Brief survey of crowdsourcing for data mining: 0, 0, 0, 1 (total 1)
Pournajaf et al [38] Crowd sensing task assignment: 0, 0, 0, 1 (total 1)
Tarasov et al [64] Worker reliability in crowdsourcing: 0, 0, 0, 1 (total 1)
Baba et al [39] Improper task detection in crowdsourcing: 0.5, 0.5, 0.5, 1 (total 2.5)
Geiger and Schader [22] Personalized task recommendation in crowdsourcing: 0.5, 0.5, 0, 1 (total 2)
Hosseini et al [16] Crowdsourcing: a taxonomy and systematic mapping study: 0.5, 0.5, 0, 0.5 (total 1.5)
Ye and Kankanhalli [65] Organizational task crowdsourcing: 0.5, 0, 0, 1 (total 1.5)
Ellero et al [66] Real-time crowdsourcing: 0, 0, 0.5, 1 (total 1.5)
Mao et al [1] Crowdsourcing survey in software engineering: 0.5, 0.5, 0.5, 1 (total 2.5)
Morschheuser et al [67] Gamified crowdsourcing conceptualization, literature review, and future agenda: 0, 0, 0, 1 (total 1)
Safran and Che [29] Real-time recommendation algorithms for crowdsourcing systems: 1, 1, 1, 1 (total 4)
Younas et al [68] Optimal task assignment: 0.5, 0, 1, 0.5 (total 2)
Harman and Azzam [69] Crowdsourcing criteria and standards: 0.5, 0, 0, 0.5 (total 1)
Moayedikia et al [31] Task assignment in crowdsourcing platforms: 0.5, 0, 1, 0 (total 1.5)
Wu et al [70] Task allocation in crowdsourcing: 0.5, 0.5, 1, 0 (total 2)
Sarı et al…”
Section: What Are the Task Assignment Frameworks/Models Available for Effective Crowdsourcing? (mentioning, confidence: 99%)
“…They use the spatial attributes of the problem, namely the spatial distribution and user travel costs, to propose the minimum location entropy priority and close distance priority strategies to address these challenges. Wu et al [25] proposed a real-time, budget-aware spatial crowdsourcing task allocation (RB-TPSC) model, with the goal of increasing the task allocation rate and maximizing the expected quality of the workers' output under a limited budget. The proposed RB-TPSC model makes task allocation decisions automatically.…”
Section: Related Work (mentioning, confidence: 99%)