To explore the effects of different incentives on crowdsourcing participation and submission quality, we conduct a field experiment on Taskcn, a large Chinese crowdsourcing site using all-pay auction mechanisms. In our study, we systematically vary the size of the reward, as well as the presence of a soft reserve, i.e., an early high-quality submission. We find that a higher reward induces significantly more submissions and submissions of higher quality. In comparison, we find that high-quality users are significantly less likely to enter tasks where a high-quality solution has already been submitted, resulting in lower overall quality in subsequent submissions in such soft-reserve treatments.
Massive open online courses (MOOCs) have developed rapidly in recent years and have attracted millions of online users. However, a central challenge is the extremely high dropout rate: recent reports show that the completion rate in MOOCs is below 5% (Onah, Sinclair, and Boyatt 2014; Kizilcec, Piech, and Schneider 2013; Seaton et al. 2014). What are the major factors that cause users to drop out? What are the major motivations for users to study in MOOCs? In this paper, employing a dataset from XuetangX, one of the largest MOOC platforms in China, we conduct a systematic study of the dropout problem in MOOCs. We find that users' learning behavior can be clustered into several distinct categories. Our statistics also reveal a high correlation between dropouts from different courses and a strong influence between friends' dropout behaviors. Based on the gained insights, we propose a Context-aware Feature Interaction Network (CFIN) to model and predict users' dropout behavior. CFIN utilizes a context-smoothing technique to smooth feature values across different contexts, and uses an attention mechanism to combine user and course information in the modeling framework. Experiments on two large datasets show that the proposed method achieves better performance than several state-of-the-art methods. The model has been deployed in a real system to help improve user retention.
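The abstract mentions that CFIN uses an attention mechanism to fold user and course information into the prediction. As a rough illustration only (not the authors' actual CFIN implementation, whose architecture and parameters are not given here), the sketch below shows the generic idea of attention-based pooling: per-feature embeddings are weighted by their dot-product relevance to a combined user/course context vector and then pooled into a single representation. All names (`attentive_combine`, the dimensions) are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attentive_combine(feature_embs, context_vec):
    """Pool per-feature embeddings with attention weights derived
    from a user/course context vector.

    feature_embs: (n_features, d) array of feature embeddings
    context_vec:  (d,) combined user + course context vector
    returns:      (d,) attention-pooled representation
    """
    scores = feature_embs @ context_vec   # relevance score per feature
    weights = softmax(scores)             # attention weights, sum to 1
    return weights @ feature_embs         # weighted sum of embeddings

# Toy usage with random data standing in for learned embeddings.
rng = np.random.default_rng(0)
feats = rng.normal(size=(5, 8))   # e.g., embeddings of 5 activity features
ctx = rng.normal(size=8)          # hypothetical user+course context
rep = attentive_combine(feats, ctx)
```

The pooled vector `rep` would then feed a downstream classifier that outputs a dropout probability; the attention weights make the contribution of each feature depend on who the user is and which course they are taking.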