Crowdsourcing is becoming the norm among Internet users, as it can be a convenient and cost-saving way of obtaining information or input for a task by enlisting a crowd of people. However, data obtained through a crowdsourcing platform may not be reliable and may lead to misinformation or misleading conclusions. Therefore, evaluating and measuring the trustworthiness of crowdsourced data is of utmost importance. In this paper, existing methods of evaluating the trustworthiness of data gathered from crowdsourcing platforms are studied. The aim is to investigate the different mechanisms and measurements of trust and reliability for crowdsourced data. As the implementation of trustworthiness evaluation is domain dependent, we selected the relevant mechanisms and measurements to be considered in our proposed speech emotion annotation task on a crowdsourcing platform. After further study, we decided to adapt and integrate selected mechanisms and measurements from the incentive, participant quality, and system control methods, to be implemented in our proposed future work.