Crowdsourcing platforms are commonly used for research in the humanities, social sciences, and informatics, including having crowdworkers annotate textual or visual material. Drawing on two empirical studies, this article systematically assesses the potential of crowdcoding for less manifest content in news texts, focusing here on evaluations of political actors. Specifically, Study 1 compares the reliability and validity of crowdcoded data to those of manual content analyses; Study 2 then investigates the effects of material presentation, different types of coding instructions, and answer-option formats on data quality. We find that the crowd's performance makes crowdcoded data a reliable and valid alternative to manually coded data, even for less manifest content. While scale manipulations affected the results, minor modifications to the coding instructions or material presentation did not significantly influence data quality. In sum, crowdcoding appears to be a robust instrument for collecting quantitative content data.
Purpose
Social media empower individuals to voice their opinions about issues that they perceive to be unacceptable. When many others add their opinions and large quantities of messages containing negative word-of-mouth suddenly spread online, an online firestorm occurs. By extending the situational theory of problem solving (Kim and Grunig, 2011) into the domain of online communication, this study aims to identify the drivers of participation in online firestorms.

Design/methodology/approach
With reference to a fictitious online firestorm trigger (i.e. perceived moral misconduct) posted on Facebook, a qualitative pre-study and a quantitative online survey were conducted. Based on the responses of 410 participants, an ordinary least squares regression was estimated to examine the factors of participating in the online firestorm. Structural equation modeling was then applied to test the model and gauge its fit with the data.

Findings
Participants' involvement recognition, perception of being collective actors, and approval of slacktivism behaviors positively predicted their participation in the online firestorm, whereas non-anonymity hampered it.

Originality/value
The study's findings not only contribute to current understandings of online firestorms but are also valuable for developing theory and forms of professional crisis management. Moreover, they offer insights into the features of online communication environments that encourage users to voice their opinions.