Sentiment is central to many studies of communication science, from negativity and polarization in political communication to the analysis of product reviews and social media comments in other subfields. This study provides an exhaustive comparison of sentiment analysis methods, using a validation set of Dutch economic headlines to compare the performance of manual annotation, crowd coding, numerous dictionaries, and machine learning with both traditional and deep learning algorithms. The three main conclusions of this article are: (1) the best performance is still attained with trained human or crowd coding; (2) none of the dictionaries used comes close to acceptable levels of validity; and (3) machine learning, especially deep learning, substantially outperforms dictionary-based methods but falls short of human performance. Given these findings, we stress the importance of always validating automated text analysis methods before use. Moreover, we provide a recommended step-by-step approach for (automated) text analysis projects to ensure both efficiency and validity.
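The recommendation to validate automated methods against human coding can be illustrated with a minimal sketch. The dictionary, headlines, and gold labels below are hypothetical stand-ins, not data from the study:

```python
# Validation sketch: compare a toy sentiment dictionary against
# gold-standard human annotations (all data here are invented examples).

POSITIVE = {"groei", "winst", "herstel"}    # "growth", "profit", "recovery"
NEGATIVE = {"crisis", "verlies", "daling"}  # "crisis", "loss", "decline"

def dictionary_sentiment(headline: str) -> str:
    """Classify a headline by counting positive vs. negative dictionary hits."""
    tokens = headline.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

# Hypothetical gold-standard labels from trained human coders
gold = [
    ("Winst voor exporteurs na herstel", "positive"),
    ("Daling van de huizenprijzen", "negative"),
    ("Beurs sluit hoger na goed nieuws", "positive"),  # no dictionary hit -> misclassified
    ("Crisis raakt banenmarkt", "negative"),
]

predictions = [dictionary_sentiment(text) for text, _ in gold]
accuracy = sum(p == g for p, (_, g) in zip(predictions, gold)) / len(gold)
print(f"accuracy vs. human coding: {accuracy:.2f}")  # -> 0.75
```

The third headline shows the typical failure mode the article reports for dictionaries: a clearly positive headline with no dictionary keywords is scored neutral, which is exactly the kind of error that only surfaces when predictions are compared against a human-coded validation set.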
Do parties change their platform in anticipation of electoral losses? Or do parties respond to losses experienced at the previous election? These questions relate to two mechanisms that align public opinion with party platforms: (1) rational anticipation and (2) electoral performance. While extant work has empirically tested, and found support for, the latter mechanism, the effect of rational anticipation has not yet been put to an empirical test. We contribute to the literature on party platform change by theorizing and assessing how party performance (electoral results, standing in the polls, and government participation) motivates parties to change their platform in between elections. We built a new and unique dataset of more than 20,000 press releases issued by 15 Dutch national political parties that were in parliament between 1997 and 2014. These data are particularly apt for identifying parties' strategies in between elections and therefore allow us to test both mechanisms, something that is impossible when examining changes in parties' election manifestos (the typical data such studies use). Utilizing automated text analysis (topic modeling) to measure parties' platform change, we show that electoral defeat motivates party platform change in between elections. In line with existing findings, we demonstrate that parties are backward-looking. Still, we find this effect only for opposition parties and, interestingly, no indication that it weakens over time. Moreover, our findings demonstrate that electoral prospects fail to influence party platform change, disconfirming the rational anticipation mechanism. Additionally, the findings provide important insights into the role of government participation.
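The measurement idea behind the study, representing a party's platform as a distribution of attention over topics and tracking how that distribution shifts between periods, can be sketched in miniature. The study itself uses topic modeling; the keyword lists below are a hypothetical stand-in for model-discovered topics, and the press-release snippets are invented:

```python
from collections import Counter

# Hypothetical keyword lists standing in for topics found by a topic model
TOPICS = {
    "economy":    {"belasting", "banen", "economie"},
    "migration":  {"migratie", "asiel", "grens"},
    "healthcare": {"zorg", "ziekenhuis", "premie"},
}

def topic_shares(press_releases: list) -> dict:
    """Share of topic-keyword mentions per topic across a set of press releases."""
    counts = Counter()
    for text in press_releases:
        for topic, keywords in TOPICS.items():
            counts[topic] += sum(tok in keywords for tok in text.lower().split())
    total = sum(counts.values()) or 1
    return {t: counts[t] / total for t in TOPICS}

def platform_change(before: dict, after: dict) -> float:
    """Total variation distance between two topic-attention distributions."""
    return 0.5 * sum(abs(before[t] - after[t]) for t in TOPICS)

# Invented press releases from two periods for one party
pre  = topic_shares(["banen en economie groeien", "belasting omlaag"])
post = topic_shares(["asiel en migratie aan de grens", "zorg premie stijgt"])
print(f"platform change: {platform_change(pre, post):.2f}")
```

A score of 0 means identical topic attention in both periods; 1 means the party's attention moved entirely to different topics, which is the kind of in-between-elections shift the abstract links to electoral defeat.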
Analyzing political text can answer many pressing questions in political science, from understanding political ideology to mapping the effects of censorship in authoritarian states. This makes the study of political text and speech an important part of the political science methodological toolbox. The confluence of increasing availability of large digital text collections, plentiful computational power, and methodological innovations has led to many researchers adopting techniques of automatic text analysis for coding and analyzing textual data. In what is sometimes termed the “text as data” approach, texts are converted to a numerical representation, and various techniques such as dictionary analysis, automatic scaling, topic modeling, and machine learning are used to find patterns in and test hypotheses on these data. These methods all make certain assumptions and need to be validated to assess their fitness for any particular task and domain.
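The first step of the "text as data" approach described above, converting texts to a numerical representation, can be illustrated with a minimal bag-of-words sketch. The toy corpus is hypothetical and not drawn from any of the studies:

```python
from collections import Counter

# Toy corpus: converting text to a document-term matrix is the usual
# first step before dictionary analysis, scaling, or topic modeling.
corpus = [
    "taxes must go down",
    "taxes fund public healthcare",
    "healthcare costs go up",
]

# Build the vocabulary, then count each term's occurrences per document
vocab = sorted({tok for doc in corpus for tok in doc.split()})
dtm = [[Counter(doc.split())[term] for term in vocab] for doc in corpus]

for doc, row in zip(corpus, dtm):
    print(f"{doc!r:35} -> {row}")
```

Each row is one document and each column one vocabulary term, so downstream methods (scaling, topic models, classifiers) operate on these counts rather than on the raw text; the validation concern raised above applies because this representation discards word order and context.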
Political compromises are essential to any form of government, but they are particularly central in coalitions, where different parties have to balance each other's priorities. While a willingness to compromise was once seen as a welcome sign of political maturity, many voters now see it as selling out for the sake of power, an issue Dr Mariken van der Velden is exploring in her research.