In late 2019, the gravest pandemic in a century began spreading across the world. A state of uncertainty related to what has become known as SARS-CoV-2 has since fueled conspiracy narratives on social media about the origin and transmission of the resulting disease, COVID-19, as well as its medical treatment and vaccination against it. Using social media intelligence to monitor and understand the proliferation of conspiracy narratives is one way to analyze the distribution of misinformation about the pandemic. We analyzed more than 9.5M German-language tweets about COVID-19. The results show that only about 0.6% of all those tweets deal with conspiracy theory narratives. We also found that the political orientation of users correlates with the volume of content users contribute to the dissemination of conspiracy narratives, implying that partisan communicators have a higher motivation to take part in conspiratorial discussions on Twitter. Finally, we showed that, contrary to other studies, automated accounts do not significantly influence the spread of misinformation in the German-speaking Twittersphere: they account for only about 1.31% of all conspiracy-related activities in our database.
Governments have begun to employ technological systems that use massive amounts of data and artificial intelligence (AI) in the domains of law enforcement, public health, and social welfare. In some areas, shifts in public opinion increasingly favor technology-aided public decision-making. This development presents an opportunity to explore novel approaches to how technology could be used to reinvigorate democratic governance and how the public perceives such changes. This study therefore posits a hypothetical AI voting system that mediates political decision-making between citizens and the state. We conducted a four-country online survey (N=6043) in Greece, Singapore, Switzerland, and the US to find out what factors affect the public's acceptance of such a system. The data show that Singaporeans are most likely, and Greeks least likely, to accept the system. Considerations of the technology's utility have a large effect on acceptance rates across cultures, whereas attitudes towards political norms and political performance have partial effects.
Several scholars have demonstrated a positive link between political polarization and resistance to COVID-19 prevention measures. At the same time, political polarization has also been associated with the spread of misinformation. This study investigates the theoretical linkages between polarization and misinformation and measures the flow of misinformation about COVID-19 in the comment sections of four popular YouTube channels over a period of 16 months using big data sources and methods. For the analysis, we downloaded about 3.5M English-language YouTube comments posted in response to videos about the pandemic. We then classified the comments into one of the two following categories by applying a supervised Natural Language Processing classifier: (1) fake: comments that contain claims and speculation which are verifiably not true; and (2) legitimate: comments that do not fall into the fake category. The results show that the level of misinformation in YouTube comment sections has increased during the pandemic, that fake comments attract significantly more likes, and that the ratio of fake comments increased by 0.4% per month. These findings suggest that once introduced into an online discussion, misinformation potentially leads to an escalating spiral of misinformation comments, which undermines public policy. Overall, the results signal alarming levels of pandemic-related misinformation and, potentially, rising affective polarization. We place these results in context and point out the limitations of our approach.
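The classification step described above (a supervised classifier sorting comments into fake vs. legitimate) can be illustrated with a minimal sketch. The abstract does not specify the model or features, so the TF-IDF plus logistic regression pipeline below, along with all training examples, is an assumption chosen purely for illustration.

```python
# Hedged sketch of a supervised fake/legitimate comment classifier.
# Assumption: a TF-IDF + logistic regression pipeline (the study does not
# disclose its model); all labeled examples here are invented toy data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled comments: 1 = fake (verifiably untrue claim), 0 = legitimate.
comments = [
    "5G towers spread the virus",          # fake
    "the vaccine contains microchips",     # fake
    "masks reduce droplet transmission",   # legitimate
    "wash your hands and keep distance",   # legitimate
]
labels = [1, 1, 0, 0]

# Fit the pipeline on the toy corpus.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(comments, labels)

# Assign an unseen comment to one of the two categories (1 = fake, 0 = legitimate).
pred = clf.predict(["the virus is spread by 5G towers"])[0]
```

In a real deployment the pipeline would be trained on a large hand-labeled sample of the 3.5M comments and validated on held-out data before scoring the full corpus.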