2021
DOI: 10.1007/s00779-021-01604-6

Can the crowd judge truthfulness? A longitudinal study on recent misinformation about COVID-19

Abstract: Recently, the misinformation problem has been addressed with a crowdsourcing-based approach: to assess the truthfulness of a statement, instead of relying on a few experts, a crowd of non-experts is exploited. We study whether crowdsourcing is an effective and reliable method to assess truthfulness during a pandemic, targeting statements related to COVID-19, thus addressing (mis)information that is both related to a sensitive, personal issue and very recent relative to when the judgment is made. In our ex…
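The crowdsourcing approach described in the abstract boils down to collecting several independent truthfulness judgments per statement and aggregating them into a single label. The following is a minimal sketch of that idea only, not the paper's actual pipeline; the 0–5 ordinal scale, the sample judgments, and the aggregation choices (mean and majority vote) are illustrative assumptions.

```python
from collections import Counter
from statistics import mean

# Hypothetical crowd judgments: each statement receives several independent
# truthfulness scores on an assumed 0-5 ordinal scale (0 = clearly false,
# 5 = clearly true). Scale and data are illustrative, not from the paper.
judgments = {
    "statement_1": [4, 5, 4, 3, 5],
    "statement_2": [1, 0, 2, 1, 1],
}

def aggregate(scores):
    """Aggregate one statement's crowd scores by mean and by majority vote."""
    majority_label, _ = Counter(scores).most_common(1)[0]
    return {"mean": mean(scores), "majority": majority_label}

for statement, scores in judgments.items():
    print(statement, aggregate(scores))
```

In practice, studies of this kind compare such aggregated crowd labels against expert fact-checker verdicts to measure agreement.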


Cited by 9 publications (5 citation statements) · References 60 publications
“…Other studies have examined the evolution of misinformation over time, tracking the emergence, spread, and persistence of false narratives. These investigations shed light on the lifespan of misinformation within social networks and provide a deeper understanding of the dynamics of information propagation during public health crises [25].…”
Section: Solution
Mentioning confidence: 96%
“…Roitero et al (2020) looked at how crowd workers compare with expert fact-checkers highlighting a good level of agreement between the two. Later, Roitero et al (2021) looked at the longitudinal dimension of crowdsourced truthfulness assessment observing a consistency in the generated labels. La Barbera et al (2020) observed a political bias in crowd-generated truthfulness labels.…”
Section: Related Work: Crowdsourcing for Misinformation Detection
Mentioning confidence: 99%
“…Researchers and platforms have tried several approaches to ban or otherwise label COVID-19 dis/misinformation content on social media, including fact-checking (Roitero et al, 2021) and de-bunking (Wang et al, 2021) strategies. As mentioned above, many of these approaches are automated.…”
Section: COVID-19 Misinformation IL Interventions
Mentioning confidence: 99%