Journalists are increasingly investigating and reporting on problematic online content such as misinformation, disinformation, and conspiracy theories, giving rise to a new misinformation beat. The process of collecting, analyzing, and reporting on this kind of data is complex and nuanced, and it is especially challenging as online actors attempt to undermine journalists' work. Through in-depth interviews with twelve journalists, we explore how they investigate and report on online misinformation and disinformation. Our findings reveal some of the unique challenges of reporting on this beat, as well as the ways in which reporters overcome those challenges. We highlight and discuss how journalistic values could be better embedded into the design of tools that support this work, the power dynamics between social media companies and journalists, and the promise of collaborations as a way to support and educate journalists on this beat. This work provides contextual knowledge to researchers looking to better support investigative journalists, on the misinformation beat and beyond, as their work becomes more entangled in sociotechnical systems.
Perspective is a publicly available machine learning API that scores text for toxicity. It is available for use by online platforms and communities to limit toxicity and promote civil dialogue. In this work, we adopt a human-centered approach to evaluating Perspective, investigating whether human ratings of toxicity align with Perspective's toxicity scores. We also test its transferability by making this comparison for comments from three platforms with different commenting styles and moderation strategies: news websites, YouTube, and Twitter. Apart from toxicity, the main attribute, we collect participant ratings for three additional attributes: respectfulness, formality, and presence of stereotypes. While disrespect is part of how Perspective defines toxicity, formality and presence of stereotypes were included in the study to explore whether they could be hidden, or latent, attributes that affect Perspective's toxicity scores. We analyzed how participant ratings for these additional attributes vary with respect to Perspective's toxicity score for comments from each platform. We find that for high toxicity scores, Perspective strongly aligns with participant ratings of toxicity and disrespectfulness across all three platforms, providing weak evidence of its transferability. However, our evaluation also surfaced formality and presence of stereotypes as latent attributes that are unrecognized parts of Perspective's scores. We discuss how and why this evaluation is 'human-centered', the importance of conducting such evaluations, and the implications of these results for content moderation on social platforms.
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and describe whether the article provides supporting or contrasting evidence. scite is used by students and researchers around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.