The deployment of various forms of AI, most notably machine learning algorithms, is radically transforming many domains of social life. In this paper we focus on the news industry, where different algorithms are used to customize news offerings to increasingly specific audience preferences. While this personalization of news enables media organizations to be more receptive to their audience, it is questionable whether current deployments of algorithmic news recommenders (ANR) live up to their emancipatory promise. As in various other domains, people have little knowledge of what personal data are used and how such algorithmic curation comes about, let alone concrete ways to influence these data-driven processes. Instead of going down the intricate avenue of trying to make ANR more transparent, we explore in this article ways to give people more influence over the information that news recommendation algorithms provide, by thinking about and enabling possibilities to express 'voice'. After differentiating four ideal-typical modalities of expressing voice (alternation, awareness, adjustment and obfuscation), each illustrated with currently existing empirical examples, we present and argue for 'algorithmic recommender personae' as a way for people to take more control over the algorithms that curate their news provision.
This article is part of a theme issue ‘Governing artificial intelligence: ethical, legal, and technical opportunities and challenges’.