Published in 2016 | DOI: 10.14763/2016.1.401

Should we worry about filter bubbles?

Abstract: Some fear that personalised communication can lead to information cocoons or filter bubbles. For instance, a personalised news website could give more prominence to conservative or liberal media items, based on the (assumed) political interests of the user. As a result, users may encounter only a limited range of political ideas. We synthesise empirical research on the extent and effects of self-selected personalisation, where people actively choose which content they receive, and pre-selected personalisation, …
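The mechanism the abstract gestures at, a news feed giving more prominence to items that match a user's assumed political leaning, can be made concrete with a short sketch. This is not from the paper and not any real platform's algorithm; NewsItem, rank_feed, the leaning scores, and the personalisation_weight parameter are all hypothetical, chosen only to show how a single re-ranking weight can narrow the range of viewpoints shown (pre-selected personalisation), as opposed to the user choosing sources themselves (self-selected personalisation).

```python
# Purely illustrative sketch (not from the paper, and not any real system):
# a toy version of "pre-selected personalisation", where the feed itself
# boosts items matching the user's assumed political leaning.
from dataclasses import dataclass

@dataclass
class NewsItem:
    title: str
    leaning: float      # -1.0 = strongly liberal ... +1.0 = strongly conservative
    base_score: float   # relevance/recency score, independent of politics

def rank_feed(items, inferred_user_leaning, personalisation_weight=0.5):
    """Re-rank items, giving more prominence to those that match the user's
    (assumed) leaning. A weight of 0.0 ignores politics; higher weights
    narrow the range of political ideas the user encounters."""
    def score(item):
        # 1.0 when item and user lean the same way, 0.0 at maximum disagreement
        agreement = 1.0 - abs(item.leaning - inferred_user_leaning) / 2.0
        return ((1.0 - personalisation_weight) * item.base_score
                + personalisation_weight * agreement)
    return sorted(items, key=score, reverse=True)

feed = [
    NewsItem("Op-ed from a conservative outlet", leaning=0.8, base_score=0.6),
    NewsItem("Op-ed from a liberal outlet", leaning=-0.8, base_score=0.6),
    NewsItem("Neutral policy explainer", leaning=0.0, base_score=0.7),
]

for item in rank_feed(feed, inferred_user_leaning=0.7):
    print(item.title)
```

Run with inferred_user_leaning=0.7, the sketch surfaces the like-minded op-ed first and pushes the opposing one down, which is the narrowing effect the filter-bubble concern describes; setting personalisation_weight to 0 restores a politics-blind ranking.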

Cited by 274 publications (151 citation statements). References 21 publications.
“…Regarding the link between incidental consumption and users' ideological self-positioning, the fact that the most ideologised Internet users have less involuntary exposure to news is consistent with the hypothesis that they exercise greater control over their information sources (probably favouring those with similar ideas). This may or may not lead to an echo chamber effect (Colleoni, Rozza, and Arvidsson, 2014; Zuiderveen Borgesius et al., 2016; Dubois & Blank, 2018). In any case, the reasonable assumption that underpins this study, namely that the greater the ideologisation, the lower the incidental consumption, must be confirmed in future work.…”
Section: Discussion (mentioning)
confidence: 54%
“…In recent years, we have witnessed controversies in which companies such as Facebook have used personal data to conduct experiments on users without their knowledge, or have 'manipulated' data-driven personalized communication and behavioural targeting in the online realm (Lanzing 2018). Thus, personal privacy is another topic of debate (see Borgesius et al. 2016; Floridi and Taddeo 2016; Lanzing 2018).…”
Section: Recommender Systems (RS) Ethics (mentioning)
confidence: 99%
“…In the current theoretical vacuum, popular conceptualizations of online information diffusion have sprung up around the ideas of filter bubbles and/or echo chambers. Both the filter bubble concept (Pariser, 2011) and echo chambers (Jamieson & Cappella, 2008; Sunstein, 2009) center on the theoretical principle that algorithmic and self-selected filtering will drive the way information diffuses through online social networks (Zuiderveen Borgesius et al., 2016). Taken together, these theoretically similar models assume people will end up in "a unique universe of information" (i.e., filter bubbles; Pariser, 2011, p. 8) consisting only of people with similar viewpoints (i.e., echo chambers; Sunstein, 2009).…”
Section: Filter Bubbles/Echo Chambers (mentioning)
confidence: 99%
“…Empirical work attempting to measure filter bubbles in online media exposure has found little evidence for their existence (Bakshy, Messing, & Adamic, 2015; Goel, Mason, & Watts, 2010; Zuiderveen Borgesius et al., 2016). Garrett and colleagues (Garrett, 2009; Garrett et al., 2013) found that people do not actively avoid information contradicting their views.…”
Section: Filter Bubbles/Echo Chambers (mentioning)
confidence: 99%