2016
DOI: 10.1371/journal.pone.0159641

Users Polarization on Facebook and Youtube

Abstract: Users online tend to select information that supports and adheres to their beliefs, and to form polarized groups sharing the same view, e.g. echo chambers. Algorithms for content promotion may favour this phenomenon by accounting for users' preferences and thus limiting exposure to unsolicited content. To shed light on this question, we perform a comparative study of how the same content (videos) is consumed on different online social media, i.e. Facebook and YouTube, over a sample of 12M users. Our findings sh…
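Since the abstract's comparison rests on measuring how one-sided each user's activity is, a minimal sketch of an activity-share polarization score follows. It assumes content is pre-labelled into two conflicting categories and that a user's polarization is the fraction of their likes falling on one of them; the function name, category labels, and the 0.95 cutoff are illustrative assumptions, not the paper's exact procedure.

```python
from collections import Counter

def polarization(user_likes, category_a="science", category_b="conspiracy"):
    """Fraction of a user's likes that fall on category_b content.

    Returns a value in [0, 1]: 0 means all activity on category_a,
    1 means all activity on category_b. Hypothetical helper sketching
    the kind of activity-share measure used in studies of this type.
    """
    counts = Counter(user_likes)
    total = counts[category_a] + counts[category_b]
    if total == 0:
        return None  # user has no activity on either category
    return counts[category_b] / total

# Example: a user whose likes land almost entirely on one narrative.
likes = ["conspiracy"] * 19 + ["science"]
rho = polarization(likes)
print(rho)          # 0.95
print(rho >= 0.95)  # True: counted as polarized at an (assumed) 0.95 cutoff
```

With such a score, "polarized groups" can be operationalized as the sets of users whose scores sit near 0 or near 1, which is what makes a cross-platform comparison of the two communities possible.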

Cited by 186 publications (140 citation statements)
References 20 publications

“…Indeed, massive digital misinformation has been designated as a major technological and geopolitical risk by the 2013 report of the World Economic Forum [3]. A substantial number of studies have recently investigated the phenomenon of misinformation in online social networks such as Facebook [4][5][6][7][8][9][10], Twitter [10][11][12][13], YouTube [14] or Wikipedia [15]. These investigations, as well as theoretical modeling [16,17], suggest that confirmation bias [18] and social influence result in the emergence, in online social networks, of user communities that share similar beliefs about specific topics, i.e.…”
Section: Introduction (mentioning)
confidence: 99%
“…There is a vast literature on how the Facebook News Feed works, including topological aspects related to cascading structures and growth [22], [33], [9] and its effects on the creation of echo chambers and polarization [6], [7]. In this paper, we study the impact of News Feed filtering on the dissemination of information, measuring and modeling the visibility of publishers and posts in the News Feed.…”
Section: Related Work (mentioning)
confidence: 99%
“…The literature on the News Feed algorithm is vast, including measurements [9], [6], [7], [8], models [2], [12] and user awareness surveys [16]. Nonetheless, most of the prior work that quantifies the effect of OSNs on information diffusion [5], [4] relies on measurements, obtained under restrictive non-disclosure agreements, that are not made publicly available to other researchers and practitioners.…”
Section: Introduction (mentioning)
confidence: 99%
“…Contrary, perhaps, to intuitions related to the popularization of so-called "filter bubbles", several recent studies appear to show that algorithmic suggestions do not necessarily restrict the horizon of users. Be it in terms of interaction or information consumption, users do not seem to be offered less diverse content than they would be in the absence of recommendation [1][2][3][4][5][6] or under distinct recommendation approaches [7,8], except for what stems from explicit personalization (i.e. explicitly chosen [9], or self-selected [10], by users [11]).…”
Section: Introduction (mentioning)
confidence: 99%
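The diversity claim in the quote above can be made concrete with a simple measure. A minimal sketch follows, assuming diversity is quantified as the Shannon entropy of the category distribution a user consumes; comparing entropy under recommendation versus self-selected consumption is one illustrative way to test whether suggestions narrow a user's horizon. The data and the comparison are assumptions for demonstration, not the cited studies' actual methodology.

```python
import math
from collections import Counter

def category_entropy(items):
    """Shannon entropy (bits) of the category distribution in `items`.

    Higher entropy means more diverse consumption; 0 means a single category.
    """
    counts = Counter(items)
    n = len(items)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Illustrative comparison: algorithmically recommended vs. self-selected items.
recommended   = ["politics", "sports", "science", "music", "politics", "cooking"]
self_selected = ["politics", "politics", "politics", "politics", "sports", "politics"]

print(round(category_entropy(recommended), 3))    # 2.252 bits: broad consumption
print(round(category_entropy(self_selected), 3))  # 0.65 bits: narrow consumption
```

Under this kind of measure, the quoted finding amounts to saying that entropy under recommendation is not systematically lower than under unassisted browsing, except where users explicitly personalize their feeds.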