2022
DOI: 10.1177/14614448221080480

“Pose”: Examining moments of “digital” dark sousveillance on TikTok

Abstract: Over the past year, many young creators who use the Chinese-owned social networking platform TikTok have claimed that its underlying algorithm surveils and suppresses the reach of content by Black, brown, fat, queer, and disabled creators. However, despite these algorithmic biases, these marginalized creators have continued to find new and ingenious ways to not only create but also successfully share anti-racist, anti-misogynistic, LGBTQIA+-supportive, and body-positive content on the platform. Using this tensi…

Cited by 9 publications (10 citation statements)
References 36 publications
“…In a similar vein, additional hypothesis driven research should determine the degree to which the dynamics of the ForYou page ever deviate from those of the base page and the degree to which discourses that resonate in the creator community of TikTok as being so heavy handed are in fact continuing to be true. In this sense, this study provides important computational support for arguments made by those including Peterson-Salahuddin (2022) particularly through the ways that it disrupts the defense of corporations to rely on a seemingly neutral or unknowable algorithm.…”
Section: TikTok as Television (citation type: mentioning)
Confidence: 66%
“…In the context of our study of TikTok, we are offering a very early examination of the form. Bhandari and Bimo (2020) and Peterson-Salahuddin (2022) have noted that the feeling of the algorithm and the ways that certain kinds of user are privileged weighs heavy on TikTok users, which is a powerful central theme in interview work. Schellewald (2022) argued that the ways of talk about algorithms on TikTok are an important form of sensemaking about the platform, which call for studies of these central mechanisms both as they are and as users experience them.…”
Section: Why Flow? (citation type: mentioning)
Confidence: 99%
“…Specifically, PYD has been criticized for not adequately addressing the role of race and ethnicity (Williams and Deutsch, 2016) and structural oppression (Ginwright and James, 2002) in shaping access to opportunities for youth of color. Black youth face racism-related threats in multiple contexts across the life course (Jones et al., 2020; Seaton, 2020), and these threats manifest on social media in ways that are both interpersonal (e.g., online racial discrimination; Tynes et al., 2020) and structural (algorithm bias; Angwin et al., 2017; Peterson-Salahuddin, 2022). For example, algorithm bias on social media platforms has been shown to suppress the visibility of content from Black users (Peterson-Salahuddin, 2022).…”
Section: Theoretical Foundations (citation type: mentioning)
Confidence: 99%
“…Black youth face racism-related threats in multiple contexts across the life course (Jones et al., 2020; Seaton, 2020), and these threats manifest on social media in ways that are both interpersonal (e.g., online racial discrimination; Tynes et al., 2020) and structural (algorithm bias; Angwin et al., 2017; Peterson-Salahuddin, 2022). For example, algorithm bias on social media platforms has been shown to suppress the visibility of content from Black users (Peterson-Salahuddin, 2022). This threatens Black youths’ ability to engage positively with race- and STEM-related content on social media.…”
Section: Theoretical Foundations (citation type: mentioning)
Confidence: 99%