Algorithms in online platforms interact with users' identities in different ways. However, little is known about how users understand the interplay between identity and algorithmic processes on these platforms, and whether and how such understandings shape their behavior in return. Through semi-structured interviews with 15 US-based TikTok users, we detail users' algorithmic folk theories of the For You Page algorithm in relation to two interconnected identity types: person and social identity. Participants identified potential harms that can accompany the algorithm's tailoring of content to their person identities. Further, they believed the algorithm actively suppresses content related to marginalized social identities based on race and ethnicity, body size and physical appearance, ability status, class status, LGBTQ identity, and political and social justice group affiliation. We propose a new algorithmic folk theory of social feeds, the Identity Strainer Theory, to describe users' belief that an algorithm filters out and suppresses certain social identities. In developing this theory, we introduce the concept of algorithmic privilege, held by users positioned to benefit from algorithms on the basis of their identities. We further propose the concept of algorithmic representational harm to refer to the harm users experience when they lack algorithmic privilege and are subjected to algorithmic symbolic annihilation. Additionally, we describe how participants changed their behaviors to shape their algorithmic identities to align with how they understood themselves, as well as to resist the suppression of marginalized social identities and the lack of algorithmic privilege through individual actions, collective actions, and altered performances. We theorize our findings to detail the ways the platform's algorithm and its users co-produce knowledge of identity on the platform.
We argue that the relationship between users' algorithmic folk theories and identity is consequential for social media platforms, as it impacts users' experiences, behaviors, sense of belonging, and perceived ability to be seen, heard, and valued by others as mediated through algorithmic systems.
Each new emergent technology within the larger landscape of information and communication technology (ICT) innovation is met with a combination of awe and paranoia, imagined as simultaneously liberating and threatening to user privacy. While it is easy to hold both views as true within most ICT-based contexts, what is overlooked are the inherent presumptions about who uses ICTs and how such practices define social perceptions of any particular ICT's value. TikTok is currently one of the most discussed ICTs. Given the popular imagining of TikTok as having an algorithm that is hyper-adaptive to any user, theorizing and studying how one might use the platform in cross-group information practices remain chronically underexplored. Further, given the application's abundance of youth users, such practices become discursively marked as dangerous and inherently negative. Such presumptions belie the actual practices of this particular subset of users. For example, the linguistic practices of queer-identified youth popularized tagging practices on microblogging sites like Tumblr (Oakley, 2016), and femme/female teenagers utilize platforms such as YouTube to explore complex narratives around self-esteem on topics ranging from eating disorders to transitioning (Holmes, 2017; Tortajada et al., 2021). Important in these studies are the inherent counter-discursive practices wielded by femme/feminine and gender-diverse youth, whose occupation of these online spaces affords them the ability to explore topics in affirmative and communally protective ways. This proposed panel builds on this understanding by highlighting the work of two scholars studying TikTok while simultaneously framing their work within a more extensive consideration of scholarship on the intersections of gender identity, information practices, and ICTs. SIG convenor Dr.
Travis Wagner will moderate the panel, contextualizing the emergence of TikTok within a more extensive history of ICTs and identifying how gender-based identity making in such spaces has been imagined to date. This setup will then introduce the research of two doctoral students studying the information practices of gender-diverse youth within ICTs like TikTok. First, Dan Delmonaco will discuss their research on healthcare providers' use of TikTok to create and share sexual and reproductive health information among gender-diverse youth populations. Second, Valerie Lookingbill will explore sociocultural factors shaping users' interactions with stigmatized mental health information on social media platforms. She will specifically examine how femme/female youth navigate TikTok to discuss non-suicidal self-injury in response to algorithmic exclusion. Cognizant that exploring counter-discursive uses of TikTok cannot ignore the larger issues of privacy, data ethics, and the intersectional components of identity as shaped by technology, the panel will conclude with a response to the presentations by Dr. Miriam Sweeney. Her work explores identity-making across the landscape of data and technology and centers the role of interface design in identity-making. Sweeney will respond to the themes connecting Delmonaco's and Lookingbill's work and offer insight into broader trends on the horizon of ICT-based information practices research and how they might inform LIS pedagogy and questions of inclusive ICT design.