Abstract: Social media is a new public sphere where people can, in principle, communicate with each other regardless of their status. However, social categories like gender may still bias online communication, replicating offline disparities. Examining over 94,000 Twitter users, we investigate the association between perceived gender and measures of online visibility: how often Twitter users are followed, assigned to lists, and retweeted. Our analysis shows that users perceived as female experience a 'glass ceiling,' si…
“…In contrast, in our study we consider more properties of the nodes (such as the group they belong to and the visibility they obtain), in addition to characteristics such as node degree that have been previously studied. Nilizadeh et al (2016) were able to prove a glass ceiling effect in social networks. They investigated how perceived gender and online visibility can be linked, showing that users perceived as female experience a "glass ceiling" effect, similar to the one that makes it harder for women to reach higher positions in companies.…”
Section: Introduction (mentioning)
confidence: 94%
“…Homophily is a well-known phenomenon in network science and can be expressed as the tendency of people to connect to similar people, or in our case, of people in a group to connect to people in the same group. We measure homophily with respect to a random configuration, inspired by work analyzing dyadicity in signed networks (Park and Barabási 2007):…”
Section: Preliminaries (mentioning)
confidence: 99%
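The snippet above refers to the dyadicity measure of Park and Barabási (2007): the number of observed within-group edges divided by the number expected if edges were placed at random. A minimal sketch of that idea follows; the function name and the edge-list/label-dict input format are illustrative assumptions, not the cited authors' implementation.

```python
def dyadicity(edges, group):
    """Ratio of observed same-group edges (among nodes labeled 1)
    to the count expected under random edge placement.
    edges: list of (u, v) pairs; group: dict mapping node -> 0/1.
    Values above 1 indicate homophily within the labeled group."""
    n = len(group)                                   # total nodes
    n1 = sum(1 for g in group.values() if g == 1)    # nodes in the group
    m = len(edges)                                   # total edges
    m11 = sum(1 for u, v in edges
              if group[u] == 1 and group[v] == 1)    # within-group edges
    # Expected within-group edges if m edges fall uniformly at random
    # among the n*(n-1)/2 possible pairs: m * C(n1,2) / C(n,2).
    expected = m * (n1 * (n1 - 1)) / (n * (n - 1))
    return m11 / expected
```

For example, in a four-node graph with two group members and three edges, one of which is within-group, the expected count is 0.5, so the measure evaluates to 2.0, i.e. twice as many within-group ties as chance would predict.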
“…Similar findings were reported by Daly, Geyer, and Millen (2010), who conducted a large-scale user study on IBM's Social-Blue social network site. While these two works focus on the inequalities at the level of individual users, some authors have analysed a glass ceiling effect for women in social networks (Nilizadeh et al. 2016). For instance, Stoica, Riederer, and Chaintreau (2018) investigate the role of gender in organic and artificial growth of social networks, using a large social graph from Instagram, where women are the majority.…”
Evaluating (and mitigating) the potential negative effects of algorithms has become a central issue in computer science. While research on algorithmic bias in ranking systems has dealt with disparate exposure of products or individuals, less attention has been devoted to the analysis of the disparate exposure of subgroups of online users. In this paper, we investigate the visibility of minorities in people recommender systems in social networks. Specifically, we consider a bi-populated social network, i.e., a graph where the nodes belong to two different groups (majority and minority) and, by applying state-of-the-art people recommenders, we analyze how disparate visibility can be amplified or mitigated by different levels of homophily within each subgroup. We start our analysis on real-world social graphs, where the two subgroups are defined by sensitive demographic attributes such as gender or age. Our findings suggest that the way and the extent to which people recommenders produce disparate visibility on the two subgroups may depend in large part on the level of homophily within the subgroups. To verify these findings, we move our analysis to synthetic datasets, where we can control characteristics of the input social graph, such as the size of the minority and the level of homophily. Our results show that homophily plays a key role in promoting or reducing visibility for different subgroups under various combinations of dataset characteristics and recommendation algorithms.
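One common way to quantify the disparate visibility described in this abstract is to compare a group's share of recommendation slots with its share of the population. The sketch below is a generic formulation under that assumption, not necessarily the paper's exact definition; the function name and labels are illustrative.

```python
def disparate_visibility(recommendations, group):
    """Minority's share of recommendation slots minus its share of nodes.
    recommendations: list of recommended node ids, one entry per slot
                     (nodes may repeat if recommended to several users);
    group: dict mapping node -> 'min' or 'maj'.
    Negative values mean the minority is under-exposed relative to its
    population share; positive values mean it is over-exposed."""
    pop_share = sum(1 for g in group.values() if g == 'min') / len(group)
    rec_share = (sum(1 for r in recommendations if group[r] == 'min')
                 / len(recommendations))
    return rec_share - pop_share
```

For instance, if a minority makes up 25% of the nodes but receives none of the recommendation slots, the measure returns -0.25, flagging under-exposure.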
“…Fabbri et al. (2020) also investigate the effect of homophily on the visibility of minorities in people recommender systems, and find that homophily plays a key role in the disparate visibility of different groups. Nilizadeh et al. (2016) and Stoica and Chaintreau (2018) show that biased recommendations can induce a glass ceiling effect that negatively affects female users.…”
Section: Related Work (mentioning)
confidence: 99%
“…To the best of our knowledge, our work presents the first comprehensive study of accuracy disparity in existing link prediction algorithms. Furthermore, while existing fairness-aware link prediction methods (Masrour et al. 2020; Rahman et al. 2019; Nilizadeh et al. 2016; Lee et al. 2019; Karimi et al. 2018) are limited to specific link prediction algorithms, we design the first bias mitigation algorithm that is compatible with most existing link prediction algorithms. Our contributions are summarized as follows.…”
Link prediction has been widely applied in social network analysis. Despite its importance, link prediction algorithms can be biased by disfavoring the links between individuals in particular demographic groups. In this paper, we study one particular type of bias, namely, the bias in predicting inter-group links (i.e., links across different demographic groups). First, we formalize the definition of bias in link prediction by providing quantitative measurements of accuracy disparity, which measures the difference in prediction accuracy between inter-group and intra-group links. Second, we unveil the existence of bias in six existing state-of-the-art link prediction algorithms through extensive empirical studies over real-world datasets. Third, we identify the imbalanced density across intra-group and inter-group links in training graphs as one of the underlying causes of bias in link prediction. Fourth, based on the identified cause, we design a pre-processing bias mitigation method named FairLP to modify the training graph, aiming to balance the distribution of intra-group and inter-group links while preserving the network characteristics of the graph. FairLP is model-agnostic and thus compatible with any existing link prediction algorithm. Our experimental results on real-world social network graphs demonstrate that FairLP achieves a better trade-off between fairness and prediction accuracy than existing fairness-enhancing link prediction methods.
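The accuracy disparity this abstract defines, i.e. the gap in prediction accuracy between intra-group and inter-group links, can be sketched as below. This is a minimal illustration of the stated definition, assuming binary link predictions and labels; the function name and data layout are my own, not the paper's code.

```python
def accuracy_disparity(predictions, labels, inter_group):
    """Accuracy on intra-group links minus accuracy on inter-group links.
    predictions, labels: dicts mapping an edge (u, v) -> 0/1;
    inter_group: set of edges whose endpoints belong to different
    demographic groups. A positive result means inter-group links are
    predicted less accurately, i.e. the bias the paper studies."""
    def accuracy(edge_set):
        hits = sum(predictions[e] == labels[e] for e in edge_set)
        return hits / len(edge_set)

    inter = [e for e in predictions if e in inter_group]
    intra = [e for e in predictions if e not in inter_group]
    return accuracy(intra) - accuracy(inter)
```

A pre-processing mitigation in the spirit of FairLP would then rebalance the training graph so that intra-group and inter-group link densities are comparable before any predictor is trained, which is what makes the approach model-agnostic.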
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations–citations that display the context of the citation and describe whether the article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.