2016
DOI: 10.7287/peerj.preprints.1733v2
Preprint
Gender differences and bias in open source: Pull request acceptance of women versus men

Abstract: Biases against women in the workplace have been documented in a variety of studies. This paper presents the largest study to date on gender bias, where we compare acceptance rates of contributions from men versus women in an open source software community. Surprisingly, our results show that women's contributions tend to be accepted more often than men's. However, women's acceptance rates are higher only when they are not identifiable as women. Our results suggest that although women on GitHub may be more comp…

Cited by 21 publications (22 citation statements)
References 2 publications
“…Social aspects of the contributor played a role in recent studies (e.g., Terrell et al., 2017; Ford et al., 2019), and we can confirm that there are at least a few consciously perceived differences, from the developer's point of view, as to whether the contributor is human or a bot.…”
Section: Results (supporting)
confidence: 78%
“…However, pull requests in that study were proposed manually and only a single time, after prior consultation with the project maintainers. Furthermore, we know that contributions are evaluated not only on their content but also on the social characteristics of the contributor (Terrell et al., 2017; Ford et al., 2019). In the case of contributing bots, identifying them as bots can be sufficient to observe a negative bias compared to contributions from humans (Murgia et al., 2016).…”
Section: Introduction (mentioning)
confidence: 99%
“…Recently, female software developers have reported not only sexual harassment [23] but also various technical biases. For example, women are often assigned menial tasks [25], and code commits from females are less likely to be accepted than those from males [65]. Marwick et al. claim that in the current software industry, females often feel like they "do not belong" and often struggle with an "imposter syndrome" 1 [49].…”
Section: Introduction (mentioning)
confidence: 99%
“…For this research, we mined the code review repositories of six popular open source software (OSS) projects and identified all the developers who have committed at least five code changes to those projects. Based on the approach adopted in a recent study [65], we developed a semi-automated methodology, followed by manual validation using social networks (i.e., LinkedIn, Google Plus, Facebook, GitHub, and Twitter), to identify the genders of the 'Non-casual developer' 2 . We applied SentiSE, a customized, state-of-the-art sentiment analysis tool for the SE domain, to identify the sentiment polarity of each code review comment.…”
Section: Introduction (mentioning)
confidence: 99%
“…Indeed, research indicates that subjective evaluations of fit often lead individuals to conclude that women are not well positioned to fill roles in historically male-dominated fields (Boring, Ottoboni, and Stark 2016; Cassese and Holman 2017; Foschi 2000; Moss-Racusin et al. 2012). By contrast, when evaluations focus on objective criteria, the contributions and qualifications of women are more likely to be recognized and valued (Goldin and Rouse 2000; Terrell et al. 2017).…”
(mentioning)
confidence: 99%