2019
DOI: 10.1007/s00521-019-04144-6

Assessing gender bias in machine translation: a case study with Google Translate

Abstract: Recently there has been a growing concern in academia, industrial research labs and the mainstream commercial media about the phenomenon dubbed machine bias, where trained statistical models, unbeknownst to their creators, grow to reflect controversial societal asymmetries, such as gender or racial bias. A significant number of Artificial Intelligence tools have recently been suggested to be harmfully biased towards some minority, with reports of racist criminal behavior predictors, Apple's iPhone X failing…

Cited by 206 publications (156 citation statements)
References 18 publications
“…Prates et al. (2019) have argued that, given well-attested gender asymmetries in society, a 50:50 pronominal split in NMT outputs is unrealistic when sentences in gender-neutral languages (e.g., Finnish, Hungarian, Turkish) are translated into gendered languages (e.g., English, French, German). Nonetheless, they provide experimental evidence which suggests that Google Translate yields masculine defaults much more frequently than would be expected from demographic data alone (Prates et al. 2019). Other researchers have developed techniques for mitigating biases in monolingual English NLP tools, with a handful of techniques applied to the more complex problem of inflected languages.…”
Section: Gender Bias in NMT Systems
confidence: 99%
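The comparison described in this citation statement, translating gender-neutral sentences and measuring how often the output defaults to a masculine pronoun against a demographically expected share, can be illustrated with a minimal sketch. The `translate` function below is a hypothetical placeholder for whichever MT system is being audited, and the Hungarian occupation list and expected share are illustrative assumptions, not data from Prates et al. (2019).

```python
# Sketch: estimate the masculine-default rate of an MT system on
# gender-neutral source sentences and compare it with an expected
# share derived from demographic (e.g., labour-force) statistics.
from collections import Counter


def translate(sentence: str, src: str = "hu", tgt: str = "en") -> str:
    """Placeholder for a call to the MT system under test (hypothetical)."""
    raise NotImplementedError("plug in the MT backend being audited")


def masculine_default_rate(occupations, template="ő egy {}."):
    """Share of translations rendered with a masculine pronoun."""
    counts = Counter()
    for occ in occupations:
        output = translate(template.format(occ)).lower()
        if output.startswith("he "):
            counts["masculine"] += 1
        elif output.startswith("she "):
            counts["feminine"] += 1
        else:
            counts["other"] += 1
    total = sum(counts.values())
    return counts["masculine"] / total if total else 0.0


# Illustrative usage (figures are assumptions, not the paper's data):
occupations = ["orvos", "ápoló", "mérnök", "tanár"]  # doctor, nurse, engineer, teacher
expected_masculine_share = 0.55  # e.g., share of men in these occupations
# observed = masculine_default_rate(occupations)
# print(f"observed {observed:.2f} vs expected {expected_masculine_share:.2f}")
```

A gap between the observed rate and the expected share is the kind of evidence the cited study points to when arguing that masculine defaults exceed what demographics alone would predict.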
“…Many languages such as Hungarian or Chinese are gender neutral. Yet Google Translate exhibited a strong bias toward the masculine when translating certain sentences into English (such as "he is a doctor"; Prates et al. 2020). Google has since successfully updated its system to provide both feminine and masculine translations (Johnson 2017).…”
Section: Responsible Service Analytics
confidence: 99%
“…Cross-cultural work often uses translations of English lexicons (e.g., to evaluate scenic quality of landscape objects across countries), implicitly promoting the idea that meanings encoded in a lexicon are universal [4]. Such approaches are becoming more common with the availability of machine translation algorithms based on large-scale text analysis [5][6][7][8][9][10]. However, by assuming one-to-one translatability of words, these approaches do not capture cross-linguistic variability in word meaning, which has been shown to exist for a variety of domains, even when comparing closely related languages [11,12].…”
Section: Introduction
confidence: 99%
“…However, by assuming one-to-one translatability of words, these approaches do not capture cross-linguistic variability in word meaning, which has been shown to exist for a variety of domains, even when comparing closely related languages [11,12]. Moreover, machine translation algorithms have also been shown to introduce biases present in the original corpora, for example assigning gender to nouns in stereotypical ways when translating from languages where gendering is not typical [6,13]. In the case of landscape, this could mean that specific cultural ways of thinking about landscape are imposed upon other cultures.…”
Section: Introduction
confidence: 99%