2021 IEEE/ACM 43rd International Conference on Software Engineering (ICSE)
DOI: 10.1109/icse43902.2021.00129
"Ignorance and Prejudice" in Software Fairness

Abstract: Machine learning software can be unfair when making human-related decisions, exhibiting prejudice against certain groups of people. Existing work primarily focuses on proposing fairness metrics and presenting fairness improvement approaches. It remains unclear how key aspects of any machine learning system, such as the feature set and training data, affect fairness. This paper presents results from a comprehensive study that addresses this problem. We find that enlarging the feature set plays a significant role in fairness…

Cited by 46 publications (32 citation statements). References 32 publications.
“…For example, a decision-making system should automatically measure discrimination (e.g., to ensure fairness in software crowdsourcing platforms). Fairness is considered important in software by other authors [37][38][39][40].…”
Section: Catalogue Consolidation
confidence: 99%
“…For example, for demographic parity, researchers calculate the favorable rate among different demographic groups and detect fairness violations by comparing these rates. If the rate difference, called Statistical Parity Difference (SPD) in the software fairness literature [35], [38], [48], [50], [118], is beyond a threshold, the software under test is identified as containing fairness bugs.…”
Section: Statistical Measurements as Test Oracles
confidence: 99%
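To make the SPD test oracle concrete, here is a minimal sketch of the check described in the statement above, assuming binary predictions (1 = favorable outcome) and a binary sensitive attribute; the 0.1 threshold is illustrative, not a value taken from the cited work.

```python
def statistical_parity_difference(y_pred, sensitive):
    """SPD = P(y_pred = 1 | unprivileged) - P(y_pred = 1 | privileged).

    y_pred: iterable of 0/1 predictions (1 = favorable outcome)
    sensitive: iterable of 0/1 group labels (1 = privileged group)
    Assumes both groups are non-empty.
    """
    priv = [y for y, s in zip(y_pred, sensitive) if s == 1]
    unpriv = [y for y, s in zip(y_pred, sensitive) if s == 0]
    return sum(unpriv) / len(unpriv) - sum(priv) / len(priv)

def violates_demographic_parity(y_pred, sensitive, threshold=0.1):
    # Flag a fairness bug when the favorable-rate gap exceeds the
    # threshold (the threshold value here is hypothetical).
    return abs(statistical_parity_difference(y_pred, sensitive)) > threshold
```

For instance, predictions [1, 1, 0, 0] with sensitive labels [1, 1, 0, 0] yield an SPD of -1.0, which any reasonable threshold would flag as a violation.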
“…Feature bias occurs when some features in the training data are highly related to the sensitive attribute, and these correlated features can thus become the root cause of software unfairness [50]. Zhang et al. [118] explored how the feature set influences ML software fairness. The results showed that the feature set plays a significant role in fairness, which motivates fairness work on testing data features.…”
Section: Data Testing
confidence: 99%
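One simple way to operationalize “highly related to the sensitive attribute” is to screen each feature for association with that attribute. The sketch below uses Pearson correlation with a hypothetical 0.5 cutoff; this is only one of several possible association measures and is not the specific procedure used in the cited studies.

```python
import numpy as np

def find_proxy_features(X, sensitive, feature_names, cutoff=0.5):
    """Flag features whose |Pearson correlation| with the sensitive
    attribute exceeds the cutoff (candidate proxy features).

    X: array of shape (n_samples, n_features)
    sensitive: binary array of length n_samples
    cutoff: hypothetical screening threshold
    """
    flagged = []
    for j, name in enumerate(feature_names):
        r = np.corrcoef(X[:, j], sensitive)[0, 1]
        if abs(r) > cutoff:
            flagged.append((name, float(r)))
    return flagged
```

Features flagged this way are candidates for the root-cause analysis the statement describes, since they can leak the sensitive attribute even when that attribute is removed from the feature set.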
“…In 2018, Brun and Meliou [9] stated that ensuring the fairness of software systems (software fairness) is a critical software engineering problem to be tackled from multiple directions, and since then it has gained more and more attention from software engineering research [10], [11], [12], [13], [14], [15].…”
Section: Introduction
confidence: 99%