Proceedings of the 9th International Conference on Learning Analytics & Knowledge (LAK 2019)
DOI: 10.1145/3303772.3303791

Evaluating the Fairness of Predictive Student Models Through Slicing Analysis

Cited by 111 publications (97 citation statements)
References 20 publications
“…Generally, separation requires that an algorithm's performance, not just its predictions, be fair across groups, or stated another way, that correct and incorrect predictions are distributed equally in relation to the groups under consideration. This criterion has generated several attempts at precise formulation, from equal opportunity/equalized odds (Hardt et al., 2016) to slicing analysis (Gardner et al., 2019) to predictive parity (Chouldechova, 2017).…”
Section: Formal Fairness and Its Application (In A Messy World)
Citation type: mentioning | Confidence: 99%
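For readers unfamiliar with the separation criterion named in the statement above, it can be stated compactly. The following is a minimal sketch in standard notation (the symbols Ŷ, Y, and A are assumed here, not drawn from the quoted text): a classifier Ŷ satisfies separation (equalized odds) when its predictions are independent of the group attribute A conditional on the true label Y.

```latex
% Separation / equalized odds (Hardt et al., 2016), for binary prediction:
% the group-conditional positive-prediction rate must match at each true label.
\Pr(\hat{Y} = 1 \mid Y = y, A = a) \;=\; \Pr(\hat{Y} = 1 \mid Y = y, A = b)
\qquad \text{for all groups } a, b \ \text{and each } y \in \{0, 1\}.
```

Equivalently, true-positive and false-positive rates must agree across groups, which is why the statement describes separation as equal distribution of correct and incorrect predictions.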
“…For example, the widely used demographic parity is defined such that the probability of yielding a positive prediction should be the same across protected groups (e.g., groups defined by gender or race). However, demographic parity is not ideal in that (1) it only ensures aggregated fairness while ignoring individual fairness and (2) it does not reflect the true tendency of a disadvantaged group [14,17,19]. For example, female students may enroll in various courses to find the best fit and thus tend to drop out at the start of courses, after final course schedules have been planned [21].…”
Section: Equalized Odds
Citation type: mentioning | Confidence: 99%
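To make the contrast in this statement concrete, here is a minimal Python sketch (hypothetical data and function names, not taken from the cited papers) that computes a demographic-parity gap, which compares prediction rates alone, alongside an equalized-odds gap, which conditions on the true label as separation requires:

```python
import numpy as np

def demographic_parity_gap(y_pred, group):
    """Absolute difference in positive-prediction rates between
    the two groups (assumes a binary group attribute)."""
    g0, g1 = np.unique(group)
    return abs(y_pred[group == g0].mean() - y_pred[group == g1].mean())

def equalized_odds_gap(y_true, y_pred, group):
    """Worst-case difference, over true labels y in {0, 1}, of the
    group-conditional positive-prediction rates (separation)."""
    g0, g1 = np.unique(group)
    gaps = []
    for y in (0, 1):
        mask = y_true == y
        gaps.append(abs(y_pred[mask & (group == g0)].mean()
                        - y_pred[mask & (group == g1)].mean()))
    return max(gaps)

# Hypothetical example: binary dropout predictions sliced by a
# binary demographic attribute (0/1).
y_true = np.array([0, 1, 1, 0, 1, 0, 1, 1])
y_pred = np.array([0, 1, 1, 1, 0, 0, 1, 1])
group  = np.array([0, 0, 0, 0, 1, 1, 1, 1])

print(demographic_parity_gap(y_pred, group))      # 0.25
print(equalized_odds_gap(y_true, y_pred, group))  # 0.5
```

A model can close the demographic-parity gap while the equalized-odds gap stays large, which illustrates why the quoted passage treats demographic parity as insufficient: equal prediction rates say nothing about how errors are distributed across groups.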
“…However, limited attention has been paid to the fairness of prediction with ML in educational settings [14]. Studies have shown that ML models can be biased by demographic factors such as gender [4,37].…”
Section: Introduction
Citation type: mentioning | Confidence: 99%
“…It would probably not be an overstatement to say that fairness in AI is quickly becoming its own sub-field, with a new annual ACM conference on Fairness, Accountability, and Transparency having been inaugurated in 2018, and relevant research appearing at many impactful publication venues, such as Science (Caliskan et al., 2017), NIPS (Pleiss et al., 2017; Kim et al., 2018), ICML (Kearns et al., 2018), ACL (Hovy and Spruit, 2016; Sun et al., 2019; Sap et al., 2019), KDD (Speicher et al., 2018), AAAI (Zhang and Bareinboim, 2018), and others (Dwork et al., 2012; Hajian and Domingo-Ferrer, 2013). There is also recent work that examines fairness and ethical considerations when using AI in education (Mayfield et al., 2019; Gardner et al., 2019).…”
Section: Increased Attention To Fairness
Citation type: mentioning | Confidence: 99%