Companion Proceedings of the 2019 World Wide Web Conference 2019
DOI: 10.1145/3308560.3317584
Achieving Differential Privacy and Fairness in Logistic Regression

Cited by 52 publications (58 citation statements)
References 14 publications
“…Also, computer science research suggests that including specifically sensitive data such as gender, age, race, and religion would help create fair algorithms (nondiscriminatory on the grounds of these data) (Galdon Clavell et al 2020), which is also in conflict with protecting and respecting the privacy of consumers. How to meet both privacy and fairness requirements simultaneously in algorithms is an emerging field in computer science (Xu, Yuan, and Wu 2019). However, this question also needs to be addressed by advertising scholars.…”
Section: Tension Between the Advertising Industry And Consumer Advoca…
confidence: 99%
“…Recently, researchers have attempted to adopt differential privacy to simultaneously achieve both fairness and privacy preservation [52], [53]. This research is motivated by settings where models are required to be non-discriminatory in terms of certain attributes, but these attributes may be sensitive and so must be protected while training the model [54].…”
Section: Applying Differential Privacy To Improve Fairness
confidence: 99%
“…Many works [33][34][35][36][37][38][39][40][41] have tried to address fairness and privacy guarantees together. Kilbertus et al [34] is one of the first proposals that addressed the need for combining fairness requirements with privacy guarantees.…”
Section: Related Work
confidence: 99%
“…They provide a different approach based on Differential Privacy, where privacy is guaranteed through an injectable amount of noise able to mask the presence of a protected individual in a particular dataset. In this direction, other works [36][37][38][39] aimed to learn fair and differentially private ML models. Cummings et al [36], while showing that it is impossible to achieve both differential privacy and exact fairness with non-trivial accuracy, provides a Probably Approximately Correct [107] learner that is differentially private and approximately (with high probability) fair.…”
Section: Related Work
confidence: 99%
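The noise-injection idea described in the citation statement above can be sketched with the standard Laplace mechanism. This is a minimal illustration, not code from the cited paper; the function name and the example count are hypothetical, and the only assumption is the textbook calibration of Laplace noise to sensitivity/epsilon.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value plus Laplace noise with scale sensitivity/epsilon.

    The noise masks the presence or absence of any single individual:
    adding or removing one record shifts the output distribution by at
    most a factor of exp(epsilon), which is the differential privacy
    guarantee referenced in the text.
    """
    rng = rng if rng is not None else np.random.default_rng()
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

# Example: privately release how many records carry a protected attribute.
# Adding or removing one person changes the count by at most 1, so the
# sensitivity is 1. The count below is purely illustrative.
protected_count = 137
noisy_count = laplace_mechanism(protected_count, sensitivity=1.0, epsilon=0.5)
```

A smaller epsilon gives a stronger privacy guarantee but noisier (less accurate) releases, which is exactly the privacy/accuracy tension that the Cummings et al. impossibility result formalizes for exact fairness.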