2014
DOI: 10.1257/aer.104.5.431

Mechanism Design in Large Games: Incentives and Privacy

Abstract: We study the design of mechanisms satisfying a novel desideratum: privacy. This requires that the mechanism not reveal 'much' about any agent's type to other agents. We propose the notion of joint differential privacy, a variant of differential privacy from the privacy literature. We show by construction that mechanisms satisfying our desideratum exist when there are a large number of players and any player's action affects any other's payoff by at most a small amount. Our results imply that in large economies, …
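For readers unfamiliar with the definitions, the following is a rough sketch of differential privacy and of the joint variant the abstract proposes, written in generic notation rather than the paper's exact formalism:

```latex
% Sketch in generic notation (an assumption about the formalism, not quoted from the paper).
% Standard \varepsilon-differential privacy: for any two type profiles t, t' differing only
% in agent i's report, and any set S of joint outcomes,
\[
  \Pr\big[M(t) \in S\big] \;\le\; e^{\varepsilon}\,\Pr\big[M(t') \in S\big].
\]
% Joint differential privacy relaxes this by constraining only the outputs shown to
% the agents other than i, written M(t)_{-i}:
\[
  \Pr\big[M(t)_{-i} \in S\big] \;\le\; e^{\varepsilon}\,\Pr\big[M(t')_{-i} \in S\big].
\]
```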

Cited by 83 publications (171 citation statements). References 29 publications.
“…that represents the expenditure by the advertiser required to generate a contact of intensity x. The complete-information profits from generating a contact of … [footnote 12:] Kearns et al (2014) study the design of mechanisms that satisfy the computer science criterion of differential privacy (Dwork 2006): put simply, the notion of being able to distinguish one agent (a consumer) from another in a dataset of consumer characteristics with only a low probability. They show that mechanisms can be designed to satisfy a variant of this criterion when there are large numbers of agents, and any agent's action affects another agent's payoff by at most a small amount.…”
Section: Data Intermediaries
confidence: 99%
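To make the differential-privacy criterion described in the excerpt concrete, here is a minimal Python sketch of the standard Laplace mechanism. This is an illustrative example under simplifying assumptions, not the mechanism constructed in the paper: noise scaled to the sensitivity of a count keeps the output distribution nearly unchanged when any single consumer's record is swapped, which is exactly the "low probability of distinguishing one agent" guarantee.

```python
import numpy as np

def laplace_count(records, predicate, epsilon, rng=None):
    """Release an epsilon-differentially private count of records matching `predicate`.

    Changing any single record moves the true count by at most 1 (sensitivity 1),
    so Laplace noise with scale 1/epsilon masks any one consumer's presence.
    """
    rng = rng if rng is not None else np.random.default_rng()
    true_count = sum(1 for r in records if predicate(r))
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Hypothetical usage: count consumers whose contact intensity exceeds 0.5.
consumers = [{"intensity": x} for x in (0.2, 0.9, 0.7, 0.1, 0.8)]
noisy = laplace_count(consumers, lambda r: r["intensity"] > 0.5, epsilon=0.5)
print(f"noisy count: {noisy:.2f}")
```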
“…We expect that this will be a useful point of view for other problems. In this direction, it is known how to privately compute equilibria in certain types of multi-player games [KPRU12]. Is there a useful way to use this multi-player generalization when solving problems in private data release, and what does it mean for privacy?…”
Section: Results
confidence: 99%
“…Our definition can be seen as an instantiation of this general framework. This is in contrast to other kinds of relaxations of differential privacy, which relax the worst-case assumptions on the prior beliefs of an attacker as in Bassily et al (12), or the worst-case collusion assumptions on collections of data analysts as in Kearns et al (13). Several works have also proposed assigning different differential privacy parameters to different individuals (see, e.g., ref.…”
Section: Significance
confidence: 99%
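The last point in this excerpt, assigning different privacy parameters to different individuals, can be sketched as follows; the notation is generic and not drawn from any one of the cited papers:

```latex
% Sketch, generic notation: each individual i receives their own budget \varepsilon_i.
% M satisfies (\varepsilon_1, \ldots, \varepsilon_n)-differential privacy if, for every i,
% every pair of datasets D, D' differing only in i's record, and every outcome set S,
\[
  \Pr\big[M(D) \in S\big] \;\le\; e^{\varepsilon_i}\,\Pr\big[M(D') \in S\big].
\]
```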