Proceedings of the 4th Conference on Innovations in Theoretical Computer Science 2013
DOI: 10.1145/2422436.2422450
Characterizing the sample complexity of private learners

Abstract: In 2008, Kasiviswanathan et al. defined private learning as a combination of PAC learning and differential privacy [17]. Informally, a private learner is applied to a collection of labeled individual information and outputs a hypothesis while preserving the privacy of each individual. Kasiviswanathan et al. gave a generic construction of private learners for (finite) concept classes, with sample complexity logarithmic in the size of the concept class. This sample complexity is higher than what is needed for no…
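The generic construction the abstract refers to instantiates the exponential mechanism over the finite concept class, which is where the log|C| dependence comes from: roughly m = O((log|C| + log(1/β)) / (εα)) samples suffice for an (α, β)-accurate, ε-differentially private PAC learner. The following minimal Python sketch illustrates that idea; the function name private_learn and the toy class of threshold functions are assumptions made here for illustration, not code or notation from the paper.

    import math
    import random

    def private_learn(samples, concept_class, epsilon):
        """Choose a hypothesis via the exponential mechanism.

        samples       -- list of (x, y) pairs, labels y in {0, 1}
        concept_class -- dict mapping hypothesis name -> predicate on x
        epsilon       -- differential-privacy parameter
        """
        # Utility = number of correctly labeled examples. Changing one
        # sample moves this by at most 1 (sensitivity 1), so sampling h
        # with probability proportional to exp(epsilon * utility(h) / 2)
        # is epsilon-differentially private.
        def utility(h):
            return sum(1 for x, y in samples if h(x) == y)

        names = list(concept_class)
        scores = [epsilon * utility(concept_class[n]) / 2.0 for n in names]
        top = max(scores)  # subtract the max to keep exp() numerically stable
        weights = [math.exp(s - top) for s in scores]
        return random.choices(names, weights=weights, k=1)[0]

    # Toy usage: thresholds over {0, ..., 63}; the target threshold is 23.
    thresholds = {t: (lambda x, t=t: int(x >= t)) for t in range(65)}
    data = [(x, int(x >= 23)) for x in random.sample(range(64), 40)]
    print(private_learn(data, thresholds, epsilon=1.0))

With enough samples relative to the log of the class size (here log 65 ≈ 6 bits), the mechanism concentrates on hypotheses with small empirical error, while any single record changes each hypothesis's selection probability by at most a factor of e^ε.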

Cited by 66 publications (124 citation statements)
References 24 publications (60 reference statements)
“…There are also several rich lines of work attempting to give tight instance-specific characterizations of the sample complexities of various differentially private computations, most notably linear query release [HT10, BDKT12, NTZ16, Nik15, KN16] and PAC and agnostic learning [KLN+08, BNS13, FX14]. The problems considered in these works are arguably more complex than the hypothesis testing problems we consider here, the characterizations are considerably looser, and are only optimal up to polynomial factors.…”
Section: Related Work
confidence: 99%
“…They also showed that there exists an improper ε-private learner for this class, with sample complexity O_{α,β,ε}(1). An alternative private learner for this class was presented in [4].…”
Section: Concentration Bounds
confidence: 99%
“…There exists an efficient (α, β, ε, δ, m)-PPAC proper-learner for RECTANGLE^n_d, where […]. This should be contrasted with Θ_{α,β}(n), which is the non-private sample complexity for this class (as the VC-dimension of RECTANGLE^n_d is 2n), and with Θ_{α,β,ε}(nd), which is the pure-private sample complexity for this class.…”
Section: Axis-Aligned Rectangles in High Dimension
confidence: 99%
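A quick sanity check of the contrast drawn in the excerpt above, under the assumption (not stated in the excerpt) that each of the n axes ranges over a grid of 2^d points:

    VC(RECTANGLE^n_d) = 2n          ⇒  non-private sample complexity Θ_{α,β}(n)
    log_2 |RECTANGLE^n_d| ≈ 2nd     ⇒  generic pure-private upper bound O_{α,β,ε}(nd)

The gap the excerpt highlights is thus exactly the gap between the VC-dimension bound and the log-of-class-size bound discussed in the abstract above.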
“…Therefore, it will be interesting to find a natural invariant that characterizes which classes can be learned privately (like the way the VC dimension characterizes PAC learning [Blumer et al., 1989, Vapnik and Chervonenkis, 1971]). Such parameters exist in the case of pure differentially private learning; these include the one-way communication complexity characterization by Feldman and Xiao [2015] and the representation dimension by Beimel et al. [2013]. However, no such parameter is known for approximate differentially private learning.…”
confidence: 99%
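For reference, the sense in which the VC dimension characterizes (non-private) PAC learning, as mentioned in the excerpt, is the classical realizable-case sample-complexity bound (standard background, not a claim taken from the excerpt):

    m(α, β) = O( (VC(C) · log(1/α) + log(1/β)) / α )   (upper bound)
    m(α, β) = Ω( (VC(C) + log(1/β)) / α )              (lower bound)

The excerpt's point is that representation dimension and one-way communication complexity play this role for pure differential privacy, while no analogous parameter was known for approximate differential privacy.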