Proceedings of the Twenty-Sixth Annual ACM-SIAM Symposium on Discrete Algorithms (SODA 2015)
DOI: 10.1137/1.9781611973730.34

Approximate resilience, monotonicity, and the complexity of agnostic learning

Abstract: A function f is d-resilient if all its Fourier coefficients of degree at most d are zero, i.e., f is uncorrelated with all low-degree parities. We study the notion of approximate resilience of Boolean functions, where we say that f is α-approximately d-resilient if f is α-close to a [−1, 1]-valued d-resilient function in ℓ1 distance. We show that approximate resilience essentially characterizes the complexity of agnostic learning of a concept class C over the uniform distribution. Roughly speaking, if all func…
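To make the definition concrete, here is a minimal brute-force sketch (our own illustration, not from the paper; the helper names fourier_coefficient and is_d_resilient are hypothetical) that checks d-resilience of a Boolean function over the uniform distribution on {−1, 1}^n. It enumerates all 2^n points, so it is only feasible for small n.

```python
from itertools import combinations, product

def fourier_coefficient(f, n, S):
    """hat{f}(S) = E_x[f(x) * chi_S(x)] over the uniform distribution
    on {-1,+1}^n, where chi_S(x) = prod_{i in S} x_i is a parity."""
    total = 0.0
    for x in product((-1, 1), repeat=n):
        chi = 1
        for i in S:
            chi *= x[i]
        total += f(x) * chi
    return total / 2 ** n

def is_d_resilient(f, n, d, tol=1e-9):
    """True iff every Fourier coefficient of degree <= d is (numerically) zero,
    i.e. f is uncorrelated with all parities on at most d variables."""
    for k in range(d + 1):
        for S in combinations(range(n), k):
            if abs(fourier_coefficient(f, n, S)) > tol:
                return False
    return True

# Example: the parity of all n bits is (n-1)-resilient, since its only
# nonzero Fourier coefficient is the one of degree n.
n = 4
parity = lambda x: x[0] * x[1] * x[2] * x[3]
print(is_d_resilient(parity, n, d=3))  # True
```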

Cited by 11 publications (16 citation statements) | References 30 publications
“…Closer to our work, [1,4,7] provide positive learning results using gradient-based algorithms, but do not show the benefit of a convolutional architecture for optimization performance compared to a fully-connected architecture. The hardness of learning Boolean functions, in terms of the degree of the target function, was discussed in the statistical queries literature, for instance in [6]. In terms of techniques, our construction is inspired by target functions proposed in [18,19], and based on ideas from the statistical queries literature (e.g.…”
Section: Related Work (mentioning, confidence: 99%)
“…This is implicit in prior work [Kalai et al., 2008, Feldman, 2012] and we provide additional details in Section 5. Dachman-Soled et al. [2015] recently showed that ℓ1 approximation by polynomials is a necessary and sufficient condition for agnostic learning over a product distribution (at least in the statistical query framework of Kearns [1998]). Our agnostic learning algorithm (Theorem 1.4) and lower bound for polynomial approximation (Theorem 1.1) demonstrate that this equivalence does not hold for non-product distributions.…”
Section: Corollary 1.4: There Is an Algorithm That Agnostically Learns… (mentioning, confidence: 99%)
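The ℓ1-approximation quantity appearing in this characterization can itself be computed by a linear program. Below is a minimal sketch (our own illustration, assuming numpy and scipy are available; best_l1_degree_d is a hypothetical helper, not an API from the cited works) that finds the best ℓ1 approximation of a Boolean function by a degree-d polynomial over the uniform distribution. The LP has one free variable per monomial of degree at most d and one slack variable per point of the cube, so it is only practical for small n.

```python
from itertools import combinations, product
import numpy as np
from scipy.optimize import linprog

def best_l1_degree_d(f, n, d):
    """Minimal E_x |f(x) - p(x)| over degree-d polynomials p, via an LP."""
    xs = list(product((-1, 1), repeat=n))
    sets = [S for k in range(d + 1) for S in combinations(range(n), k)]
    # Design matrix: chi_S(x) for each point x and each monomial S.
    A = np.array([[np.prod([x[i] for i in S]) for S in sets] for x in xs])
    fvals = np.array([f(x) for x in xs], dtype=float)
    m, N = len(sets), len(xs)
    # Variables: m free coefficients c_S, then N slacks t_x >= |f(x) - p(x)|.
    c = np.concatenate([np.zeros(m), np.ones(N) / N])  # minimize mean slack
    A_ub = np.block([[ A, -np.eye(N)],    #  p(x) - t_x <=  f(x)
                     [-A, -np.eye(N)]])   # -p(x) - t_x <= -f(x)
    b_ub = np.concatenate([fvals, -fvals])
    bounds = [(None, None)] * m + [(0, None)] * N
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.fun

# Example: MAJ3 has a nonzero degree-3 Fourier coefficient, so its best
# degree-1 l1 approximation error is strictly positive.
maj3 = lambda x: 1 if sum(x) > 0 else -1
print(best_l1_degree_d(maj3, n=3, d=1))
```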
“…For example, if the unknown function is Boolean then learning with ℓ1 error is equivalent to learning with Boolean disagreement error [KKMS08]. In fact, it is known that the complexity of agnostic learning over product distributions in the statistical query model is characterized by how well the Boolean functions can be approximated in ℓ1 by low-degree polynomials [DSFTWW15]. Applications of learning algorithms for submodular functions to differentially-private data release require ℓ1 error [GHRU11; CKKL12; FK14], as does learning of probabilistic concepts (which are concepts expressing the probability of an event) [KS94].…”
Section: Introduction (mentioning, confidence: 99%)
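As a one-line sanity check (our own worked calculation, not taken from [KKMS08]): for ±1-valued f and h, the pointwise gap |f(x) − h(x)| is 0 where they agree and 2 where they disagree, so ℓ1 error and disagreement error coincide up to a factor of 2.

```latex
% For Boolean f, h : {-1,1}^n -> {-1,1}, |f(x) - h(x)| takes values in {0, 2}:
\[
  \mathbb{E}_{x}\bigl[\lvert f(x) - h(x) \rvert\bigr]
    \;=\; 2 \, \Pr_{x}\bigl[f(x) \neq h(x)\bigr],
\]
% hence minimizing l1 error is equivalent to minimizing Boolean
% disagreement error, up to the constant factor 2.
```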