2021
DOI: 10.48550/arxiv.2106.02346
Preprint

Provably Strict Generalisation Benefit for Invariance in Kernel Methods

Abstract: It is a commonly held belief that enforcing invariance improves generalisation. Although this approach enjoys widespread popularity, it is only very recently that a rigorous theoretical demonstration of this benefit has been established. In this work we build on the function space perspective of Elesedy and Zaidi [8] to derive a strictly non-zero generalisation benefit of incorporating invariance in kernel ridge regression when the target is invariant to the action of a compact group. We study invariance enfor…

Cited by 1 publication (3 citation statements). References 17 publications.
“…Recent results show how to compute explicitly the gains in terms of generalization gaps and sample complexity of imposing group invariances and equivariances in machine learning. The results from Mei et al (2021) and Bietti et al (2021) hold for finite groups, whereas the results from Elesedy and Zaidi (2021) and Elesedy (2021) hold for general compact groups. Specifically, Elesedy and Zaidi (2021) shows that if we are aiming to learn a target invariant function f * from samples from an invariant distribution µ, then for any estimator f , the projection of f onto the space of invariant functions has smaller expected error.…”
Section: The Geometry of Invariant Function Spaces
confidence: 94%
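The result paraphrased in the quotation above can be sketched compactly. As an illustrative summary (symbols follow the quotation, not necessarily the paper's exact notation): with $\lambda$ the Haar measure on a compact group $G$ acting on the input space, the projection of an estimator $f$ onto the space of invariant functions is given by group averaging,

```latex
\bar{f}(x) = \int_G f(gx)\, d\lambda(g).
```

When the target $f^*$ is $G$-invariant and the data distribution $\mu$ is $G$-invariant, the squared-error risk decomposes orthogonally,

```latex
\mathbb{E}_{X \sim \mu}\big[(f(X) - f^*(X))^2\big]
  = \mathbb{E}\big[(\bar{f}(X) - f^*(X))^2\big]
  + \mathbb{E}\big[(f(X) - \bar{f}(X))^2\big],
```

so the projection $\bar{f}$ never has larger expected error than $f$, with strict improvement whenever $f$ has a non-zero anti-symmetric component.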
“…A recent line of work develops mathematical theory to quantify the benefits of imposing symmetries and group equivariance in regression problems. The work of Elesedy and Zaidi (2021) provides a computation of the excess risk in the context of linear regression, then extended to kernel regressions in Elesedy (2021). The works of Bietti et al (2021) and Mei et al (2021) focus on quantifying the sample complexity gain for regressions with (finite) group-invariant kernels.…”
Section: Our Contributions
confidence: 99%
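The group-invariant kernels mentioned above can be constructed by averaging a base kernel over a finite group. The following is a minimal sketch (not code from any of the cited papers; all function names are hypothetical): a Gaussian kernel on the real line is symmetrised over the sign-flip group {+1, -1}, then used in standard kernel ridge regression on a sign-invariant target.

```python
# Illustrative sketch: a group-invariant kernel via group averaging,
# plugged into kernel ridge regression. Names here are hypothetical.
import numpy as np

def rbf(x, y, gamma=1.0):
    # Base Gaussian (RBF) kernel matrix between 1-D sample arrays x and y.
    return np.exp(-gamma * (x[:, None] - y[None, :]) ** 2)

def invariant_rbf(x, y, gamma=1.0):
    # Average the base kernel over the group G = {+1, -1} acting by sign
    # flip. Since the RBF kernel depends only on x - y, averaging over one
    # argument suffices: k_G(x, y) = (k(x, y) + k(x, -y)) / 2.
    return 0.5 * (rbf(x, y, gamma) + rbf(x, -y, gamma))

def krr_fit_predict(k_train, k_test, y_train, lam=1e-3):
    # Standard kernel ridge regression: alpha = (K + lam * I)^{-1} y.
    alpha = np.linalg.solve(k_train + lam * np.eye(len(y_train)), y_train)
    return k_test @ alpha

rng = np.random.default_rng(0)
x_tr = rng.uniform(-3, 3, 40)
y_tr = np.cos(x_tr)              # target invariant under x -> -x
x_te = np.linspace(-3, 3, 200)

pred = krr_fit_predict(invariant_rbf(x_tr, x_tr),
                       invariant_rbf(x_te, x_tr), y_tr)

# Predictions from the averaged kernel are exactly sign-invariant:
pred_flipped = krr_fit_predict(invariant_rbf(x_tr, x_tr),
                               invariant_rbf(-x_te, x_tr), y_tr)
assert np.allclose(pred, pred_flipped)
```

The key design point is that symmetrising the kernel restricts the hypothesis space to invariant functions, which is the mechanism the quoted results analyse when quantifying the generalisation gain.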