2021
DOI: 10.48550/arxiv.2106.07148
Preprint

On the Sample Complexity of Learning under Invariance and Geometric Stability

Abstract: Many supervised learning problems involve high-dimensional data such as images, text, or graphs. In order to make efficient use of data, it is often useful to leverage certain geometric priors in the problem at hand, such as invariance to translations, permutation subgroups, or stability to small deformations. We study the sample complexity of learning problems where the target function presents such invariance and stability properties, by considering spherical harmonic decompositions of such functions on the …
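To make the kind of invariance the abstract refers to concrete, the sketch below (not taken from the paper) builds a group-invariant kernel by averaging a base kernel over a finite group of input transformations. The RBF base kernel, the small permutation subgroup, and all names and parameter values are illustrative assumptions.

```python
import numpy as np
from itertools import permutations

# Base kernel: a Gaussian (RBF) kernel on R^d. The bandwidth `gamma` is a
# hypothetical choice, not a value taken from the paper.
def rbf_kernel(x, y, gamma=1.0):
    return np.exp(-gamma * np.sum((x - y) ** 2))

# Group-averaged kernel: K_G(x, y) = (1/|G|) * sum_{g in G} K(x, g.y).
# Averaging the base kernel over a finite group G of input transformations
# yields a kernel that is invariant to G acting on either input.
def invariant_kernel(x, y, group, base_kernel=rbf_kernel):
    return np.mean([base_kernel(x, y[g]) for g in group])

# Example: invariance to all permutations of 3 coordinates (a small
# permutation subgroup, used here only for illustration).
group = [np.array(p) for p in permutations(range(3))]
x = np.array([0.1, 0.5, -0.2])
y = np.array([0.5, -0.2, 0.1])  # a permuted copy of x

print(invariant_kernel(x, x, group))  # same value as the line below,
print(invariant_kernel(x, y, group))  # since x and y differ by a group element
```

In this construction, the effective hypothesis space only contains group-invariant functions, which is the mechanism by which invariance can reduce sample complexity in analyses of this kind.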

Cited by 3 publications (3 citation statements)
References 13 publications (31 reference statements)
“…Examples include fluid dynamics [20,24,29,51], molecular dynamics [1,43,58], quantum mechanics [25,26,48], etc. Theoretical studies have also been conducted to show the benefit of preserving symmetry during learning [3,12,21,30].…”
Section: Background and Related Work, 2.1 Neural Network and Symmetries (mentioning)
confidence: 99%
“…Such ideas can be generalized in terms of symmetries in the world [93]. Machine learning has developed many techniques to incorporate known invariances to the learning process [94][95][96][97], as well as mathematically quantifying how much one can gain by imposing them [98,99]. In fact, in many cases we may want to think about constraints themselves as something to be learned [100,101], a process that would unfold over evolutionary timescales for NIs.…”
Section: Constraints (mentioning)
confidence: 99%
“…Their work focuses on the benefits of group invariance in reducing the sample size. [10] considered general kernels that incorporate group invariance and derived a generalization bound based on counting the number of eigenfunctions of the kernel. [28,27] proposed a convolutional kernel network (CKN) that involves layers of patch extraction, convolution, and pooling.…”
Section: Related Work (mentioning)
confidence: 99%
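To illustrate the patch extraction, convolution, and pooling steps that this citation attributes to convolutional kernel networks, here is a minimal single-layer sketch. It substitutes random Fourier features for the exact kernel construction of [28,27], and every function name, size, and parameter below is an illustrative assumption rather than the cited method.

```python
import numpy as np

# A minimal sketch of one CKN-style layer (patch extraction -> kernel
# feature map -> pooling) for a single-channel image.
def ckn_layer(image, patch_size=3, n_features=64, pool=2, rng=None):
    rng = np.random.default_rng(rng)
    h, w = image.shape
    ph = h - patch_size + 1
    pw = w - patch_size + 1

    # 1) Patch extraction: all overlapping patch_size x patch_size patches.
    patches = np.stack([
        image[i:i + patch_size, j:j + patch_size].ravel()
        for i in range(ph) for j in range(pw)
    ])  # shape: (ph * pw, patch_size**2)

    # 2) "Convolution" as a nonlinear kernel feature map on each patch,
    #    here random Fourier features approximating a Gaussian kernel.
    W = rng.normal(size=(patches.shape[1], n_features))
    b = rng.uniform(0, 2 * np.pi, size=n_features)
    feats = np.cos(patches @ W + b) * np.sqrt(2.0 / n_features)
    feats = feats.reshape(ph, pw, n_features)

    # 3) Pooling: local averaging, which gives stability to small
    #    translations/deformations of the input.
    out = feats[: (ph // pool) * pool, : (pw // pool) * pool]
    out = out.reshape(ph // pool, pool, pw // pool, pool, n_features).mean(axis=(1, 3))
    return out  # shape: (ph // pool, pw // pool, n_features)

# Usage on a toy 8x8 image.
img = np.random.default_rng(0).normal(size=(8, 8))
print(ckn_layer(img, rng=0).shape)  # (3, 3, 64)
```

The pooling step is what connects this architecture to the stability-to-deformations property studied in the paper above: averaging over nearby patch locations makes the layer output change only slightly under small spatial perturbations of the input.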