2022
DOI: 10.1103/PhysRevResearch.4.043143
Decomposing neural networks as mappings of correlation functions

Cited by 7 publications (3 citation statements)
References 24 publications
“…A separate work, [36], presents close-to-Gaussian NN processes including stationary Bayesian posteriors in the joint limit of large width and large data set, using 1/N as an expansion parameter. Moreover, the authors of [37] explore a correspondence between learning dynamics in the continuous time limit and early Universe cosmology, and [38] analyzes connected correlation functions propagating through NN.…”
Section: Learning
confidence: 99%
“…Our new approach emphasizes the role of information and information geometry in renormalization, which we now understand broadly as a mechanism for identifying equivalence classes of probability distributions that are indistinguishable as predictive models at a level of precision fixed by the amount of accessible data. This viewpoint allows us to draw a very sharp analogy between renormalization and aspects of data compression [36,55-58], data generation [59-61], data classification [40,62-67], dimensional reduction and model selection [33,34,68] commonly studied in data science and machine learning. We present this approach to renormalization through its relationship with Bayesian inference to highlight its information theoretic origin.…”
Section: Bayesian Renormalization and Information Geometry
confidence: 99%
“…The probability mapping problem in deep feedforward networks can be represented as an iterative transformation of distributions, which is crucial for the recognition and representation of data information. Strengthening the correlation between information processing can provide an interpretable perspective for classification [15]. Deep convolutional neural networks have been widely used in many fields such as natural language processing and computer vision.…”
Section: Related Work
confidence: 99%
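The excerpts above point at the same underlying idea: the statistics of the data, summarized by low-order connected correlation functions (cumulants), are transformed layer by layer as they pass through a network. The following is a minimal Monte Carlo sketch of that idea only; the toy network, the helper names (`layer`, `connected_two_point`) and all sizes are illustrative assumptions, not the construction used in the cited paper or the citing works.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, w, b):
    """Single fully connected layer with tanh nonlinearity (illustrative choice)."""
    return np.tanh(x @ w + b)

def connected_two_point(a):
    """Empirical first cumulant (mean) and connected two-point function
    (covariance) of activations; `a` has shape (n_samples, width)."""
    mean = a.mean(axis=0)                    # first cumulant
    centred = a - mean
    cov = centred.T @ centred / a.shape[0]   # second (connected) cumulant
    return mean, cov

# Toy input distribution: n_samples draws of a d-dimensional Gaussian input.
n_samples, d, width, depth = 10_000, 8, 64, 3
x = rng.normal(size=(n_samples, d))

# Propagate samples through randomly initialized layers and track how the
# first two connected correlation functions change from layer to layer.
a = x
for l in range(depth):
    fan_in = a.shape[1]
    w = rng.normal(scale=1.0 / np.sqrt(fan_in), size=(fan_in, width))
    b = rng.normal(scale=0.1, size=width)
    a = layer(a, w, b)
    mean, cov = connected_two_point(a)
    print(f"layer {l + 1}: |mean| = {np.linalg.norm(mean):.3f}, "
          f"tr(cov)/width = {np.trace(cov) / width:.3f}")
```

Here the higher-order connected correlations are simply ignored; tracking only the mean and covariance corresponds to a Gaussian truncation of the layer-to-layer mapping of distributions, which is the regime where such decompositions are usually discussed.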