2022
DOI: 10.1088/2632-2153/ac8393

RG-Flow: a hierarchical and explainable flow model based on renormalization group and sparse prior

Abstract: Flow-based generative models have become an important class of unsupervised learning approaches. In this work, we incorporate the key ideas of renormalization group (RG) and sparse prior distribution to design a hierarchical flow-based generative model, RG-Flow, which can separate information at different scales of images and extract disentangled representations at each scale. We demonstrate our method on synthetic multi-scale image datasets and the CelebA dataset, showing that the disentangled representations…
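As a rough illustration of the hierarchy the abstract describes, the sketch below uses a fixed Haar-type transform as an optimization-free stand-in for RG-Flow's coarse-graining: each step bijectively splits every 2×2 block into one coarse variable (passed up to the next level) and three detail variables (retained as latents at that scale). This is only a minimal structural sketch; the actual RG-Flow replaces the fixed transform with learned invertible networks and places a sparse prior on the latents, and the function names here are illustrative.

```python
import numpy as np

def haar_step(x):
    """One RG-like coarse-graining step: bijectively map each 2x2 block
    of x to one coarse value plus three detail values (2D Haar transform)."""
    a, b = x[0::2, 0::2], x[0::2, 1::2]   # top-left, top-right of each block
    c, d = x[1::2, 0::2], x[1::2, 1::2]   # bottom-left, bottom-right
    coarse = (a + b + c + d) / 2.0        # kept: flows to the next level
    details = np.stack([(a - b + c - d) / 2.0,   # horizontal detail
                        (a + b - c - d) / 2.0,   # vertical detail
                        (a - b - c + d) / 2.0])  # diagonal detail
    return coarse, details

def hierarchical_decompose(x):
    """Apply the step recursively; returns latents grouped by scale.
    The overall map is invertible, so information is reorganized, not lost."""
    latents = []
    while x.shape[0] > 1:
        x, details = haar_step(x)
        latents.append(details)
    latents.append(x)                     # the single coarsest variable
    return latents

# Example: an 8x8 "image" decomposes into latents at three scales.
rng = np.random.default_rng(0)
scales = hierarchical_decompose(rng.normal(size=(8, 8)))
print([s.shape for s in scales])  # [(3, 4, 4), (3, 2, 2), (3, 1, 1), (1, 1)]
```

Because the Haar transform is orthogonal, each step is exactly invertible, which is the structural property that lets a hierarchical flow assign information to a definite scale.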

Cited by 7 publications (8 citation statements)
References 44 publications

“…This demonstration serves as a starting point to explore more possibilities in future studies. As we have discussed, optimization-free [22, 35–44] and optimization-dependent RGs [23–34] have their own advantages and disadvantages in solving physics questions. While we have shown a scheme towards ideal optimization-free RG designs in the present work, there is no reason to ignore the possibility of combining the advantages of both kinds of RGs.…”
Section: Discussion (mentioning; confidence: 99%)
“…Considering these challenges, one would find existing computational realizations of RGs imperfect. Among these works, although Monte-Carlo-based [23–28], discriminative-model-based [29, 30], and generative-model-based RGs [31–34] are effective in estimating coarse-grained configurations and system parameters, they demand a priori knowledge about the target system as inputs, rely on specialized optimization before application, and lack the capacity to generalize to new systems unless extra optimization is supported. Compared with these machine-learning-aided frameworks, the optimization-free designs of phenomenological RGs [22, 35–44] are more favorable in reducing the reliance on a priori knowledge and training.…”
Citation type: mentioning (confidence: 99%)
“…Despite their effectiveness in extracting features of stable phases, they lack controlled accuracy in predicting universal properties of phase transitions, as they do not learn the RG equation or the RG monotone. • RG flow-based generative modeling: techniques such as neural-RG [2, 6] and RG-Flow [10, 11] embed RG transformations in multi-level flow-based generative models [60–62], applying deep learning methods to learn optimal RG transformations from model Hamiltonians by minimizing free energy. These methods are based on the invertible RG framework, which designs the local RG transformation as a bijective (invertible) deterministic map from spin configurations to relevant and irrelevant features.…”
Section: Summary and Discussion (mentioning; confidence: 99%)
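As background on the "minimizing free energy" training mentioned in the statement above: because the map $f_\theta$ from configurations $x$ to features $z$ is bijective, a flow model's likelihood is exact under the change-of-variables formula, and training minimizes the negative log-likelihood. This is the generic flow-model objective; the precise free-energy formulation in the cited works may differ.

$$-\log p_X(x) \;=\; -\log p_Z\big(f_\theta(x)\big) \;-\; \log\left|\det \frac{\partial f_\theta(x)}{\partial x}\right|$$

For an orthogonal transform such as the Haar step sketched earlier, the log-determinant term vanishes and the objective reduces to the log-prior of the latents alone.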
“…Prior research has demonstrated that neural networks can learn to perform hierarchical feature extraction at the configuration level [1–11]. However, a more fascinating aspect of RG is its ability to quantitatively analyze the flow of a physical theory in the parameter space at the model level [12, 13].…”
Section: Introduction (mentioning; confidence: 99%)
“…Properly utilizing scale separation is a key element of calculations in many physical domains, and in the context of lattice field theory it has led to the development of many useful algorithms such as multigrid methods [12, 13]. Previous machine-learning-based work has explored the use of scale separation in scalar field theories [14–17], 2D U(1) gauge theory [18], and in the context of linear preconditioners [19, 20], but not in the context of non-Abelian gauge theories.…”
Section: Introduction (mentioning; confidence: 99%)