2023
DOI: 10.1088/2632-2153/acb488

Categorical representation learning and RG flow operators for algorithmic classifiers

Abstract: Following the earlier formalism of categorical representation learning, we discuss the construction of the "RG-flow-based categorifier". Borrowing ideas from the theory of renormalization group (RG) flows in quantum field theory, holographic duality, and hyperbolic geometry, and combining them with neural ODE techniques, we construct a new algorithmic natural language processing (NLP) architecture, called the RG-flow categorifier or, for short, the RG categorifier, which is capable of data classification an…
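
The abstract describes the architecture only at a high level. As a rough illustration of the kind of pipeline it refers to, the following is a minimal, hypothetical sketch in PyTorch (all names, layer choices, and the fixed-step Euler integrator standing in for a neural ODE solver are assumptions, not taken from the paper): token embeddings are evolved along a learned vector field playing the role of an RG flow, and the coarse-grained representation is passed to a linear classifier.

```python
import torch
import torch.nn as nn

class FlowField(nn.Module):
    """Learned vector field dh/dt = f(h, t), standing in for one RG step."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, dim), nn.Tanh(), nn.Linear(dim, dim))

    def forward(self, h, t):
        # Broadcast the scalar flow time onto every embedding before the MLP.
        t_col = torch.full_like(h[..., :1], t)
        return self.net(torch.cat([h, t_col], dim=-1))

class RGFlowClassifier(nn.Module):
    def __init__(self, vocab_size, dim, num_classes, steps=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.field = FlowField(dim)
        self.head = nn.Linear(dim, num_classes)
        self.steps = steps

    def forward(self, tokens):                    # tokens: (batch, seq_len)
        h = self.embed(tokens)                    # (batch, seq_len, dim)
        dt = 1.0 / self.steps
        for k in range(self.steps):               # fixed-step Euler integration of the flow
            h = h + dt * self.field(h, k * dt)
        return self.head(h.mean(dim=1))           # pool over tokens, then classify

model = RGFlowClassifier(vocab_size=1000, dim=32, num_classes=4)
logits = model(torch.randint(0, 1000, (2, 16)))   # shape (2, 4)
```

This is only a sketch of the general "flow the representation, then classify" pattern; the paper's actual construction additionally involves categorical representation learning, holographic duality, and hyperbolic geometry, none of which are modeled here.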

Cited by 1 publication (4 citation statements)
References: 20 publications
“…• Deep learning and RG: several studies [55][56][57][58][59] have drawn parallels between deep neural networks and RG. Notably, they recognize that generation is the inverse process of renormalization [6,11]. Therefore, generative models can be used to implement data-driven RG.…”
Section: Summary and Discussion (mentioning; confidence: 99%)
“…Despite their effectiveness in extracting features of stable phases, they lack controlled accuracy in predicting universal properties of phase transitions, as they did not learn the RG equation or the RG monotone. • RG flow-based generative modeling: techniques such as neural-RG [2,6] and RG-Flow [10,11] embed RG transformations in multi-level flow-based generative models [60][61][62], applying deep learning methods to learn optimal RG transformations from model Hamiltonians by minimizing free energy. These methods are based on the invertible RG framework, which designs the local RG transformation as a bijective (invertible) deterministic map from spin configurations to relevant and irrelevant features.…”
Section: Summary and Discussion (mentioning; confidence: 99%)
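
To make the invertible-RG idea in the excerpt above concrete, here is a minimal, illustrative sketch (not the neural-RG or RG-Flow code; class and variable names are assumptions) of a bijective coupling step that maps a configuration to a pair of relevant and irrelevant features, and can be run in reverse to regenerate the configuration, which is the sense in which generation inverts renormalization.

```python
import torch
import torch.nn as nn

class InvertibleRGStep(nn.Module):
    """One bijective coarse-graining step, in the style of an affine coupling layer."""
    def __init__(self, dim):
        super().__init__()
        half = dim // 2
        self.log_scale = nn.Sequential(nn.Linear(half, half), nn.Tanh())
        self.shift = nn.Linear(half, half)

    def forward(self, x):
        # Renormalization direction: configuration -> (relevant, irrelevant) features.
        relevant, rest = x.chunk(2, dim=-1)
        irrelevant = rest * torch.exp(self.log_scale(relevant)) + self.shift(relevant)
        return relevant, irrelevant

    def inverse(self, relevant, irrelevant):
        # Generation direction: reassemble the configuration from its features.
        rest = (irrelevant - self.shift(relevant)) * torch.exp(-self.log_scale(relevant))
        return torch.cat([relevant, rest], dim=-1)

step = InvertibleRGStep(dim=8)
x = torch.randn(4, 8)
rel, irr = step(x)
assert torch.allclose(step.inverse(rel, irr), x, atol=1e-5)  # exact invertibility
```

In the methods cited above, the irrelevant features are typically modeled as noise to be resampled during generation, while the relevant features are passed to the next, coarser level of the multi-level flow; this sketch shows only a single level.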