2018
DOI: 10.1088/1742-5468/aae025

Machine learning algorithms based on generalized Gibbs ensembles

Abstract: Machine learning algorithms often take inspiration from established results and knowledge in statistical physics. A prototypical example is the Boltzmann machine algorithm for supervised learning, which utilizes knowledge of classical thermal partition functions and the Boltzmann distribution. Recently, a quantum version of the Boltzmann machine was introduced by Amin et al.; however, the noncommutativity of quantum operators renders training by minimization of a cost function inefficient. Recent ad…
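As a hedged aside (not taken from the paper; the notation below is assumed for illustration), the training obstruction the abstract alludes to can be made precise. For a classical Boltzmann machine the log-likelihood gradient splits into two expectation values, whereas for a quantum Boltzmann machine with density matrix ρ_θ = e^{−H_θ}/Tr e^{−H_θ}, Duhamel's formula shows that the analogous factorization fails whenever H_θ does not commute with its parameter derivative:

```latex
% Classical Boltzmann machine with energy E_\theta(v,h):
% gradient = "clamped" expectation minus "free" expectation.
\partial_\theta \log p_\theta(v)
  = \big\langle -\partial_\theta E_\theta \big\rangle_{h \mid v}
  - \big\langle -\partial_\theta E_\theta \big\rangle_{\text{model}}

% Quantum Boltzmann machine, \rho_\theta = e^{-H_\theta}/\mathrm{Tr}\, e^{-H_\theta}.
% Duhamel's formula:
\partial_\theta\, e^{-H_\theta}
  = -\int_0^1 e^{-s H_\theta}\,\big(\partial_\theta H_\theta\big)\,
      e^{-(1-s) H_\theta}\, \mathrm{d}s ,
% which reduces to -(\partial_\theta H_\theta)\, e^{-H_\theta}
% only if [H_\theta, \partial_\theta H_\theta] = 0.
```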

Cited by 7 publications (2 citation statements)
References 39 publications (87 reference statements)
“…Puškarov and Cubero [219] argued that generalized Gibbs ensembles [220,221] could be leveraged as a basis for QBMs. Because the conserved charges of the integrable Hamiltonian commute, the gradient information can be obtained efficiently by learning the optimal effective temperatures.…”
Section: Variants
confidence: 99%
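As a rough illustration of the mechanism this citation statement describes (a minimal sketch under assumptions, not the paper's actual algorithm): commuting charges share a joint eigenbasis, so the GGE becomes an ordinary exponential-family distribution over the joint eigenstates, p(n) ∝ exp(−Σ_k λ_k q_k(n)), and the gradient of the negative log-likelihood with respect to each effective inverse temperature λ_k is simply ⟨Q_k⟩_data − ⟨Q_k⟩_model. All names below (fit_gge, q, lam) are hypothetical.

```python
import numpy as np

def fit_gge(q, p_data, lr=0.1, steps=5000):
    """Learn effective temperatures lam[k] of a GGE p(n) ∝ exp(-sum_k lam[k] q[k, n]).

    q      : (K, N) array, eigenvalues of the K commuting charges in their joint eigenbasis
    p_data : (N,) empirical distribution over the N joint eigenstates
    """
    K, N = q.shape
    lam = np.zeros(K)              # one Lagrange multiplier per conserved charge
    mean_data = q @ p_data         # <Q_k>_data, fixed target moments
    for _ in range(steps):
        logits = -(lam @ q)        # -sum_k lam_k q_k(n)
        logits -= logits.max()     # numerical stability before exponentiating
        p_model = np.exp(logits)
        p_model /= p_model.sum()   # GGE probabilities over joint eigenstates
        mean_model = q @ p_model   # <Q_k>_model
        # NLL gradient is <Q_k>_data - <Q_k>_model; it vanishes when moments match
        lam -= lr * (mean_data - mean_model)
    return lam

# Usage on synthetic data: recover hypothetical effective temperatures.
rng = np.random.default_rng(0)
q = rng.normal(size=(3, 16))                 # 3 commuting charges, 16 joint eigenstates
lam_true = np.array([0.5, -0.2, 1.0])
p = np.exp(-(lam_true @ q)); p /= p.sum()
print(fit_gge(q, p))                         # approaches lam_true
```

Because the moment-matching objective of an exponential family is convex in the multipliers, this plain gradient descent converges to the unique optimum; the commutativity of the charges is what permits this purely classical treatment.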
“…This is where ML can be of immense assistance [7][8][9][10][11]. On the other hand, physicists have also been contributing to ML, drawing inspiration from the theories and techniques developed for computational physics, as well as providing insights into the foundations of AI and ML [12][13][14][15]. One such insight manifests as we study deep learning side by side with the renormalization group (Fig.…”
Section: Introduction
confidence: 99%