2019
DOI: 10.1088/1367-2630/ab31ef
Machine learning by unitary tensor network of hierarchical tree structure

Abstract: The resemblance between the methods used in quantum many-body physics and in machine learning has drawn considerable attention. In particular, tensor networks (TNs) and deep learning architectures bear striking similarities, to the extent that TNs can be used for machine learning. Previous results used one-dimensional TNs in image recognition, showing limited scalability and flexibility. In this work, we train two-dimensional hierarchical TNs to solve image recognition problems, using a training algorithm der…

Cited by 107 publications (99 citation statements)
References 48 publications
“…By looking only at this small corner of the Hilbert space, one lowers the computational cost to a polynomial dependence on system size. This language has been explored for generative and classification tasks using hierarchical representations such as matrix product states (MPS), tree tensor networks (TTN), and the multi-scale entanglement renormalization ansatz (MERA) [17,43,44].…”
Section: B. The Variational Circuit U(θ)
Confidence: 99%
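The hierarchical contraction described in the quote above can be illustrated with a minimal, hedged sketch: a two-layer tree tensor network that maps four pixel features to class scores. The pixel encoding, bond dimension, and the random (untrained) tensors `w1` and `top` are illustrative assumptions, not the paper's trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D "image" of 4 pixels, each mapped to a 2-dim feature vector via the
# common cos/sin pixel encoding used in tensor-network machine learning.
pixels = np.array([0.1, 0.7, 0.3, 0.9])
features = np.stack(
    [np.cos(np.pi / 2 * pixels), np.sin(np.pi / 2 * pixels)], axis=1
)  # shape (4, 2)

chi = 3  # bond dimension (hypothetical choice)
# Layer 1: two isometry-shaped tensors, each fusing a pair of features
# into one chi-dim vector. Random here; in practice these are trained.
w1 = rng.normal(size=(2, 2, 2, chi))
# Top tensor: fuses the two chi-dim vectors into 2 class scores.
top = rng.normal(size=(chi, chi, 2))

# Bottom layer: contract neighboring pixel pairs up the tree.
v0 = np.einsum("i,j,ijk->k", features[0], features[1], w1[0])
v1 = np.einsum("i,j,ijk->k", features[2], features[3], w1[1])
# Top layer: produce one score per class.
logits = np.einsum("a,b,abc->c", v0, v1, top)
label = int(np.argmax(logits))
```

Because the tree halves the number of vectors at each layer, the full contraction cost stays polynomial in the number of pixels, which is the point the quoted passage makes.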
“…One of their simplest models is efficiently trained classically and then deployed on the IBM Q5 Tenerife quantum computer with significant resilience to noise. Stoudenmire et al [43] train a 2D TTN to perform pairwise classification of the MNIST image data. Although a fully classical experiment, they use quantum fidelity to measure the inherent difficulty of distinguishing two classes, and entanglement entropy to quantify the amount of information about one part of an image that can be gained by knowing the rest.…”
Section: A. Supervised Learning
Confidence: 99%
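The entanglement entropy mentioned in the quote above can be computed from the Schmidt (singular-value) spectrum across a bipartition. A minimal sketch, using a two-qubit Bell state as a stand-in for an image bipartition (the state and bipartition are illustrative, not the paper's data):

```python
import numpy as np

# Bell state (|00> + |11>) / sqrt(2), written as 4 amplitudes.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

# Bipartition: reshape amplitudes into a matrix, rows = first qubit,
# columns = second qubit, then take the singular values.
M = psi.reshape(2, 2)
s = np.linalg.svd(M, compute_uv=False)

p = s**2  # squared Schmidt coefficients form a probability distribution
entanglement_entropy = -np.sum(p * np.log2(p))  # 1 bit for a Bell state
```

A maximally entangled bipartition gives 1 bit of entropy: knowing one half tells you everything about the other, which is exactly the "information gained by knowing the rest" reading in the quote.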
“…[41]). As shown in the previous works [15,25,26,28-32], TN models (including MPS) possess remarkable generalization power that is competitive with neural networks. Notably, TN models surpass neural networks in that they possess high interpretability and allow us to implement quantum processes.…”
Section: Tensor-Network Compressed Sensing
Confidence: 72%
“…Here, we take the state in the form of an MPS [40]. Note that TNCS is a general scheme, where one may also choose other TN forms to represent the state, such as a tree TN or MERA [26,31,32], or simply a quantum state without a specific entanglement structure.…”
Section: Tensor-Network Compressed Sensing
Confidence: 99%