2017
DOI: 10.48550/arxiv.1708.00006
Preprint

Tensor Networks in a Nutshell

Jacob Biamonte,
Ville Bergholm

Abstract: Tensor network methods are taking a central role in modern quantum physics and beyond. They can provide an efficient approximation to certain classes of quantum states, and the associated graphical language makes it easy to describe and pictorially reason about quantum circuits, channels, protocols, open systems and more. Our goal is to explain tensor networks and some associated methods as quickly and as painlessly as possible. Beginning with the key definitions, the graphical tensor network language is presented…
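As a minimal illustration of the contraction operation behind the graphical language (a sketch added for orientation, not an excerpt from the paper), joining a wire shared by two tensors means summing over the shared index; in NumPy this is a single einsum call:

```python
import numpy as np

# Two tensors sharing one index ("wire"): A has legs (i, j), B has legs (j, k).
# Contracting the shared wire j yields a new tensor C with legs (i, k).
A = np.random.rand(3, 4)
B = np.random.rand(4, 5)

C = np.einsum("ij,jk->ik", A, B)  # sum over the shared index j
assert np.allclose(C, A @ B)      # identical to ordinary matrix multiplication
```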

Cited by 85 publications (109 citation statements); references 53 publications.
“…(21), the interaction pattern does not satisfy the nearest-neighbor form because the Hamiltonian includes long-range interactions. To evolve the state using the star Hamiltonian, one can use swap gates [15,36] to exchange positions of adjacent sites. A swap gate exchanges the two physical legs of two adjacent tensors.…”
Section: Time Evolution of Matrix Product States (mentioning)
confidence: 99%
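As a rough sketch of the swap-gate step described in the statement above (the tensor shape conventions, the einsum index labels, and the untruncated SVD split are assumptions made for illustration, not the cited implementation), the two adjacent site tensors are contracted, their physical legs are exchanged, and an SVD restores the two-site MPS form:

```python
import numpy as np

def swap_adjacent_sites(A, B):
    """Exchange two adjacent MPS sites by swapping their physical legs.

    A has shape (Dl, d, Dm) and B has shape (Dm, d, Dr):
    (left bond, physical leg, right bond).
    """
    Dl, d, Dm = A.shape
    _, _, Dr = B.shape

    # Contract the shared bond: theta carries legs (Dl, d_A, d_B, Dr).
    theta = np.einsum("aib,bjc->aijc", A, B)

    # The swap gate exchanges the two physical legs i <-> j.
    theta = theta.transpose(0, 2, 1, 3)

    # Split back into two site tensors with an (untruncated) SVD.
    U, S, Vh = np.linalg.svd(theta.reshape(Dl * d, d * Dr), full_matrices=False)
    k = S.size
    A_new = U.reshape(Dl, d, k)
    B_new = (np.diag(S) @ Vh).reshape(k, d, Dr)
    return A_new, B_new

# Tiny usage example with random site tensors (physical dimension d = 2).
A = np.random.rand(1, 2, 3)
B = np.random.rand(3, 2, 1)
A_new, B_new = swap_adjacent_sites(A, B)
```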
“…Spiders are linear operations which can have any number of input or output wires. There are two varieties: Z-spiders depicted as green dots and X-spiders depicted as red dots, each of which can be labelled by a phase α ∈ R:…”
Section: The ZXH-Calculus (mentioning)
confidence: 99%
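For reference, the linear maps the two spider families denote (a textbook ZX-calculus definition added here, not quoted from the citing paper), with n input wires and m output wires and phase α:

```latex
% Z-spider (green), diagonal in the computational basis:
Z^{n \to m}(\alpha) = |0\rangle^{\otimes m}\langle 0|^{\otimes n}
                    + e^{i\alpha}\, |1\rangle^{\otimes m}\langle 1|^{\otimes n}

% X-spider (red): the same map expressed in the Hadamard-rotated basis:
X^{n \to m}(\alpha) = |{+}\rangle^{\otimes m}\langle {+}|^{\otimes n}
                    + e^{i\alpha}\, |{-}\rangle^{\otimes m}\langle {-}|^{\otimes n}
```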
“…In the 1960s, Yutsis et al. developed a graphical calculus for the quantum theory of angular momentum [2], while in the 1970s, Penrose was advocating the use of diagrams to deal with tensors [3]. Modern applications of these tensor networks include Matrix Product States (MPS) and their higher-dimensional generalisations such as Projected Entangled Pair States (PEPS), which can be used in condensed matter physics to deal with ground states of quantum spin models [4]. A common theme here is that these diagrams offer the ability to reason about complex properties of some physical system without immediate recourse to more primitive calculation, usually in the form of matrix calculation or an extension thereof.…”
Section: Introduction (mentioning)
confidence: 99%
“…We define a tensor train of length l to be a tensor network consisting of l vertices arranged in a line, each of which has 3 adjacent edges (see Figure 2) [2,4]. This format can be converted to the format introduced here simply with a contraction of the single edges, thus removing the vertices at each end.…”
Section: Tensor Network (mentioning)
confidence: 99%
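A tensor train of this shape can be written down directly in NumPy (a minimal sketch; the core layout of (left bond, physical leg, right bond) with trivial bond dimension 1 at both ends, standing in for the contracted boundary vertices, is an assumed convention, not the cited paper's code):

```python
import numpy as np

def random_tensor_train(length, phys_dim=2, bond_dim=3):
    """Build a tensor train of `length` cores; each core has 3 legs
    (left bond, physical leg, right bond), with trivial bonds at the ends."""
    dims = [1] + [bond_dim] * (length - 1) + [1]
    return [np.random.rand(dims[i], phys_dim, dims[i + 1]) for i in range(length)]

def contract_tensor_train(cores):
    """Contract all bond edges to recover the full (exponentially large) tensor."""
    result = cores[0]
    for core in cores[1:]:
        # Sum over the shared bond index between consecutive cores.
        result = np.tensordot(result, core, axes=([-1], [0]))
    # Drop the trivial boundary bonds of dimension 1.
    return result.reshape(result.shape[1:-1])

cores = random_tensor_train(length=4)
full = contract_tensor_train(cores)
print(full.shape)  # (2, 2, 2, 2)
```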
“…Tensor networks provide many other ways of decomposing tensors. They originate from quantum physics and are used to depict the structure of steady states of Hamiltonians of quantum systems [2,4]. Many types of tensor network decompositions, like tensor trains [11], also known as matrix product states [4], are used in machine learning to decompose data tensors in meaningful ways.…”
Section: Introduction (mentioning)
confidence: 99%
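In index notation (standard notation added for clarity, not quoted from the citing paper), a tensor train / matrix product state writes an order-l tensor as l cores contracted over bond indices a_1, …, a_{l-1}:

```latex
T_{i_1 i_2 \cdots i_l}
  = \sum_{a_1, \dots, a_{l-1}}
    A^{(1)}_{i_1 a_1}\, A^{(2)}_{a_1 i_2 a_2} \cdots
    A^{(l-1)}_{a_{l-2} i_{l-1} a_{l-1}}\, A^{(l)}_{a_{l-1} i_l}
```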