2022
DOI: 10.48550/arxiv.2210.08566
Preprint

Theory for Equivariant Quantum Neural Networks

Abstract: Most currently used quantum neural network architectures have little-to-no inductive biases, leading to trainability and generalization issues. Inspired by a similar problem, recent breakthroughs in classical machine learning address this crux by creating models encoding the symmetries of the learning task. This is materialized through the usage of equivariant neural networks whose action commutes with that of the symmetry. In this work, we import these ideas to the quantum realm by presenting a general theore…
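To make the commutation property concrete, here is a minimal numerical sketch (my own illustration, not code from the preprint): for a toy two-qubit model with a SWAP (qubit-permutation) symmetry, a gate generated by the symmetric operator X⊗I + I⊗X commutes with the symmetry representation for every parameter value, which is the equivariance condition the abstract refers to.

```python
# Hypothetical illustration of the equivariance condition [U(theta), R(g)] = 0,
# using the two-qubit SWAP symmetry and a permutation-symmetric generator.
import numpy as np
from scipy.linalg import expm

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)

G = np.kron(X, I2) + np.kron(I2, X)     # generator invariant under qubit swap

def layer(theta):
    # Parametrized gate generated by the symmetric operator G.
    return expm(-1j * theta * G)

U = layer(0.37)
# Equivariance: the layer's action commutes with the symmetry action.
assert np.allclose(SWAP @ U, U @ SWAP)
```

Any generator left invariant by the symmetry (here, swapping the two qubits) yields a gate that passes this check; a generic non-symmetric generator does not.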

Cited by 12 publications (15 citation statements)
References 109 publications

“…[61,62]), and quantum (see e.g. [63][64][65][66]). Equivariant ANNs exploit symmetry in data so that the output samples also preserve this symmetry, such as is the case of E(3) equivariant ANNs (which transform data under translations and rotations, with applications including image processing).…”
Section: Discussion (mentioning)
confidence: 99%
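As a classical illustration of the E(3) case mentioned in this excerpt (a sketch under my own assumptions, not code from the cited works), the toy layer below rescales each point of a 3D point cloud by a function of its distance to the centroid, so it is invariant to translations and equivariant to rotations:

```python
# Toy E(3)-style check: centering gives translation invariance, and scaling by
# a function of the norm only gives rotation equivariance.
import numpy as np

rng = np.random.default_rng(0)
pts = rng.normal(size=(10, 3))                    # toy 3D point cloud

def layer(points):
    centered = points - points.mean(axis=0)       # remove translations
    norms = np.linalg.norm(centered, axis=1, keepdims=True)
    return centered * np.tanh(norms)              # norms are rotation-invariant

# Random rotation (orthogonal matrix with determinant +1) and translation.
R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(R) < 0:
    R[:, 0] *= -1
t = rng.normal(size=3)

# Equivariance: transforming the inputs by (R, t) rotates the outputs by R.
assert np.allclose(layer(pts @ R.T + t), layer(pts) @ R.T)
```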
“…In the previous section, we have introduced TWI-QRNNs by imposing time warping-invariance in terms of the density matrices produced by the equivalent dissipative QNN. Since the probabilities (18) have the same form as the expectations (5) defining the outputs of QRNNs, the same reasoning applies to SQRNNs ρ^{AB}_t. Based on this observation, we define TWI-SQRNNs in a manner analogous to TWI-QRNNs.…”
Section: TWI-SQRNNs (mentioning)
confidence: 93%
“…Recently, the quantum machine learning (QML) community has also focused on introducing geometric priors into quantum models [11,12,13]. For example, quantum graph neural networks preserve permutation symmetries, making them suitable for learning quantum tasks with a graph structure [14,15,16,17]; whilst quantum convolutional neural networks [18] preserve translation symmetry. Like their classical counterparts, symmetry-preserving quantum models have the potential not only to reduce sample complexity, but also to mitigate quantum computing-specific issues such as barren plateaus [19,20].…”
Section: Related Work (mentioning)
confidence: 99%
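For the permutation symmetry mentioned in this excerpt, a minimal classical sketch (my own illustration, not from the cited works) is the standard check that a message-passing layer commutes with node relabeling:

```python
# Toy permutation-equivariance check for a graph layer f(A, H) = A H W1 + H W2:
# relabeling the nodes before or after the layer gives the same result.
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 4
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1) + np.triu(A, 1).T               # symmetric adjacency matrix
H = rng.normal(size=(n, d))                       # node features
W1, W2 = rng.normal(size=(d, d)), rng.normal(size=(d, d))   # shared weights

def layer(adj, feats):
    return adj @ feats @ W1 + feats @ W2

P = np.eye(n)[rng.permutation(n)]                 # permutation matrix

# Equivariance: f(P A P^T, P H) = P f(A, H)
assert np.allclose(layer(P @ A @ P.T, P @ H), P @ layer(A, H))
```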
“…This issue has given rise to the field of quantum machine learning (QML) [1,2]. QML has seen the proposal of parameterized quantum models, such as quantum neural networks [3][4][5][6], that could efficiently process quantum data. Variational QML, which involves classically training a parameterized quantum model, is indeed a leading candidate for implementing QML in the near term.…”
Section: Introduction (mentioning)
confidence: 99%