2022
DOI: 10.48550/arxiv.2205.05625
Preprint

Quantum Self-Attention Neural Networks for Text Classification

Abstract: An emerging direction of quantum computing is to establish meaningful quantum applications in various fields of artificial intelligence, including natural language processing (NLP). Although some efforts based on syntactic analysis have opened the door to research in Quantum NLP (QNLP), limitations such as heavy syntactic preprocessing and syntax-dependent network architecture make them impracticable on larger and real-world data sets. In this paper, we propose a new simple network architecture, called the qua…

Cited by 6 publications (8 citation statements)
References 37 publications
“…The hybrid attention heads we used are almost identical to the architecture implemented in [31], "Quantum Self-Attention Neural Networks for Text Classification" by Li et al. In order to replace the self-attention mechanism of a classical vision transformer in Equation (1), we use the following procedure:…”
Section: Hybrid Encoder Layer
Mentioning confidence: 99%
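The procedure referred to in this citation is truncated above, so only the general idea is recoverable: quantum circuits supply the query/key/value features while the attention itself stays classical. The following NumPy sketch is an illustration of that hybrid pattern under stated assumptions, not the citing authors' actual procedure; `quantum_feature_map` is a hypothetical stand-in for measured expectation values of a parameterized circuit.

```python
import numpy as np

def quantum_feature_map(x, theta):
    """Hypothetical stand-in for a parameterized quantum circuit.

    In the hybrid schemes discussed here, each input vector x would be
    encoded into a circuit with trainable parameters theta, and a vector
    of Pauli-Z expectation values (one per qubit) would be measured out.
    Here we only mimic that with a bounded nonlinear map into [-1, 1].
    """
    return np.tanh(x @ theta)

def hybrid_attention_head(X, theta_q, theta_k, theta_v):
    """One attention head: queries, keys, and values come from the
    (simulated) quantum feature maps, while the scaled-dot-product
    softmax attention is computed classically. X: (seq_len, d_in)."""
    Q = np.stack([quantum_feature_map(x, theta_q) for x in X])
    K = np.stack([quantum_feature_map(x, theta_k) for x in X])
    V = np.stack([quantum_feature_map(x, theta_v) for x in X])

    scores = Q @ K.T / np.sqrt(Q.shape[-1])            # classical scaled dot product
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # row-wise softmax
    return weights @ V                                   # attention-weighted values

# Example: 4 tokens with 8-dim embeddings mapped to 4 "qubit" features.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
thetas = [rng.normal(size=(8, 4)) for _ in range(3)]
print(hybrid_attention_head(X, *thetas).shape)  # (4, 4)
```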
“…This approach is based on previous work [13], which proposed the same idea for the original transformer architecture for text [14]. Other works have explored other possible quantum adaptations of the original transformer [13,15], as well as adaptations of the vision transformer [16,17] and the graph transformer [18]. Our work differs from [16] in the architecture that we propose, which explores the use of other quantum ansatzes.…”
Section: Introduction
Mentioning confidence: 99%
“…Unfortunately, the above two approaches only involve certain physical concepts in quantum mechanics without providing specific quantum circuits. A recent meaningful effort was contributed by the Baidu group, where a Gaussian projection-based QSAN using VQA [27] to build Parametric Quantum Circuits (PQC) [28] on Noisy Intermediate-Scale Quantum (NISQ) [39] devices was applied to text classification [40]. While this work is significant, it is still worth discussing whether quantum computers can take on more tasks since self-attention scores need to be calculated on classical computers in order to obtain the ultimate output.…”
Section: Introduction
Mentioning confidence: 99%
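This citation points out that in the Gaussian projection-based scheme the self-attention scores are still evaluated on classical computers from quantum circuit outputs. A minimal sketch of that style of coefficient is given below, assuming the attention weight between tokens i and j decays as a Gaussian of the distance between their query and key outputs; in the hybrid setting `q`, `k`, and `v` would be expectation values measured from parameterized circuits, but here they are plain arrays, and this is an illustrative reading of the description rather than the cited paper's exact formulation.

```python
import numpy as np

def gaussian_projected_attention(q, k, v):
    """Gaussian-projection style self-attention score, computed classically:
    alpha_ij ∝ exp(-||q_i - k_j||^2), normalized over j per row, then used
    to weight the value vectors v."""
    diff = q[:, None, :] - k[None, :, :]              # pairwise differences, (seq, seq, d)
    scores = np.exp(-np.sum(diff ** 2, axis=-1))      # Gaussian kernel of squared distances
    weights = scores / scores.sum(axis=-1, keepdims=True)
    return weights @ v                                 # classically weighted values

rng = np.random.default_rng(1)
q, k, v = (rng.uniform(-1, 1, size=(5, 3)) for _ in range(3))
print(gaussian_projected_attention(q, k, v).shape)     # (5, 3)
```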
“…In contrast to Ref. [25,26,40], QSAN is potentially fully deployed and realized on quantum devices with fewer measurements and a beneficial byproduct called QBSASM. Whereas, the essential motivation for proposing this QSAN is to explore whether young quantum computers can have quantum characteristic attention and can depict the distribution of outputs in a quantum language, not to replace SAM or to beat all the schemes in the Ref.…”
Section: Introduction
Mentioning confidence: 99%