Preprint, 2022
DOI: 10.48550/arxiv.2207.07563

QSAN: A Near-term Achievable Quantum Self-Attention Network

Abstract: A Quantum Self-Attention Network (QSAN) that can be achieved on near-term quantum devices is investigated. First, the theoretical basis of QSAN, a linearized and reversible Quantum Self-Attention Mechanism (QSAM) including Quantum Logic Similarity (QLS) and Quantum Bit Self-Attention Score Matrix (QBSASM), is explored to solve the storage problem of Self-Attention Mechanism (SAM) due to quadratic complexity. More importantly, QLS uses logical operations instead of inner product operations to enable QSAN to be …
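As a rough point of reference for the abstract, the following is a minimal classical sketch, not the paper's quantum circuit: it contrasts the usual dot-product score matrix, whose n × n storage is the quadratic-complexity issue SAM faces, with a toy "logic similarity" that scores token pairs by XNOR agreement counts on binarized features instead of inner products. The function names, the binarization step, and the XNOR-count scoring are illustrative assumptions; QLS itself operates on qubit states with reversible logic gates and yields the score matrix as the quantum output QBSASM.

```python
import numpy as np

rng = np.random.default_rng(0)

def dot_product_scores(X, Wq, Wk):
    # Classical self-attention scores: the n x n matrix (X Wq)(X Wk)^T.
    # Materializing this matrix is the quadratic-memory cost the abstract refers to.
    Q, K = X @ Wq, X @ Wk
    return Q @ K.T

def logic_similarity_scores(Xb):
    # Toy classical analogue of a "logic similarity": score each token pair by
    # the XNOR agreement count of binarized features instead of an inner product.
    # Illustrative assumption only; the paper's QLS acts on qubit states with
    # reversible logic gates.
    n = Xb.shape[0]
    S = np.empty((n, n), dtype=int)
    for i in range(n):
        for j in range(n):
            S[i, j] = int(np.sum(Xb[i] == Xb[j]))  # XNOR popcount
    return S

X = rng.normal(size=(4, 8))                      # 4 tokens, 8 features
Wq, Wk = rng.normal(size=(8, 8)), rng.normal(size=(8, 8))
print(dot_product_scores(X, Wq, Wk).shape)       # (4, 4) attention-score matrix
print(logic_similarity_scores((X > 0).astype(int)))
```

The sketch still writes out the full n × n matrix classically; per the abstract, QSAM avoids that classical storage by keeping the scores in the quantum register as the QBSASM.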

Cited by 4 publications (6 citation statements); references 37 publications.
“…It can predict the bounding box coordinates and the corresponding confidence scores with one single network. There are numerous YOLO versions dedicated to operating on underwater images for detection of various objects such as starfish, shrimp, crab, scallop, and waterweed (Zhao et al, 2022). Among these models, the recently proposed YOLOfish was designed for fish detection and is reported to perform close to the YOLOv4 model on two different public datasets (Muksit et al, 2022).…”
Section: Related Work (mentioning)
confidence: 99%
“…As deep learning techniques have developed, researchers have begun to design neural networks for predicting weather elements (e.g., rainfall), which can mine complex intrinsic correlations well, such as artificial neural networks (ANN) Feng et al (2016), convolutional neural networks (CNN) Ye et al (2021b); Patil et al (2021), long short-term memory networks (LSTM) Broni-Bedaiko et al (2019), convolutional long short-term memory networks (ConvLSTM) Mu et al (2019); He et al (2019); Gupta et al (2022), CNN-LSTM Zhou and Zhang (2022), graph neural networks (GNN), recurrent neural networks (RNN) Zhao et al (2022), transformers Ye et al (2021a), etc. Feng et al (2016) propose two methods to predict the existence of ENSO and the time evolution of ENSO scalar features, which provided a new direction for predicting the occurrence of ENSO events.…”
Section: Publisher's Note (mentioning)
confidence: 99%
“…Its success has led to its application in classifying plankton datasets (Baek et al, 2022; Dagtekin and Dethlefs, 2022; Kyathanahally et al, 2022; Shao et al, 2022). In the future, it will be intriguing to develop quantum visual transformer models based on the quantum self-attention mechanism (Li et al, 2022; Shi et al, 2023; Zhao et al, 2022), and explore their potential for phytoplankton classification.…”
Section: Introduction (mentioning)
confidence: 99%
“…Similarly, Sipio et al [35] proposed a quantum-enhanced transformer model by replacing the linear transformations in the classical transformer model with a VQC. Zhao et al [36] constructed the quantum self-attention network (QSAN) model by introducing quantum logic similarity (QLS) and the quantum bit self-attention score matrix (QBSASM). Afterwards, Zhao et al [37] combined quantum kernel methods with self-attention mechanisms to propose the quantum kernel self-attention network (QKSAN) model.…”
Section: Introduction (mentioning)
confidence: 99%