2016
DOI: 10.1016/j.neucom.2015.05.139

Weightless neural network parameters and architecture selection in a quantum computer

Abstract: Training artificial neural networks requires a tedious empirical evaluation to determine a suitable neural network architecture. To avoid this empirical process, several techniques have been proposed to automate the architecture selection process. In this paper, we propose a method to perform parameter and architecture selection for a quantum weightless neural network (qWNN). The architecture selection is performed through the learning procedure of a qWNN with a learning algorithm that uses the principle of q…


Cited by 17 publications (5 citation statements)
References 40 publications
“…Reference [59] mentions QWLNN in 2008; [60] defines a QWLNN architecture learning algorithm based on quantum superposition. The architecture and parameters of this model depend on many factors, such as the number of training modes and the structure of the selector.…”
Section: H. Others
Citation type: mentioning (confidence: 99%)
“…In the recent quantum processor stage, some emerging models such as QBM [29]-[31], QCVNN [32]-[34], QGAN [35][36], QGNN [37]-[39], QRNN [40][41], QTNN [42], QP [43]-[49], etc. [50]-[60] will be introduced in subsequent sections.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
“…Training WNNs involves a different approach compared to traditional neural networks. WNNs focus on the activation of concepts rather than the activation of individual neurons [14]. This means that instead of adjusting the weights of connections between neurons during training, the weights of concepts and their associations are adjusted.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
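The statement above contrasts weightless training with conventional weight adjustment. As a purely illustrative aid (not taken from the cited paper or from this article), the following minimal Python sketch of a classical WiSARD-style RAM discriminator shows what "weightless" training amounts to: learning writes entries into lookup tables addressed by tuples of input bits, and no connection weights are adjusted. The class name WNNDiscriminator and its parameters are hypothetical choices for the example.

    # Minimal sketch of a classical RAM-based weightless neural network
    # (WiSARD-style discriminator); binary inputs are split into fixed tuples.
    from collections import defaultdict

    class WNNDiscriminator:
        def __init__(self, input_bits, tuple_size):
            assert input_bits % tuple_size == 0
            self.tuple_size = tuple_size
            # One RAM node (lookup table) per tuple of input bits.
            self.rams = [defaultdict(int) for _ in range(input_bits // tuple_size)]

        def _addresses(self, bits):
            # Each RAM node is addressed by the bit pattern of its own tuple.
            for i, ram in enumerate(self.rams):
                chunk = bits[i * self.tuple_size:(i + 1) * self.tuple_size]
                yield ram, "".join(str(b) for b in chunk)

        def train(self, bits):
            # "Learning" marks the addressed RAM positions; no weight update occurs.
            for ram, addr in self._addresses(bits):
                ram[addr] = 1

        def score(self, bits):
            # Response is the number of RAM nodes that recognise their addressed pattern.
            return sum(ram[addr] for ram, addr in self._addresses(bits))

    # Usage: train on one binary pattern and score a similar one.
    d = WNNDiscriminator(input_bits=8, tuple_size=2)
    d.train([1, 0, 1, 1, 0, 0, 1, 1])
    print(d.score([1, 0, 1, 1, 0, 0, 1, 0]))  # 3 of 4 RAM nodes respond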
“…Better understanding the many classes of quantum algorithms and their behaviors might be the first step toward an approach to resolve huge computational problems in emerging computing areas, such as artificial intelligence and robotics, big data analysis, bioinformatics, and cyber security [12].…”
Section: Introduction
Citation type: mentioning (confidence: 99%)