2015
DOI: 10.1007/s00500-015-1949-1

On randomization of neural networks as a form of post-learning strategy

Abstract: Today artificial neural networks are applied in various fields: engineering, data analysis, robotics. While they represent a successful tool for a variety of relevant applications, mathematically speaking they are still far from being conclusive. In particular, they suffer from being unable to find the best configuration possible during the training process (local minimum problem). In this paper, we focus on this issue and suggest a simple, but effective, post-learning strategy to allow the search for improved…

Cited by 3 publications (1 citation statement)
References 18 publications

“…Recent years have seen the emergence of a new post-learning method that improves the set of weights based on an analogy with quantum effects occurring in nature. In this analogy, a biological neuron is simulated as a semiconductor heterostructure consisting of one energetic barrier sandwiched between two energetically lower areas (Kapanova et al., 2017). Compared to the traditional model, this new algorithm can be applied with minimal additional computing cost.…”
Section: Introduction (mentioning, confidence: 99%)
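The quoted statement describes the randomized post-learning idea only at a high level. The sketch below is a minimal, hypothetical illustration of such a step in Python: after training, the weights are perturbed at random and a perturbation is kept only when it lowers the loss. It is not the paper's actual algorithm or its quantum-tunnelling analogy; the linear-neuron loss, the decaying perturbation scale, and names such as `post_learning_refinement` are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(w, X, y):
    """Mean squared error of a single linear neuron (an illustrative
    stand-in for the trained network's objective)."""
    return np.mean((X @ w - y) ** 2)

def post_learning_refinement(w, X, y, n_trials=1000, sigma=0.01):
    """Randomly perturb the trained weights and keep a perturbation only
    if it lowers the loss; the perturbation magnitude shrinks over time."""
    best_w = w.copy()
    best_loss = loss(best_w, X, y)
    for t in range(n_trials):
        scale = sigma * (1.0 - t / n_trials)        # decaying step size
        candidate = best_w + rng.normal(0.0, scale, size=best_w.shape)
        candidate_loss = loss(candidate, X, y)
        if candidate_loss < best_loss:              # greedy acceptance
            best_w, best_loss = candidate, candidate_loss
    return best_w, best_loss

# Toy data and a deliberately imperfect "trained" weight vector.
X = rng.normal(size=(200, 5))
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ true_w + 0.1 * rng.normal(size=200)
w_trained = true_w + 0.3 * rng.normal(size=5)       # as if training stopped in a poor spot

w_refined, final_loss = post_learning_refinement(w_trained, X, y)
print(f"loss before: {loss(w_trained, X, y):.4f}  after: {final_loss:.4f}")
```

Greedy acceptance guarantees the refined weights are never worse than the trained ones, and the shrinking perturbation scale mimics a search that starts broad and narrows; both are design choices for this sketch rather than details specified by the abstract or the citing paper.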