2016
DOI: 10.1017/s0269964816000061
Fast Non-Negative Least-Squares Learning in the Random Neural Network

Abstract: The random neural network is a biologically inspired neural model in which neurons interact by probabilistically exchanging positive and negative unit-amplitude signals; it offers superior learning capabilities compared to other artificial neural networks. This paper considers non-negative least-squares supervised learning in this context, and develops an approach that achieves fast execution and excellent learning capacity. This speedup is a result of significant enhancements in the solution of the non-negative le…
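The learning problem the abstract refers to reduces to non-negative least squares (NNLS): minimizing ||Ax − b||₂ subject to x ≥ 0. As a minimal illustration of that problem class (not the paper's own accelerated solver), the sketch below uses SciPy's `nnls`, which implements the classic Lawson–Hanson active-set algorithm; the matrix `A` and vector `b` are hypothetical toy data:

```python
import numpy as np
from scipy.optimize import nnls

# Toy overdetermined system: find x >= 0 minimizing ||Ax - b||_2.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([2.0, 3.0, 4.0])

# Lawson-Hanson active-set NNLS; returns the solution and the residual norm.
x, residual_norm = nnls(A, b)

# For this A and b the unconstrained least-squares solution is already
# non-negative (x = [5/3, 8/3]), so NNLS coincides with it.
print(x)              # -> approximately [1.6667, 2.6667]
print(residual_norm)  # -> approximately 0.5774 (= 1/sqrt(3))
```

Since the constraint set {x : x ≥ 0} is convex and the objective is a convex quadratic, NNLS is a convex problem and active-set methods terminate at the global optimum, which is why algorithmic speedups (as in the paper) focus on the per-iteration cost rather than on escaping local minima.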

Cited by 5 publications (2 citation statements)
References 50 publications (88 reference statements)
“…Several efficient algorithms to obtain the solution in Eq. 11 have been proposed in the literature (e.g., Lawson and Hanson, 1995; Bro and De Jong, 1997; Timotheou, 2016).…”
Section: Nonnegative Least Squares
confidence: 99%
“…As a result, the NNLS problem becomes a convex optimization problem. Several efficient algorithms to obtain the solution in (11) have been proposed in the literature (e.g., Lawson and Hanson, 1995; Bro and De Jong, 1997; Timotheou, 2016).…”
Section: Nonnegative Least Squares
confidence: 99%