2013 | DOI: 10.7763/ijcte.2013.v5.729

Kernel Recursive Least Squares for the CMAC Neural Network

Abstract: The Cerebellar Model Articulation Controller (CMAC) neural network is an associative memory biologically inspired by the cerebellum, a structure found in the brains of animals. The standard CMAC uses the least mean squares (LMS) algorithm to train its weights. Recently, the recursive least squares (RLS) algorithm was proposed as a superior algorithm for training the CMAC online, as it can converge in one epoch and does not require the tuning of a learning rate. However, the RLS algorithm's computational speed is …
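The abstract's central claim, that RLS trains the CMAC without a hand-tuned learning rate and can converge in a single epoch, follows from the standard recursive least squares update. The Python sketch below applies a generic RLS step to a sparse CMAC-style association vector; it is illustrative only: the function name, the toy encoding, and the initialisation constant are assumptions, and it is not the paper's kernel/inverse-QR variant.

import numpy as np

def rls_update(w, P, x, d, lam=1.0):
    # One generic RLS step: w are the weights, P tracks the inverse input
    # correlation matrix, x is the (sparse, binary) CMAC association vector,
    # d is the training target, lam is the forgetting factor (1.0 = none).
    Px = P @ x
    k = Px / (lam + x @ Px)            # gain vector; no learning rate to tune
    e = d - w @ x                      # a-priori output error
    w_new = w + k * e                  # weight update
    P_new = (P - np.outer(k, Px)) / lam
    return w_new, P_new

# Toy usage with a hypothetical 64-weight CMAC and 4 active cells per input.
rng = np.random.default_rng(0)
n = 64
w, P = np.zeros(n), np.eye(n) * 1e3    # large initial P = weak prior
w_true = rng.standard_normal(n)        # hidden mapping to recover (toy data)
for _ in range(200):
    x = np.zeros(n)
    x[rng.choice(n, size=4, replace=False)] = 1.0
    w, P = rls_update(w, P, x, w_true @ x)

The cost the abstract's truncated final sentence presumably refers to is visible here: P is n-by-n, so each update is O(n^2) in the number of weights, which is the bottleneck a kernel formulation would target.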

Cited by 3 publications (2 citation statements) · References 9 publications (14 reference statements)
“…The gradient operator gives the direction toward the minimum of the performance function ξ, which in this case is estimated from the instantaneous squared error, i.e. ξ̂ = e_k². The convergence parameter μ is the step size of the adaptation toward the minimum of ξ. By applying (7) to (8), the gradient operator changes (9) into (10). After adaptation at the k-th iteration, the network output can be expressed as in (11) and the error as in (12); the error is reduced if the convergence parameter μ lies in the interval given by (13)…”
Section: Output Mapping (citation type: mentioning)
Confidence: 99%
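The symbols in this quotation were damaged in extraction; read alongside its wording, it appears to paraphrase the standard LMS derivation. A conventional reconstruction is sketched below in LaTeX; the correspondence to the citing paper's equations (7)-(13) is an assumption, not a quotation.

\hat{\xi}_k = e_k^{2}, \qquad e_k = d_k - \mathbf{w}_k^{\top}\mathbf{x}_k
\hat{\nabla}_k = \frac{\partial \hat{\xi}_k}{\partial \mathbf{w}_k} = -2\,e_k\,\mathbf{x}_k
\mathbf{w}_{k+1} = \mathbf{w}_k - \mu\,\hat{\nabla}_k = \mathbf{w}_k + 2\,\mu\,e_k\,\mathbf{x}_k
0 < \mu < \frac{1}{\lambda_{\max}}

Here μ is the convergence parameter (step size) and λ_max the largest eigenvalue of the input autocorrelation matrix; keeping μ inside this interval ensures the weights converge in the mean and the error shrinks across iterations.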
“…In that study the improvement occurred, but the convergence aspects related to the respective application constraints still need further investigation. The Kernel Recursive Least Squares algorithm, shown in [9], [10], [11], is used to improve CMAC training by employing the inverse QR decomposition [12] in the weight-updating rule to iteratively increase the column independence of the basis-function matrix. The improvement is achieved, but the need for computational resources remains a problem.…”
Section: Introduction (citation type: mentioning)
Confidence: 99%
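The quotation credits the improvement to an inverse QR decomposition inside the weight update. The paper's rule is iterative; the batch sketch below only illustrates why a QR factorisation of the basis-function matrix is preferable to the normal equations when columns are nearly dependent. All names are illustrative, and this is not the paper's algorithm.

import numpy as np

def qr_weights(A, d):
    # Solve the least-squares problem A w ~ d via QR instead of forming the
    # normal equations A^T A w = A^T d. With nearly dependent columns in the
    # basis-function matrix A, the normal equations square the condition
    # number (cond(A^T A) = cond(A)^2), while QR works at cond(A).
    Q, R = np.linalg.qr(A)             # A = Q R, R upper triangular
    return np.linalg.solve(R, Q.T @ d)

# Toy CMAC-like basis matrix: overlapping binary activations make the
# columns strongly correlated, the regime the quotation is concerned with.
rng = np.random.default_rng(1)
A = (rng.random((200, 32)) < 0.15).astype(float)
w = qr_weights(A, A @ rng.standard_normal(32))

An iterative inverse-QR variant propagates a triangular factor of the inverse correlation matrix through each update rather than refactorising, which preserves this conditioning advantage online but, as the quotation notes, at a nontrivial computational cost.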