Neural Networks and Statistical Learning 2019
DOI: 10.1007/978-1-4471-7452-3_11
Radial Basis Function Networks

Cited by 1 publication (2 citation statements) · References 100 publications
“…One of the learning algorithm approaches is just applying a BP algorithm as it is done in MLP. The most widely known algorithm is the stochastic gradient descent (RBFN‐SGD) [21–23]. The difference in relation to ANNs lies in the fact that the RBFN has different parameters to adapt throughout the learning process.…”

Section: Methods
confidence: 99%
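The statement above can be sketched in code. The following is a minimal toy setup of my own (an assumption, not the cited papers' exact model): a Gaussian RBF network trained by per-sample gradient descent, where, unlike an MLP, the adapted parameters are the centers c_j, widths s_j, and output weights w_j of the basis functions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(x, c, s):
    # Gaussian basis values: phi_j(x) = exp(-||x - c_j||^2 / (2 s_j^2))
    return np.exp(-np.sum((x - c) ** 2, axis=1) / (2 * s ** 2))

# Toy 1-D regression target (illustrative data, not from the papers)
X = rng.uniform(0, 1, size=(200, 1))
y = np.sin(2 * np.pi * X[:, 0])

J = 8                                 # number of hidden units
c = rng.uniform(0, 1, size=(J, 1))    # centers
s = np.full(J, 0.2)                   # widths
w = rng.normal(0, 0.1, size=J)        # output weights
lr = 0.02

def mse():
    return np.mean([(rbf(x, c, s) @ w - t) ** 2 for x, t in zip(X, y)])

mse_before = mse()
for epoch in range(200):
    for i in rng.permutation(len(X)):
        x, t = X[i], y[i]
        phi = rbf(x, c, s)
        err = phi @ w - t                      # e = f(x) - t
        d2 = np.sum((x - c) ** 2, axis=1)      # ||x - c_j||^2
        common = err * w * phi / s ** 2
        # SGD on E = 0.5 e^2: all three parameter groups adapt together
        w -= lr * err * phi                    # dE/dw_j = e * phi_j
        c -= lr * common[:, None] * (x - c)    # dE/dc_j = common_j (x - c_j)
        s -= lr * common * d2 / s              # dE/ds_j = common_j d2_j / s_j
        s = np.maximum(np.abs(s), 1e-2)        # keep widths away from zero
mse_after = mse()
```

After training, `mse_after` should be well below the untrained error, illustrating that centers and widths, not only output weights, are being learned.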
“…In RBFN‐SGD, all parameters are adapted simultaneously, whereas another approach is using a two‐stage strategy [21, 23, 24]. In the first stage, unsupervised techniques, such as clustering algorithms (e.g., k‐means), are performed with the purpose of computing the centers and, later, the widths.…”

Section: Methods
confidence: 99%
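The two-stage strategy can be sketched as follows. This is a hedged illustration under common assumptions (k-means for the centers, nearest-center distance for the widths, linear least squares for the output weights); the cited papers may use different heuristics for each stage.

```python
import numpy as np

rng = np.random.default_rng(1)

def kmeans(X, k, iters=50):
    # Plain Lloyd's algorithm: assign points to nearest center, recompute means
    c = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - c[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                c[j] = pts.mean(axis=0)
    return c

# Toy 1-D regression data (illustrative, not from the papers)
X = rng.uniform(0, 1, size=(200, 1))
y = np.sin(2 * np.pi * X[:, 0])

# Stage 1 (unsupervised): centers via k-means, then widths from the
# distance of each center to its nearest neighbor (one common heuristic)
k = 8
centers = kmeans(X, k)
d = np.sqrt(((centers[:, None] - centers[None]) ** 2).sum(-1))
np.fill_diagonal(d, np.inf)
widths = d.min(axis=1)

# Stage 2 (supervised): with centers and widths fixed, the output
# weights solve a linear least-squares problem
Phi = np.exp(-((X[:, None, :] - centers[None]) ** 2).sum(-1)
             / (2 * widths ** 2))
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

mse = np.mean((Phi @ w - y) ** 2)
```

Fixing the nonlinear parameters first makes the remaining fit linear in `w`, which is why the second stage reduces to least squares rather than gradient descent.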