2013
DOI: 10.1007/978-3-642-29764-9_8
A Connection between Extreme Learning Machine and Neural Network Kernel

Cited by 9 publications (2 citation statements)
References 10 publications
“…In elmNNR, the complexity of model selection is determined by the number of parameters (weight variance, number of hidden units, ...) and the ranges used for them. The complexity is O(H³), where H is the number of hidden units [57]. CUBIST is a rule-based model that is an extension of Quinlan's M5 model tree.…”
Section: Regression Models
confidence: 99%
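The O(H³) figure quoted above comes from the least-squares step of ELM training: the hidden layer is random and fixed, so fitting reduces to solving an H×H linear system for the output weights. A minimal sketch (my own toy illustration, not the elmNNR source; all names and data are assumptions):

```python
import numpy as np

# Sketch of why ELM fitting costs O(H^3): the only trained parameters are
# the output weights beta, obtained from an H x H regularized linear solve.
rng = np.random.default_rng(0)

N, d, H = 200, 3, 50            # samples, input dim, hidden units (toy sizes)
X = rng.normal(size=(N, d))
y = np.sin(X).sum(axis=1)       # toy regression target

W = rng.normal(size=(d, H))     # random input weights (never trained)
b = rng.normal(size=H)          # random biases
Phi = np.tanh(X @ W + b)        # N x H hidden-layer output matrix

# Regularized normal equations: (Phi^T Phi + lam*I) beta = Phi^T y.
# Factorizing the H x H matrix dominates the cost at O(H^3).
lam = 1e-6
beta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(H), Phi.T @ y)

pred = Phi @ beta
print(np.mean((pred - y) ** 2))  # training mean-squared error
```

Model selection then repeats this solve over a grid of hyperparameters (weight variance, H, ...), which is why its cost scales with the O(H³) of each fit.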
“…In elmNNR, complexity of model selection is determined by the number of parameters (weight variance, number of hidden units...) and the ranges used for them. The complexity is O(H 3 ), where H is the number of hidden units [57]. CUBIST is a rule-based model that is an extension of Quinlan's M5 model tree.…”
Section: Regression Modelsmentioning
confidence: 99%
“…The greatest disadvantage of a standard ELM is that a non-linear transformation with randomly selected parameters may be unable to represent all important features of the input space. Hence, a large number of neurons is necessary, which generates the numerical problems described above and in [39][40][41][42][43][44].…”
Section: ELM with Input-Dependent Output Weights, 4.1 New Network Structure
confidence: 99%
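The trade-off described in this statement is easy to observe numerically: with few random hidden units the fit is poor, and accuracy only improves as the hidden layer grows. A hedged toy demonstration (my own construction, not code from the cited works; the function name and data are assumptions):

```python
import numpy as np

# Toy illustration: random tanh features may miss important structure of the
# input space when H is small, so a standard ELM typically needs many
# neurons before the training error becomes small.
rng = np.random.default_rng(1)

def elm_train_mse(H, X, y, lam=1e-6):
    """Fit an ELM with H random tanh units; return training MSE."""
    d = X.shape[1]
    W = rng.normal(size=(d, H))          # random, untrained input weights
    b = rng.normal(size=H)
    Phi = np.tanh(X @ W + b)             # N x H hidden-layer outputs
    beta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(H), Phi.T @ y)
    return float(np.mean((Phi @ beta - y) ** 2))

X = rng.uniform(-3, 3, size=(300, 2))
y = np.sin(X[:, 0]) * np.cos(X[:, 1])    # target with fine-grained structure

for H in (5, 50, 500):
    print(H, elm_train_mse(H, X, y))     # error shrinks as H grows
```

Growing H this way is exactly what triggers the numerical issues the statement refers to: the H×H system to be solved becomes large and increasingly ill-conditioned.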