2019
DOI: 10.1016/j.neucom.2019.08.069
Extreme learning machine with local connections

Abstract: This paper is concerned with the sparsification of the input-hidden weights of ELM (Extreme Learning Machine). For ordinary feedforward neural networks, sparsification is usually achieved by introducing a regularization technique into the learning process of the network. But this strategy cannot be applied to ELM, since the input-hidden weights of ELM are supposed to be randomly chosen rather than learned. To this end, we propose a modified ELM, called ELM-LC (ELM with local connections), which i…

Cited by 8 publications (4 citation statements); References 36 publications.
“…The average RMSE on the testing data over a 4 s period was 2.2335 × 10⁻³, lower than the 4.0175 × 10⁻³ obtained for OS-ELM. The authors of [24] proposed a modified ELM with local connections (ELM-LC), designed to sparsify the input-hidden weights. The input and hidden nodes are divided into groups, each group of input nodes is connected to exactly one group of hidden nodes, and the hidden-output weights are computed by least-squares learning.…”
Section: Extreme Learning Machines
confidence: 99%
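The grouped, locally connected scheme described in the statement above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the function names, the activation (`tanh`), and the equal-sized feature groups are assumptions; the essential point is that each input group feeds only its own block of hidden nodes, while the hidden-output weights are solved by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_lc_fit(X, T, n_groups, hidden_per_group):
    """Sketch of ELM-LC: input features are split into n_groups groups,
    and each group is randomly connected only to its own hidden block."""
    n, d = X.shape
    groups = np.array_split(np.arange(d), n_groups)  # feature index groups
    blocks, H_cols = [], []
    for idx in groups:
        # random, untrained input-hidden weights for this block only
        W = rng.standard_normal((len(idx), hidden_per_group))
        b = rng.standard_normal(hidden_per_group)
        H_cols.append(np.tanh(X[:, idx] @ W + b))
        blocks.append((idx, W, b))
    H = np.hstack(H_cols)            # block-diagonal hidden layer output
    beta = np.linalg.pinv(H) @ T     # least-squares hidden-output weights
    return blocks, beta

def elm_lc_predict(X, blocks, beta):
    H = np.hstack([np.tanh(X[:, idx] @ W + b) for idx, W, b in blocks])
    return H @ beta
```

Because each hidden block sees only its own feature group, the effective input-hidden weight matrix is block-diagonal, i.e. sparse by construction rather than by regularization.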
“…Based on the above index system, and in order to further improve the accuracy of classroom teaching evaluation, the objective function is constructed on an extreme learning machine (ELM). The ELM network has a simple structure and fast learning speed, and uses the Moore-Penrose generalized inverse to solve for the network weights, yielding a small weight norm. This avoids many problems of iteratively trained networks, such as local minima, excessive iterations, and the tuning of performance indices and learning rates, and achieves good network generalization [17]. The model structure is shown in Fig.…”
Section: Evaluation Model Of Intelligent Teaching Effect Based On CS-…
confidence: 99%
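The one-step Moore-Penrose solve mentioned above is the core of standard ELM training. A minimal sketch, assuming a `tanh` activation and hypothetical function names (not taken from the cited work):

```python
import numpy as np

def elm_fit(X, T, n_hidden, seed=0):
    """Standard ELM: random, fixed input-hidden weights; the output
    weights are the minimum-norm least-squares solution, obtained in
    one step via the Moore-Penrose pseudoinverse (no iteration, no
    learning rate, no risk of local minima in the solve)."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # never trained
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                           # hidden layer output
    beta = np.linalg.pinv(H) @ T                     # minimum-norm solution
    return W, b, beta
```

Because `np.linalg.pinv` returns the minimum-norm least-squares solution, the resulting weight norm is small, which is exactly the generalization argument the statement invokes.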
“…The development of convolutional neural networks provides the theoretical basis and technical support for carrying out research on the identification of gears and gear surface damage in this paper. 7 Convolutional neural networks utilize four key ideas: sparse connectivity, 8 weight sharing, 9 pooling, 10 and multiple network layers. 11 Convolutional neural networks are widely used in detection, segmentation, object recognition, and various areas of image analysis, 12 which can recognize gears and their surface damage in images at the pixel level.…”
Section: Introduction
confidence: 99%
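Two of the four key ideas named above, sparse connectivity and weight sharing, plus pooling, can be illustrated with a toy 1-D example. This sketch is purely didactic and assumes nothing from the cited works; the function names are invented:

```python
import numpy as np

def conv1d_valid(x, k):
    """1-D 'valid' cross-correlation: the same small kernel k slides
    over x (weight sharing), and each output depends on only len(k)
    inputs (sparse connectivity)."""
    n = len(x) - len(k) + 1
    return np.array([np.dot(x[i:i + len(k)], k) for i in range(n)])

def max_pool(x, size):
    """Non-overlapping max pooling: keeps the strongest response in
    each window, reducing resolution and adding local invariance."""
    trimmed = x[: len(x) // size * size]
    return trimmed.reshape(-1, size).max(axis=1)
```

Stacking such convolution and pooling stages gives the "multiple network layers" idea: deeper layers see progressively larger input regions through these small, shared kernels.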