2018
DOI: 10.3390/sym10100474

Evolutionary Hierarchical Sparse Extreme Learning Autoencoder Network for Object Recognition

Abstract: Extreme learning machine (ELM), characterized by its fast learning efficiency and great generalization ability, has been applied to various object recognition tasks. When extended to the stacked autoencoder network, which is a typical symmetrical representation learning model architecture, ELM manages to realize hierarchical feature extraction and classification, which is what deep neural networks usually do, but with much less training time. Nevertheless, the input weights and biases of the hidden nodes in ELM…
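To make the mechanism described in the abstract concrete, below is a minimal sketch of a stacked ELM autoencoder in Python/NumPy. It is not the paper's evolutionary hierarchical sparse model; the layer sizes, the ridge parameter `reg`, and the helper names (`elm_ae_layer`, `elm_classifier`) are illustrative assumptions. It only shows the core ELM idea: hidden-layer weights and biases are drawn at random, and the output weights are obtained in closed form by regularized least squares, layer by layer.

```python
# Minimal sketch of a stacked ELM autoencoder (illustrative only).
# Layer sizes, regularization, and helper names are assumptions, not
# the evolutionary hierarchical sparse ELM proposed in the paper.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def elm_ae_layer(X, n_hidden, reg=1e-3, rng=None):
    """One ELM autoencoder layer: random projection, closed-form decoder.

    The decoder weights (transposed) serve as the learned feature mapping
    that transforms X into the input of the next layer.
    """
    rng = np.random.default_rng(rng)
    W = rng.standard_normal((X.shape[1], n_hidden))   # random input weights
    b = rng.standard_normal(n_hidden)                 # random biases
    H = sigmoid(X @ W + b)                            # hidden activations
    # Ridge-regularized least squares: beta reconstructs X from H.
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ X)
    return sigmoid(X @ beta.T)                        # features for next layer

def elm_classifier(X, Y, n_hidden, reg=1e-3, rng=None):
    """Plain ELM classifier head: random hidden layer, least-squares readout."""
    rng = np.random.default_rng(rng)
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = sigmoid(X @ W + b)
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ Y)
    return lambda Xnew: sigmoid(Xnew @ W + b) @ beta

# Toy usage: two stacked ELM-AE layers followed by an ELM classifier.
X = np.random.rand(200, 64)                    # 200 samples, 64-dim inputs
Y = np.eye(10)[np.random.randint(0, 10, 200)]  # one-hot labels, 10 classes
F1 = elm_ae_layer(X, n_hidden=128, rng=0)
F2 = elm_ae_layer(F1, n_hidden=64, rng=1)
predict = elm_classifier(F2, Y, n_hidden=256, rng=2)
print(predict(X[:5]).argmax(axis=1))           # predicted classes for 5 samples
```

Because every layer is solved in closed form rather than by backpropagation, training time stays low; the open question the paper addresses is how to choose the randomly generated input weights and biases more effectively.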

Cited by 6 publications (8 citation statements)
References 30 publications

“…Inspired by polynomial approximation of non-linear functions, we use a multipath and piecewise linear approximation in the design of the fast sparse DNN (FSDNN) network, thus combining the hidden layer weight matrix and biased random sampling with sparsity, and further design the corresponding low-time-consuming iterative optimization algorithm. In addition, the proposed FSDNN in this paper is different from the previous models [28]- [30] in terms of the core modules of the network, framework design and optimization algorithm. The main contributions of this study can be summarized as follows.…”
Section: Introduction
confidence: 89%
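The excerpt above mentions combining the hidden-layer weight matrix and bias with random sampling under a sparsity constraint. As a purely generic illustration of that idea (the actual FSDNN construction is not specified here, and `sparse_random_weights` and its `density` parameter are assumptions), one way to draw a sparse random hidden layer looks like this:

```python
# Illustrative only: a sparse random hidden-layer weight matrix and bias.
# The real FSDNN sampling scheme may differ; names and density are assumed.
import numpy as np

def sparse_random_weights(n_in, n_hidden, density=0.1, rng=None):
    """Random Gaussian weights with a Bernoulli sparsity mask."""
    rng = np.random.default_rng(rng)
    mask = rng.random((n_in, n_hidden)) < density      # keep ~10% of entries
    W = rng.standard_normal((n_in, n_hidden)) * mask   # zero out the rest
    b = rng.standard_normal(n_hidden)
    return W, b

W, b = sparse_random_weights(64, 256, density=0.1, rng=0)
print(f"nonzero fraction: {np.count_nonzero(W) / W.size:.3f}")
```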
“…We present a schematic illustration of the proposed iDRP framework workflow, including the realization of the universal approximations of the Multilayer Perceptron (MLP) with multiple hidden layers by exploiting the capabilities of H-ELM and other related techniques [30]–[33], which have considerable underexplored potential for accelerated speed, rapid feature learning, and improved classification performance [30], [31], [34], [35]. Figure 1 shows the setup of the experimental workflow of the iDRP framework.…”
Section: Methods
confidence: 99%
“…Data augmentation and propagation techniques [30], [31], [37], which helped with the 'fine-grained extraction and classification' of signatures in complex datasets, were applied in the implementation of the proposed iDRP framework. The framework is implemented in the Python programming language (https://www.python.org/), an open-source language with rich and extensive libraries that is widely accessible.…”
Section: Conceptual Implementation of Proposed Framework
confidence: 99%