2019
DOI: 10.1109/tnnls.2018.2866622
Universal Approximation Capability of Broad Learning System and Its Structural Variations

Cited by 391 publications (145 citation statements)
References 26 publications
“…For example, Zhang et al. [30] provided evidence that face recognition with BLS can remain unaffected by intense illumination and occlusion of facial features while maintaining prominent accuracy. Chen et al. [31] demonstrated the superiority of BLS and its variants over several existing learning algorithms in time series prediction and in regression on a face recognition database. Compared with other classic structures, the efficiency and effectiveness of the BLS variants have been fully verified.…”
Section: Broad Learning System
confidence: 99%
“…In contrast to deep learning, which incurs an expensive computational cost for network training, the broad learning architecture offers a computationally efficient alternative (Chen et al., 2019). In a typical deep learning architecture (Goodfellow et al., 2016; Schmidhuber, 2015), the neural network is configured as a stack of hierarchical layers, and modifying the architecture requires retraining the entire network.…”
Section: Nonparametric Spatial Modeling
confidence: 99%
“…Lary, Alavi, Gandomi, and Walker (2016) compared the features of several machine learning techniques and their performance on geoscience problems. Chen, Liu, and Feng (2019) compared four machine learning techniques, including Bayes' net, radial basis function classifier, logistic model tree, and random forest models, for landslide susceptibility modeling. Nabian and Meidani (2018) presented a deep learning framework for seismic reliability analysis of transportation networks.…”
Section: Introduction
confidence: 99%
“…Also, the structure of this network can be expanded flexibly in a broad sense. Like a deep neural network, the approximation capability of an incremental learning network is universal (Chen et al., 2018). Hence, it has been successfully applied in various fields owing to its efficient modeling capability and fast learning ability.…”
Section: Introduction
confidence: 99%
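The excerpts above describe the broad learning architecture only in outline: random feature nodes, nonlinear enhancement nodes, and a closed-form output layer that avoids retraining the whole network. A minimal sketch of that flat structure is given below; the network sizes, `tanh` activation, ridge parameter, and toy regression task are illustrative assumptions, not details taken from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: learn y = sin(x) from 200 samples.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X)

# Feature nodes: random linear maps of the input followed by a
# nonlinearity (BLS typically sparsifies these weights; plain
# random weights are assumed here for simplicity).
Wf = rng.normal(size=(1, 20))
bf = rng.normal(size=(1, 20))
Z = np.tanh(X @ Wf + bf)

# Enhancement nodes: random nonlinear expansions of the feature nodes.
We = rng.normal(size=(20, 40))
be = rng.normal(size=(1, 40))
H = np.tanh(Z @ We + be)

# Broad layer: feature and enhancement nodes placed side by side,
# widening the network instead of deepening it.
A = np.hstack([Z, H])

# Output weights via a ridge-regularized least-squares solve: a
# closed form, so no iterative training of the hidden weights.
lam = 1e-3
W = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)

pred = A @ W
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
print(rmse)
```

Adding more enhancement nodes only appends columns to `A`, which is why the structure can be expanded without retraining the existing weights from scratch, in line with the incremental-learning property the excerpts mention.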