2007
DOI: 10.1142/s0218488507004881

A Neuro-Fuzzy Model for Dimensionality Reduction and Its Application

Abstract: A novel neuro-fuzzy approach to nonlinear dimensionality reduction is proposed. The approach is an auto-associative modification of the Neuro-Fuzzy Kolmogorov's Network (NFKN) with a "bottleneck" hidden layer. Two training algorithms are considered. The validity of theoretical results and the advantages of the proposed model are confirmed by an experiment in nonlinear principal component analysis and an application in the visualization of high-dimensional wastewater treatment plant data.
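The abstract does not give the NFKN equations, but the general idea of an auto-associative "bottleneck" network for nonlinear principal component analysis can be sketched with a plain tanh autoencoder in NumPy. This is a hedged stand-in, not the paper's neuro-fuzzy model: the network here uses ordinary tanh units rather than fuzzy membership functions, and all dimensions, data, and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 3-D points lying near a 1-D nonlinear curve (plus small noise),
# the kind of structure nonlinear PCA is meant to recover.
t = rng.uniform(-1, 1, size=(200, 1))
X = np.hstack([t, t**2, np.sin(np.pi * t)]) + 0.01 * rng.normal(size=(200, 3))

n, h = 3, 1                          # input dimension, bottleneck dimension
W1 = 0.1 * rng.normal(size=(n, h))   # encoder weights
W2 = 0.1 * rng.normal(size=(h, n))   # decoder weights

def mse(X):
    Z = np.tanh(X @ W1)              # bottleneck activation ("principal component")
    Xhat = Z @ W2                    # linear reconstruction of the input
    return float(np.mean((X - Xhat) ** 2))

lr = 0.05
err0 = mse(X)
for _ in range(500):                 # plain batch gradient descent on the
    Z = np.tanh(X @ W1)              # auto-associative reconstruction error
    Xhat = Z @ W2
    E = Xhat - X
    gW2 = Z.T @ E / len(X)           # gradient w.r.t. decoder weights
    gZ = E @ W2.T * (1 - Z**2)       # backprop through tanh bottleneck
    gW1 = X.T @ gZ / len(X)          # gradient w.r.t. encoder weights
    W1 -= lr * gW1
    W2 -= lr * gW2
err1 = mse(X)
```

After training, the 1-D bottleneck activation plays the role of a nonlinear principal component, and the reconstruction error drops below its initial value; the paper's contribution is replacing the generic hidden units with neuro-fuzzy ones and supplying dedicated training algorithms.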

Cited by 7 publications (4 citation statements). References 14 publications.
“…They used Lyapunov exponents for feature extraction, and ANFIS is used for classification. Kolodyazhniy et al. [55] used PCA for dimension reduction and the Neuro-Fuzzy Kolmogorov's Network for classification of wastewater treatment plant data. Schclar et al. [56] ensembled various models based on dimensionality reduction.…”
Section: Literature Survey
confidence: 99%
“…and nh tuned synaptic weights w_{ilj}^{[2]}. In total, the autoencoder contains 2nmh tuned synaptic weights and (n + m)h membership functions, which is significantly less than in the systems described in [14–17].…”
Section: Autoencoder Based on GNFNs
confidence: 99%
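The parameter counts quoted in the excerpt above can be checked with a line of arithmetic. The concrete dimensions below (n inputs, m outputs, h hidden neurons) are illustrative assumptions, not values taken from the cited paper:

```python
# Parameter counts as stated in the excerpt: 2nmh synaptic weights
# and (n + m)h membership functions. Dimensions are hypothetical.
n, m, h = 10, 10, 3

synaptic_weights = 2 * n * m * h        # 2nmh = 600 for these dimensions
membership_functions = (n + m) * h      # (n + m)h = 60 for these dimensions
```

The membership-function count grows only linearly in the input and output dimensions, which is the source of the claimed saving over the systems in [14–17].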
“…In distinction from the autoencoders described in [14–17], the proposed system contains fewer membership functions (which reduces the cost of its computational implementation) and its learning algorithm converges quickly due to an optimized choice of learning-rate parameters.…”
Section: The Learning of the GNFN Autoencoder
confidence: 99%