2015 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2015.7280318

The on-line curvilinear component analysis (onCCA) for real-time data reduction

Cited by 18 publications (13 citation statements); references 6 publications.

“…Finally, to address the issues regarding the curse of dimensionality, effective Dimensionality Reduction (DR) techniques, which also allow for online performance, could be required [111]. The Growing Curvilinear Component Analysis (GCCA) [113] enables a non-linear distance-preserving reduction technique [112] by means of a self-organized incremental neural network architecture, leading to applications in the life sciences [39].…”
Section: Discussion
confidence: 99%
“…the data projection, is determined as in CCA. On the contrary, if the data fails the novelty test, the first-winner and its neighbors adapt their weight vectors in the X space by means of Soft Competitive Learning (SCL [33,34]); as in the previous case, their projections are updated as in CCA.…”
Section: The Growing Curvilinear Component Analysis
confidence: 99%
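
To make the quoted adaptation branch concrete, the following is a minimal sketch of a soft-competitive-learning step in the data (X) space. The rank-based neighbourhood, the learning rate, and the helper names in the comments (novelty_test, cca_update) are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def scl_adapt(weights, x, alpha=0.1, sigma=1.0):
        # Soft Competitive Learning (illustrative): every unit moves towards the
        # input x with a strength that decays with its distance rank, so the
        # first-winner moves the most and far-away units barely move.
        d = np.linalg.norm(weights - x, axis=1)   # distances in the X (data) space
        ranks = np.argsort(np.argsort(d))         # rank 0 = first-winner
        h = np.exp(-ranks / sigma)                # soft neighbourhood factor
        return weights + alpha * h[:, None] * (x - weights)

    # Hypothetical use inside a GCCA-like loop for one incoming sample x:
    #   if not novelty_test(x, weights):          # sample is not "new"
    #       weights = scl_adapt(weights, x)       # adapt winner and neighbours
    #       projections = cca_update(...)         # then refresh projections as in CCA
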
“…A seed is a pair of neurons, made of a neuron and its double, whose weight vector is given by hard competitive learning (HCL [33,34]). Neuron-doubling is done every time the first-winner is the top of a bridge departing from the second-closest neuron (i.e.…”
Section: The Growing Curvilinear Component Analysis
confidence: 99%
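
The hard-competitive-learning rule mentioned for the seed can be sketched as below; the function names and the learning rate are assumptions for illustration, and the bridge test that triggers neuron-doubling is only hinted at in a comment.

    import numpy as np

    def hcl_adapt(weights, x, alpha=0.1):
        # Hard Competitive Learning (illustrative): only the first-winner,
        # i.e. the unit closest to x, is moved towards the input.
        d = np.linalg.norm(weights - x, axis=1)
        winner = int(np.argmin(d))
        weights = weights.copy()
        weights[winner] += alpha * (x - weights[winner])
        return weights, winner

    def make_seed(w):
        # A "seed" in the quoted sense: a neuron and its double, both starting
        # from the HCL-trained weight vector w (the bridge condition that
        # triggers the doubling is not reproduced here).
        return np.stack([w.copy(), w.copy()])
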
“…CCA (Demartines & Herault; Cirrincione et al.) is a non-linear projection method that preserves distance relationships between the input and output spaces. CCA is a useful method for representing redundant and non-linear data structures and can be used for dimensionality reduction.…”
Section: Techniques Applied To Validate the Proposed Model
confidence: 99%
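
The distance-preserving idea behind CCA can be illustrated with a minimal stochastic update: one projected point is pinned and the others are moved so that their output-space distances to it approach the corresponding input-space distances, but only inside a neighbourhood radius. The step-function weighting and the parameter names are simplifying assumptions, not the exact formulation of the cited papers.

    import numpy as np

    def cca_step(X, Y, i, alpha=0.05, lam=1.0):
        # One illustrative CCA update: pin projected point i and move every other
        # point j so that its output-space distance to i approaches the input-space
        # distance, as long as the output distance is below the radius lam.
        dx = np.linalg.norm(X - X[i], axis=1)     # distances to point i, input space
        dy = np.linalg.norm(Y - Y[i], axis=1)     # distances to point i, output space
        Y_new = Y.copy()
        for j in range(len(Y)):
            if j == i or dy[j] == 0.0:
                continue
            F = 1.0 if dy[j] < lam else 0.0       # step-function neighbourhood weighting
            Y_new[j] += alpha * F * (dx[j] - dy[j]) * (Y[j] - Y[i]) / dy[j]
        return Y_new

    # Sweeping i over all points while slowly shrinking alpha and lam gives a rough,
    # batch-style approximation of the projection; the on-line variants discussed
    # above instead process one new sample at a time.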