2021
DOI: 10.1371/journal.pone.0248896

Adaptive dimensionality reduction for neural network-based online principal component analysis

Abstract: “Principal Component Analysis” (PCA) is an established linear technique for dimensionality reduction. It performs an orthonormal transformation to replace possibly correlated variables with a smaller set of linearly independent variables, the so-called principal components, which capture a large portion of the data variance. The problem of finding the optimal number of principal components has been widely studied for offline PCA. However, when working with streaming data, the optimal number changes continuously…
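The abstract concerns neural network-based online PCA in which the number of retained components is adapted as data stream in. As a rough sketch of that idea (not the authors' algorithm: the learning rule shown is Sanger's Generalized Hebbian Algorithm, and the growth rule, thresholds, and synthetic stream are illustrative assumptions), the component count is grown whenever a running explained-variance estimate drops below a target:

```python
# Sketch of neural-network-style online PCA (Sanger's rule / Generalized
# Hebbian Algorithm) with a simple adaptive component count. The adaptation
# rule, thresholds, and synthetic data are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

def sanger_step(W, x, lr=1e-3):
    """One GHA update: W has shape (k, d), x has shape (d,)."""
    y = W @ x                                    # project the sample onto current components
    W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W, y

d, k = 10, 2                                     # 10-D stream, start with 2 components
A = rng.normal(size=(d, 3))                      # data lie mostly in a 3-D subspace
W = rng.normal(scale=0.1, size=(k, d))

total_var, captured_var = 0.0, 0.0
for t in range(1, 20001):
    x = A @ rng.normal(size=3) + 0.05 * rng.normal(size=d)
    W, y = sanger_step(W, x)
    # Exponential running estimates of total vs. captured variance.
    total_var = 0.999 * total_var + 0.001 * float(x @ x)
    captured_var = 0.999 * captured_var + 0.001 * float(y @ y)
    # Adaptive dimensionality (illustrative rule): add a component when the
    # current ones explain too little of the streaming variance.
    if t % 2000 == 0 and k < d and captured_var / max(total_var, 1e-12) < 0.95:
        W = np.vstack([W, rng.normal(scale=0.1, size=(1, d))])
        k += 1

print("components kept:", k)
print("explained-variance ratio:", round(captured_var / total_var, 3))
```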

Citations: cited by 28 publications (7 citation statements)
References: 47 publications (103 reference statements)

“…The proposed method evaluates the clusters with the performance metrics given in Equations 5 to 7 and compares them with the existing methods discussed in the literature review (values from the citing paper's Table 3):

Method (ref)  | Clusters | Metric 1 (%) | Metric 2 (%) | Metric 3 (%)
K-Means (2)   | 3        | 75.34        | 84.24        | 83.56
DBSCAN (5)    | 5, 7     | 83.15        | 89.00        | 90.31
GSVM (6)      | 5        | 79.76        | 90.45        | 92.22
SOM (8)       | 5        | 75.00        | 78.73        | 93.51
PCA (11)      | 3, 5     | 85.69        | 94.88        | 93.11
ocSVM (15)    | 3, 5     | 80.23        | 92.57        | 90.55
ICA (20)      | 3        | (remaining values truncated in the extraction)

From Table 3, K-Means processes the three cluster classes and reaches an accuracy of 83.56%, with a low energy consumption rate for computing the cluster set; the algorithm improves the classification to 84.24% compared with the other network approaches when identifying the attacking nodes.…”
Section: Results (citation type: mentioning; confidence: 99%)
“…Migenda N et al. (11) have suggested dimensionality reduction methods such as PCA with numerical control charts for guided-wave-based unsupervised damage assessment. This method uses PCA as a feature extractor to decrease the dimensionality of the input space.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
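The citing work above uses PCA purely as a feature extractor to shrink the input space before a downstream model. A minimal sketch of that pattern (assuming scikit-learn; the synthetic data, 95% variance target, and logistic-regression classifier are illustrative choices, not the setup of the cited study):

```python
# Sketch: PCA as a feature extractor in front of a classifier.
# Data, variance target, and downstream model are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 5))                   # 5 underlying factors
X = latent @ rng.normal(size=(5, 40)) + 0.1 * rng.normal(size=(500, 40))
y = (latent[:, 0] > 0).astype(int)                   # label driven by the first factor

model = make_pipeline(
    StandardScaler(),                                # PCA is scale-sensitive
    PCA(n_components=0.95),                          # keep enough components for 95% variance
    LogisticRegression(max_iter=1000),
)
model.fit(X, y)
print("reduced dimensionality:", model.named_steps["pca"].n_components_)
print("training accuracy:", round(model.score(X, y), 3))
```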
“…Spearman correlation was measured between independent variables. Another method of applying explanatory statistics included PCA, used to orthogonalize the high-dimensional original dataset into useful latent components with eigenvalues equal to 1 or greater, by the KMO test [36]. All analyses were carried out in the environment of the R-project for statistical computing and graphics [37].…”
Section: Discussion (citation type: mentioning; confidence: 99%)
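The retention rule quoted above (keep components with eigenvalues of 1 or greater) is commonly applied to PCA on the correlation matrix, where an eigenvalue of 1 equals the variance of a single standardized variable. A small sketch of that rule, together with the pairwise Spearman correlations mentioned in the quote (synthetic data; names and sizes are illustrative assumptions):

```python
# Sketch of Spearman correlations plus the eigenvalue >= 1 retention rule
# for PCA on the correlation matrix. Data and dimensions are illustrative.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 4))
X = latent @ rng.normal(size=(4, 12)) + 0.5 * rng.normal(size=(200, 12))

rho, pval = spearmanr(X)                       # 12 x 12 pairwise Spearman correlations

# PCA on the correlation matrix of standardized variables.
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
R = np.corrcoef(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]              # sort eigenvalues in descending order
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

keep = eigvals >= 1.0                          # retain components worth at least one variable
scores = Z @ eigvecs[:, keep]                  # latent component scores
print("eigenvalues:", np.round(eigvals, 2))
print("components retained:", int(keep.sum()))
```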
“…A box-plot diagram was produced to describe the radicle-to-hypocotyl ratio for samples from the ecotoxicological bioassay, separating them using the post-hoc Tukey test. Principal component analysis [16] was conducted to extract functional relationships between respirometry and ecotoxicology. All analyses were performed in the environment of the R-project software for statistical computing and graphics [17].…”
Section: Methods (citation type: mentioning; confidence: 99%)
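The quoted methods combine a post-hoc Tukey comparison of group means with PCA. A brief sketch of the Tukey HSD step (assuming SciPy 1.8 or newer; the groups and values are synthetic placeholders, not data from the cited bioassay):

```python
# Sketch of a post-hoc Tukey HSD comparison of group means.
# Groups and values are synthetic placeholders; requires SciPy >= 1.8.
import numpy as np
from scipy.stats import tukey_hsd

rng = np.random.default_rng(0)
control = rng.normal(loc=1.0, scale=0.2, size=20)    # e.g. radicle-to-hypocotyl ratios
low_dose = rng.normal(loc=0.9, scale=0.2, size=20)
high_dose = rng.normal(loc=0.6, scale=0.2, size=20)

res = tukey_hsd(control, low_dose, high_dose)
print(res)                                           # pairwise mean differences with CIs
print("p-values:\n", np.round(res.pvalue, 4))        # which group pairs differ significantly
```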