2019 · DOI: 10.3390/make1010020

Recent Advances in Supervised Dimension Reduction: A Survey

Abstract: Recently, we have witnessed explosive growth in both the quantity and dimensionality of generated data, which aggravates the high-dimensionality challenge in tasks such as predictive modeling and decision support. To date, a large number of unsupervised dimension reduction methods have been proposed and studied, but no review has focused specifically on the supervised dimension reduction problem. Most studies perform classification or regression after applying unsupervised dimension reduction. However,…

Cited by 87 publications (38 citation statements) · References 76 publications
“…DR approaches can be separated into feature selection and feature extraction (FE). The former focuses on selecting a characteristic subset while feature extraction, also referred to as feature projection, transforms the data into a representation of fewer dimensions [31]. As neither the geometry of the data nor the intrinsic dimensionality is known, DR is an ill-posed problem enforcing the assumption of certain data properties [98].…”
Section: Dimensionality Reduction · citation type: mentioning · confidence: 99%
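The distinction quoted above can be illustrated with a minimal NumPy sketch; variance-based ranking and PCA are illustrative stand-ins here, not methods prescribed by the survey:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))               # 100 samples, 10 features

# Feature selection: keep the k original columns with the highest variance.
k = 3
variances = X.var(axis=0)
selected_idx = np.argsort(variances)[-k:]    # indices of the k most variable features
X_selected = X[:, selected_idx]              # still original features, just a subset

# Feature extraction (projection): PCA via SVD builds k *new* features
# as linear combinations of all original ones.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
X_projected = Xc @ Vt[:k].T                  # coordinates along top-k principal directions

print(X_selected.shape, X_projected.shape)   # (100, 3) (100, 3)
```

Both outputs have the same reduced shape, but only the projected version mixes information from all ten original features.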
“…An interesting proposal to counter this is to first increase the dimensionality to obtain a better feature representation and then to reduce it with RP [96]. Non-negative Matrix Factorization (NMF), often used in domains such as astronomy, offers good interpretability owing to the non-negative entries of the factorized matrices [31].…”
Section: Linear Feature Extraction · citation type: mentioning · confidence: 99%
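The lift-then-project idea from [96] can be sketched as follows; the degree-2 monomial expansion is only a hypothetical stand-in for the feature enrichment step, and the Gaussian matrix is a standard random-projection choice:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 5))                 # low-dimensional raw data

# Step 1 (lift): expand features, here with all degree-2 monomials,
# to obtain a richer intermediate representation.
pairs = [(i, j) for i in range(5) for j in range(i, 5)]
X_lifted = np.column_stack([X] + [X[:, i] * X[:, j] for i, j in pairs])  # 5 + 15 = 20 dims

# Step 2 (reduce): Gaussian random projection (RP) down to d dimensions;
# by the Johnson-Lindenstrauss lemma this approximately preserves distances.
d = 8
R = rng.normal(size=(X_lifted.shape[1], d)) / np.sqrt(d)
X_rp = X_lifted @ R

print(X_lifted.shape, X_rp.shape)            # (50, 20) (50, 8)
```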
“…Dimensionality reduction is one of the most popular techniques to remove noisy and unnecessary features. Dimensionality reduction techniques can be categorized into feature extraction [1] and feature selection [2]. Feature extraction approaches make combinations of original features to build new features.…”
Section: Introduction · citation type: mentioning · confidence: 99%
“…Comparative evaluation: full-feature dataset vs feature-selected subset. Mean 3-fold ROC AUC (± SD) per classifier:

Dataset  | Feature set        | GNB         | LR          | KNN         | SVC
ds5.csv  | Full features      | 0.83 ± 0.06 | 0.85 ± 0.03 | 0.82 ± 0.02 | 0.84 ± 0.05
ds5.csv  | (unlabeled subset) | 0.84 ± 0.05 | 0.85 ± 0.03 | 0.79 ± 0.02 | 0.81 ± 0.05
ds5.csv  | Stepwise selection | 0.85 ± 0.05 | 0.86 ± 0.04 | 0.82 ± 0.03 | 0.85 ± 0.05
ds6.csv  | Full features      | 0.77 ± 0.21 | 0.84 ± 0.09 | 0.81 ± 0.14 | 0.94 ± 0.06
ds6.csv  | (unlabeled subset) | 0.77 ± 0.21 | 0.95 ± 0.04 | 0.82 ± 0.12 | 0.88 ± 0.07
ds6.csv  | Stepwise selection | 0.79 ± 0.21 | 0.93 ± 0.06 | 0.84 ± 0.14 | 0.91 ± 0.06
ds7.csv  | Full features      | 0.63 ± 0.07 | 0.71 ± 0.07 | 0.57 ± 0.03 | 0.69 ± 0.02
ds7.csv  | (unlabeled subset) | 0.66 ± 0.10 | 0.76 ± 0.03 | (quote truncated) |

Per-fold AUCs and the overall mean ROC values are omitted here; the quoted passage is truncated mid-entry.…”
Citation type: mentioning · confidence: 99%
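The per-fold and mean ± SD AUC values quoted above can be reproduced in outline with a small NumPy sketch; the data here are synthetic, not the study's datasets, and the rank-statistic AUC is a standard formulation rather than the cited authors' code:

```python
import numpy as np

def roc_auc(y_true, scores):
    """ROC AUC via the Mann-Whitney rank-sum statistic (ties get average ranks)."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores), dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)
    for s in np.unique(scores):              # average ranks over tied scores
        mask = scores == s
        ranks[mask] = ranks[mask].mean()
    n_pos = int(y_true.sum())
    n_neg = len(y_true) - n_pos
    return (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

rng = np.random.default_rng(2)
y = rng.integers(0, 2, size=90)
scores = y + rng.normal(scale=1.0, size=90)  # noisy scores correlated with the labels

# 3-fold split, per-fold AUC, then mean and SD, mirroring the reporting style above
folds = np.array_split(rng.permutation(90), 3)
aucs = [roc_auc(y[idx], scores[idx]) for idx in folds]
print(f"mean AUC = {np.mean(aucs):.2f} +/- {np.std(aucs):.2f}")
```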