2010 IEEE Aerospace Conference
DOI: 10.1109/aero.2010.5446692

Comparison of data reduction techniques based on the performance of SVM-type classifiers

Cited by 9 publications (7 citation statements). References 10 publications.
“…Here, a series of spectra are simultaneously compared through the measure of the covariance (see, for example, Fig. 3). This multivariate approach also reveals the correlations between different spectral features and is consequently routinely used for data reduction [36]. Indeed, PCA provides new data, which are linear combinations of the initial data.…”
Section: Identification Of Samples
confidence: 99%
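As a point of reference, the sketch below illustrates PCA-based data reduction along these lines: the reduced features are exactly the linear combinations of the original channels described in the statement above. The array name spectra, its shape, and the random stand-in data are assumptions for illustration only, not details from the cited work.

    # Minimal sketch of PCA-based data reduction. `spectra` is a hypothetical
    # (n_samples, n_channels) array of measured spectra (illustrative data).
    import numpy as np

    def pca_reduce(spectra, n_components):
        """Project spectra onto the leading principal components.

        The reduced features are linear combinations of the original
        channels, weighted by the leading principal directions of the
        covariance of the data.
        """
        centered = spectra - spectra.mean(axis=0)
        # SVD of the centered data gives the principal directions in Vt.
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        components = vt[:n_components]        # (n_components, n_channels)
        return centered @ components.T        # (n_samples, n_components)

    # Example with random stand-in data.
    rng = np.random.default_rng(0)
    spectra = rng.normal(size=(100, 512))
    reduced = pca_reduce(spectra, n_components=10)
    print(reduced.shape)  # (100, 10)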
“…They use a filtering algorithm to reduce the transmitted data and thereby increase the lifetime of the network [2]. Ramona Georgescu, Christian R. Berger, et al. proposed four data reduction techniques: Principal Component Analysis (PCA), Partial Least Squares (PLS), Structurally Random Matrices (SRM), and Orthogonal Matching Pursuit (OMP) [6]. Extensive research has been performed on different image and text compression techniques.…”
Section: Introduction
confidence: 99%
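For illustration, a minimal numpy sketch of Orthogonal Matching Pursuit, the last of the four techniques listed above, is given below. The dictionary Phi, the sparsity level k, and the toy signal are illustrative assumptions rather than details from the cited paper.

    # Hedged sketch of Orthogonal Matching Pursuit (OMP) for sparse recovery.
    # All variable names and the toy dictionary are illustrative assumptions.
    import numpy as np

    def omp(Phi, y, k):
        """Recover a k-sparse coefficient vector x such that y ~= Phi @ x."""
        n, m = Phi.shape
        residual, support = y.copy(), []
        for _ in range(k):
            # Pick the column most correlated with the current residual.
            idx = int(np.argmax(np.abs(Phi.T @ residual)))
            support.append(idx)
            # Least-squares fit on the selected columns, then update the residual.
            coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
            residual = y - Phi[:, support] @ coef
        x = np.zeros(m)
        x[support] = coef
        return x

    # Toy example: 5-sparse signal measured with a random Gaussian matrix.
    rng = np.random.default_rng(3)
    Phi = rng.normal(size=(64, 256)) / np.sqrt(64)
    x_true = np.zeros(256)
    x_true[rng.choice(256, 5, replace=False)] = 1.0
    x_hat = omp(Phi, Phi @ x_true, k=5)
    print(np.linalg.norm(x_hat - x_true))  # reconstruction error (small on success)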
“…In PSVM, the separating planes are not bounding planes, but can be thought of as proximal planes, around which the instances of each class are clustered, and which are pushed as far apart as possible [16]. This formulation can also be interpreted as regularized least squares.…”
Section: Proximal Support Vector Machine
confidence: 99%
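The regularized least-squares reading of PSVM admits a closed-form solution; the sketch below illustrates it for the standard linear PSVM formulation. The variable names (A, d, nu) and the toy data are chosen for illustration, not taken from the cited paper.

    # Minimal sketch of linear PSVM as regularized least squares.
    import numpy as np

    def psvm_fit(A, d, nu=1.0):
        """Solve min 0.5*(||w||^2 + g^2) + 0.5*nu*||D(A w - e g) - e||^2.

        A : (n, m) data matrix, d : (n,) labels in {-1, +1}, nu : trade-off.
        Returns (w, gamma) defining the decision function sign(x.w - gamma).
        """
        n, m = A.shape
        E = np.hstack([A, -np.ones((n, 1))])   # augmented data [A, -e]
        rhs = E.T @ d                          # E' D e, since D e = d
        # Closed-form solution of the regularized least-squares problem.
        u = np.linalg.solve(np.eye(m + 1) / nu + E.T @ E, rhs)
        return u[:-1], u[-1]

    # Toy example: two Gaussian blobs as stand-in classes.
    rng = np.random.default_rng(1)
    A = np.vstack([rng.normal(2, 1, (50, 2)), rng.normal(-2, 1, (50, 2))])
    d = np.hstack([np.ones(50), -np.ones(50)])
    w, gamma = psvm_fit(A, d)
    pred = np.sign(A @ w - gamma)
    print((pred == d).mean())  # training accuracy on the toy blobs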
“…Recently, structurally random matrices (SRMs) have been proposed as a fast and highly efficient compressed sensing method that guarantees optimal performance [16].…”
Section: Structurally Random Matrices
confidence: 99%
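The SRM construction typically combines a pre-randomization step, a fast orthonormal transform, and random subsampling. The sketch below illustrates that structure using an orthonormal DCT as a stand-in transform; the function name, the choice of transform, and the parameter values are assumptions for illustration only.

    # Minimal sketch of an SRM-style compressed sensing measurement operator:
    # random sign flips (R), a fast orthonormal transform (F, here a DCT as a
    # stand-in), and random subsampling of the coefficients (D).
    import numpy as np
    from scipy.fft import dct

    def srm_measure(x, m, rng):
        """Compress a length-n signal x to m measurements via an SRM-like operator."""
        n = x.size
        signs = rng.choice([-1.0, 1.0], size=n)      # pre-randomization (R)
        transformed = dct(signs * x, norm="ortho")   # fast orthonormal transform (F)
        keep = rng.choice(n, size=m, replace=False)  # random subsampling (D)
        return transformed[keep] * np.sqrt(n / m)    # rescale kept coefficients

    # Toy example: compress a sparse stand-in signal.
    rng = np.random.default_rng(2)
    x = np.zeros(1024)
    x[rng.choice(1024, size=10, replace=False)] = rng.normal(size=10)
    y = srm_measure(x, m=128, rng=rng)
    print(y.shape)  # (128,) compressed measurements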