2018 25th IEEE International Conference on Image Processing (ICIP)
DOI: 10.1109/icip.2018.8451538
Sparse Representation Wavelet Based Classification

Cited by 8 publications (10 citation statements). References 16 publications.
“…However, these methods are not resistant to degradation of the training samples by high-amplitude noise or occlusions. A recent approach, namely Sparse Representation Wavelet based Classification (SRWC) [20], was developed in the wavelet domain, taking advantage of the wavelet-promoted sparsity and leading to better discrimination. To improve image classification accuracy, SRWC exploits the sparse representation of features described by the complementary information from the low-frequency sub-band of the wavelet coefficients.…”
Section: Related Work
confidence: 99%
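The low-frequency sub-band referred to in this statement is the approximation band of a 2-D discrete wavelet transform. The following minimal sketch shows how such features might be extracted, assuming the PyWavelets library; the helper name wavelet_features, the db4 wavelet, and the decomposition depth are illustrative assumptions, not the SRWC authors' settings.

```python
import numpy as np
import pywt  # PyWavelets, assumed available


def wavelet_features(image, wavelet="db4", levels=2):
    """Return the flattened low-frequency (approximation) sub-band of an image.

    Hypothetical helper for illustration: repeated 2-D DWTs keep only the
    approximation coefficients, which carry most of the image energy and act
    here as the wavelet-domain features used for sparse representation.
    """
    coeffs = np.asarray(image, dtype=float)
    for _ in range(levels):
        coeffs, _details = pywt.dwt2(coeffs, wavelet)  # keep cA, drop (cH, cV, cD)
    return coeffs.ravel()
```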
“…As a consequence, the SRWC method exhibited higher accuracy than the contemporary SR-based methods in [9], [11], [12]. In [20], an overcomplete dictionary is constructed by mapping the training samples into the wavelet domain. To determine the membership of the query test sample, its sparse vector is first computed through an l1-norm minimization problem.…”
Section: Related Work
confidence: 99%
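For context, the l1-norm step described here follows the standard sparse-representation-based classification decision rule: code the query against a dictionary whose columns are (wavelet-domain) training samples, then assign the class whose atoms best reconstruct it. A minimal sketch under those assumptions, using scikit-learn's Lasso as a generic l1 solver; the exact solver and penalty used in the paper are not specified here, so this is an illustration rather than the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import Lasso  # generic l1-regularised solver


def src_classify(D, labels, y, alpha=0.01):
    """Assign query y to the class whose training atoms best reconstruct it.

    D      : (feature_dim, n_train) dictionary; columns are wavelet-domain
             training samples (e.g. low-frequency sub-band features).
    labels : (n_train,) class label of each column of D.
    y      : (feature_dim,) wavelet-domain query sample.
    alpha  : l1 penalty weight; stands in for the paper's exact l1 formulation.
    """
    solver = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
    solver.fit(D, y)            # sparse code x minimising ||y - Dx||^2 + alpha*||x||_1
    x = solver.coef_

    residuals = {}
    for c in np.unique(labels):
        xc = np.where(labels == c, x, 0.0)       # keep only class-c coefficients
        residuals[c] = np.linalg.norm(y - D @ xc)
    return min(residuals, key=residuals.get)     # class with smallest residual
```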
“…Thus, NNs are useful for modeling complex relationships between inputs and outputs, or for finding patterns in data with an indeterminate number of parameters or with parameters whose features cannot easily be modeled using traditional regression and statistical methods. In this respect, NNs are analogous to the sparse representation classification methodology [7], which learns and uses large dictionaries.…”
Section: Introduction
confidence: 99%