2018
DOI: 10.1016/j.sigpro.2018.02.010

A review of sparsity-based clustering methods

Cited by 34 publications (18 citation statements)
References 82 publications
“…For the moment, to the best of available knowledge, only a small number of studies approach this problem from a sparse and redundant representations perspective; related examples are essentially limited to [284,285]. By nature, the sparsity property holds strongly in most real-world examples.…”
Section: 4 (mentioning)
confidence: 99%
“…Sparse representations have received increasing attention in many signal and image processing applications, including denoising [21][22][23], classification [24][25][26], and pattern recognition [27][28][29]. The use of sparse representations for AD is more original and has been considered in fewer applications, such as hyperspectral imaging [30], the detection of abnormal motions in videos [31], irregular heartbeat detection in electrocardiograms (ECG), and specular reflectance and shadow removal in natural images [15].…”
Section: Sparse Representations and Dictionary Learning (mentioning)
confidence: 99%
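The sparse-representation model referred to in the excerpt above approximates each signal by a combination of only a few atoms from a learned dictionary. The following is a minimal, illustrative sketch of dictionary learning with OMP-based sparse coding using scikit-learn; the synthetic data, dictionary size, and sparsity level are assumptions chosen for demonstration, not values taken from the cited work or from the reviewed paper.

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

# Toy data: 200 signals of dimension 20, each a sparse combination
# of a few random atoms plus noise (synthetic, for illustration only).
rng = np.random.default_rng(0)
true_atoms = rng.standard_normal((15, 20))
codes = np.zeros((200, 15))
for i in range(200):
    idx = rng.choice(15, size=3, replace=False)  # 3 active atoms per signal
    codes[i, idx] = rng.standard_normal(3)
X = codes @ true_atoms + 0.01 * rng.standard_normal((200, 20))

# Learn a dictionary and sparse codes: each signal is represented by at most
# `transform_n_nonzero_coefs` atoms (OMP-based sparse coding).
dl = DictionaryLearning(n_components=15,
                        transform_algorithm="omp",
                        transform_n_nonzero_coefs=3,
                        random_state=0)
sparse_codes = dl.fit_transform(X)
reconstruction = sparse_codes @ dl.components_
print("mean nonzeros per code:", np.count_nonzero(sparse_codes, axis=1).mean())
print("relative reconstruction error:",
      np.linalg.norm(X - reconstruction) / np.linalg.norm(X))
```

The same code/dictionary decomposition underlies the denoising, classification, and anomaly-detection applications listed in the excerpt; what changes is how the sparse codes or the residual are used downstream.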
“…Several algorithms are used for clustering analysis, and they can be roughly divided into four categories [38]: (1) those based on the cluster formation methodology, such as top-down, bottom-up, and analytical optimization techniques [39]; (2) those dependent on the cluster model obtained, such as stratification, centroids (e.g., K-means), distribution subspaces, and graph-based models; (3) those obtained via a membership function, which may be further subdivided into hard or soft clustering [40]; and (4) those that use groups to define the distinction between overlapping clusters and are less sensitive to noise because the noise becomes equally distributed among the groups [41].…”
Section: Seasonal Clustering Approach (mentioning)
confidence: 99%
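As a concrete illustration of the centroid-based family mentioned in category (2) of the excerpt above, here is a minimal K-means sketch using scikit-learn; the synthetic blobs and the choice of three clusters are placeholders for demonstration, not data or settings from the citing study.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic 2-D data drawn around three centers (illustrative only).
rng = np.random.default_rng(1)
centers = np.array([[0.0, 0.0], [5.0, 5.0], [0.0, 6.0]])
X = np.vstack([c + rng.standard_normal((100, 2)) for c in centers])

# Centroid-based (hard) clustering: each point is assigned to the nearest
# of k centroids, and the centroids are re-estimated until convergence.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("centroids:\n", km.cluster_centers_)
print("inertia (within-cluster sum of squares):", km.inertia_)
```

Hard assignment here contrasts with the soft (membership-function) clustering of category (3), where each point receives a degree of membership in every cluster.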