2018 IEEE International Conference on Big Data and Smart Computing (BigComp)
DOI: 10.1109/bigcomp.2018.00043
Local Standard Deviation Spectral Clustering

Cited by 12 publications (5 citation statements)
References 19 publications
“…When reading duplicate files, since the index information has already been added to the hash mapping object in memory at the first reading, it is more efficient to query the index information directly from the hash map in memory at the second reading. In this section, three spectral clustering algorithms are analyzed experimentally: traditional spectral clustering, Xie et al.'s [34] improved spectral clustering algorithm, and Zuo's improved spectral clustering algorithm. The SEEDS dataset from UCI is selected as the experimental data.…”
Section: Read Speed Test (mentioning, confidence: 99%)
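The in-memory index lookup described in this excerpt amounts to a memoized read path: the first read populates a hash map, and later reads of the same file query the map directly. The sketch below is an illustration only; the names load_index_from_storage and read_index are hypothetical placeholders, not APIs from the cited work.

```python
# Minimal sketch (not from the cited paper): cache file index information in an
# in-memory dict so a second read of the same file skips the slow lookup path.

index_cache = {}  # hash map: file path -> index information


def load_index_from_storage(path):
    # Hypothetical stand-in for the expensive first-read lookup,
    # e.g. scanning on-disk metadata or querying an index service.
    return {"path": path, "offset": 0}


def read_index(path):
    """Return index info for `path`, using the in-memory hash map when possible."""
    if path in index_cache:               # second and later reads: direct hash-map query
        return index_cache[path]
    info = load_index_from_storage(path)  # first read: slow path
    index_cache[path] = info              # remember it for duplicate reads
    return info
```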
“…Moreover, the parameter λ is not only a correction parameter but can also express the local details of the visible image. In order to improve the image contrast, we adopt the local standard deviation (LSD) [38] as the adaptive regularization parameter, which can be defined as follows:…”
Section: Fusion of Base Components (mentioning, confidence: 99%)
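For reference, a per-pixel local standard deviation (LSD) map of the kind this excerpt refers to can be computed from windowed first and second moments. The following is a minimal sketch assuming a square window and SciPy's uniform_filter; the exact definition used in [38] and in the cited paper may differ.

```python
import numpy as np
from scipy.ndimage import uniform_filter


def local_std(image, size=3):
    """Per-pixel standard deviation of `image` over a size x size neighborhood."""
    img = image.astype(np.float64)
    mean = uniform_filter(img, size)               # local mean E[x]
    mean_sq = uniform_filter(img * img, size)      # local second moment E[x^2]
    var = np.maximum(mean_sq - mean * mean, 0.0)   # clamp tiny negatives from rounding
    return np.sqrt(var)
```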
“…To test the superiority and feasibility of the algorithms, we chose the comparison methods from various aspects, such as methods focused on image transformation, methods based on sparse representation or filtering, deep learning methods, and so on. The comparison methods in this paper mainly include curvelet transform (CVT) [40], complex wavelet transform (CWT) [41], guided filtering fusion (GFF) [38], gradient transform (GTF) [42], hybrid multi-scale decomposition fusion (HMSD) [43], Laplacian pyramid with sparse representation (LP_SR) [44], ratio pyramid (RP) [45], fusion based on median filtering (TSF) [46], the weighted least squares optimization-based method (WLS) [47], anisotropic diffusion fusion (ADF) [48], U2fusion [26], and Densefuse [25]. The implementations of these compared methods are publicly available, and their parameters strictly follow the original papers.…”
Section: Experimental Setting (mentioning, confidence: 99%)
“…Besides, ACE adjusts the contrast gain of the image to a suitable value. The general equation [12] is defined as …”
Section: Segmentation Using Local Standard Deviation (mentioning, confidence: 99%)
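The general equation is truncated in the excerpt and is not reproduced here. As an illustration only, a common textbook form of adaptive contrast enhancement (ACE) amplifies the local detail (pixel minus local mean) by a gain that varies inversely with the local standard deviation; the window size and gain parameters below are assumptions, not values from [12].

```python
import numpy as np
from scipy.ndimage import uniform_filter


def ace_enhance(image, size=7, alpha=0.8, max_gain=5.0):
    """Illustrative ACE: boost local detail x - m by a gain inversely tied to the local std."""
    img = image.astype(np.float64)
    local_mean = uniform_filter(img, size)
    local_var = uniform_filter(img * img, size) - local_mean ** 2
    local_std = np.sqrt(np.maximum(local_var, 1e-6))        # avoid division by zero
    gain = np.minimum(alpha * img.mean() / local_std, max_gain)  # clipped contrast gain
    return local_mean + gain * (img - local_mean)
```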