2017
DOI: 10.1016/j.knosys.2016.11.020

A dimension reduction algorithm preserving both global and local clustering structure

Abstract: By combining linear discriminant analysis and k-means into a coherent framework, a dimension reduction algorithm was recently proposed to select the most discriminative subspace. This algorithm utilized the clustering method to generate cluster labels and then employed discriminant analysis for subspace selection. However, we found that this algorithm considers only the information of the global structure and does not take into account the information of the local structure. In order to overcome the shortcoming…
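The two-stage idea the abstract describes (the earlier algorithm this paper builds on: cluster first, then select a discriminative subspace from the cluster labels) can be sketched in plain NumPy. The data, the number of clusters, and the target dimension below are illustrative assumptions, not values from the paper, and the farthest-point initialization is a simplification chosen to keep the sketch deterministic:

```python
import numpy as np

def kmeans2(X, iters=50):
    # Lloyd's algorithm for two clusters; farthest-point
    # initialization keeps this sketch deterministic.
    c0 = X[0]
    c1 = X[np.argmax(((X - c0) ** 2).sum(1))]
    centers = np.stack([c0, c1])
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(2):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(0)
    return labels

def lda_subspace(X, labels, dim):
    # Fisher discriminant directions computed from the k-means labels:
    # maximize between-cluster scatter against within-cluster scatter.
    mean = X.mean(0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for j in np.unique(labels):
        Xi = X[labels == j]
        mi = Xi.mean(0)
        Sw += (Xi - mi).T @ (Xi - mi)
        Sb += len(Xi) * np.outer(mi - mean, mi - mean)
    # Generalized eigenproblem Sb w = lambda Sw w; a small ridge
    # keeps Sw invertible.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(-evals.real)
    return evecs.real[:, order[:dim]]

# Two well-separated synthetic blobs in 5-D, projected to 1-D.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 5)), rng.normal(6, 1, (50, 5))])
labels = kmeans2(X)
W = lda_subspace(X, labels, 1)
Z = X @ W  # reduced representation, shape (100, 1)
```

The projection found this way separates the two synthetic groups, which is exactly the "global clustering structure" the abstract says the earlier method preserves; nearby-point (local) relationships are not considered, which is the shortcoming this paper addresses.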

Cited by 24 publications (5 citation statements)
References 35 publications
“…This issue is inadequately addressed by older methods due to limitations of their dimensionality reduction algorithms. More recent techniques still experience important limitations in this regard, such as the need for a 'balancing' parameter that may crucially impact the structures preserved (e.g., UMAP, GLSPP [72]), or the imposition of specific metrics that limit their application to other fields (e.g., PHATE, DGL [73]).…”
Section: Discussion (mentioning; confidence: 99%)
“…However, this method of reducing data dimensionality is simple to calculate and guarantees the generation of accurate representations of high-dimensional datasets in lower dimensions. The following is the formula for Principal Component Analysis (PCA) [69].…”
Section: Principal Component Analysis (mentioning; confidence: 99%)
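The excerpt above refers to the PCA formula without reproducing it. A minimal NumPy sketch of the standard computation (center the data, then project onto the top eigenvectors of the sample covariance matrix) is given below; the data shape and target dimension are illustrative assumptions:

```python
import numpy as np

def pca(X, dim):
    # Center the data, then take the top-`dim` eigenvectors of the
    # sample covariance matrix as the projection directions.
    Xc = X - X.mean(0)
    cov = Xc.T @ Xc / (len(X) - 1)
    evals, evecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    W = evecs[:, ::-1][:, :dim]          # top-`dim` principal components
    return Xc @ W, W

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
Z, W = pca(X, 2)  # Z: (200, 2) low-dimensional scores
```

Because `numpy.linalg.eigh` returns orthonormal eigenvectors, the resulting projection matrix `W` has orthonormal columns, matching the simplicity and accuracy guarantees the excerpt attributes to PCA.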
“…Dimensionality reduction, which projects original features into a lower-dimensional space, has been a prevalent technique for dealing with high-dimensional datasets, because it is able to remove redundant features, reduce memory usage, avoid the curse of dimensionality and improve the efficiency of machine learning algorithms. As a preprocessing step, dimensionality reduction has been applied to a variety of problems including k-means clustering [1,2,3], support vector machines classification [4,5,6,7], k-nearest neighbors classification [8], least squares regression, and low rank approximation [9]. However, how to design an efficient and effective dimensionality reduction algorithm remains a challenging problem.…”
Section: Introduction (mentioning; confidence: 99%)
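The preprocessing pattern this excerpt describes (reduce dimensionality first, then run the downstream learner on the compact representation) can be illustrated with a toy pipeline. Everything below is a hypothetical illustration: a Gaussian random projection stands in for whichever reduction method is used, and the nearest-centroid step uses the known group means purely to check that separation survives the projection:

```python
import numpy as np

rng = np.random.default_rng(42)
# 300 samples in 1000-D, forming two well-separated groups.
X = np.vstack([rng.normal(0, 1, (150, 1000)),
               rng.normal(4, 1, (150, 1000))])

# Reduction step: a Gaussian random projection to 20-D, scaled so
# that squared norms are preserved in expectation.
P = rng.normal(size=(1000, 20)) / np.sqrt(20)
Z = X @ P

# Downstream step now operates on 20-D data instead of 1000-D.
# Nearest-centroid assignment against the two group means confirms
# the cluster structure is still recoverable after projection.
c0, c1 = Z[:150].mean(0), Z[150:].mean(0)
assign = (np.linalg.norm(Z - c1, axis=1)
          < np.linalg.norm(Z - c0, axis=1)).astype(int)
accuracy = (assign == np.repeat([0, 1], 150)).mean()
```

The 50x reduction in feature count directly delivers the memory and efficiency benefits the excerpt lists, while the two groups remain separable in the projected space.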