2019
DOI: 10.1186/s12859-019-3229-z

PCA via joint graph Laplacian and sparse constraint: Identification of differentially expressed genes and sample clustering on gene expression data

Abstract: Background: In recent years, identification of differentially expressed genes and sample clustering have become hot topics in bioinformatics. Principal Component Analysis (PCA) is a widely used method for gene expression data. However, it has two limitations: first, the geometric structure hidden in the data, e.g., the pair-wise distance between data points, has not been explored; this information can facilitate sample clustering. Second, the Principal Components (PCs) determined by PCA are dense, leading to hard interp…
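The abstract motivates augmenting PCA with a graph Laplacian term (to retain the pair-wise geometric structure that aids sample clustering) and a sparse constraint (to make the PCs easier to interpret). As a rough sketch of this kind of joint objective, and not necessarily the exact formulation used in the paper, one commonly writes

\[
\min_{U,\,Q}\; \|X - U Q^{\mathsf{T}}\|_F^2 \;+\; \alpha\,\operatorname{tr}\!\left(Q^{\mathsf{T}} L Q\right) \;+\; \beta\,\|U\|_{2,1}
\qquad \text{s.t.}\;\; Q^{\mathsf{T}} Q = I,
\]

where X is the gene expression matrix, U collects the principal components (loadings), Q is the low-dimensional embedding of the samples, L is the graph Laplacian built from pair-wise sample distances, and α, β balance the three terms; the L2,1 penalty pushes entire rows of U to zero, so only a small set of genes contributes to each PC. The symbols U, Q, α, and β are illustrative notation, not taken from the paper.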

Cited by 9 publications (10 citation statements)
References 33 publications (32 reference statements)
“…Performance Comparison with Other Related Methods. In this section, we will compare the performance of RLSDSPCA with other related methods (PCA [6], gLPCA [16], gLSPCA [17], RgLPCA [16], and SDSPCA [12]) in characteristic gene selection and tumor classification to further verify the effectiveness of the proposed method.…”
Section: Robust (mentioning)
confidence: 99%
“…At the same time, they have also developed a robust version of gLPCA, called robust gLPCA (RgLPCA), which applies the L2,1 norm instead of the Frobenius norm to the reconstruction term to reduce the influence of outliers and noise [16]. Feng et al. added L2,1 sparse constraints on the basis of gLPCA and proposed gLSPCA [17], which can obtain low-dimensional data structural information and improve the interpretability of principal components simultaneously.…”
Section: Introduction (mentioning)
confidence: 99%
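The excerpt above distinguishes RgLPCA, which robustifies the reconstruction term, from gLSPCA, which sparsifies the projection. As a schematic of the robust variant based only on that description (not a verbatim reproduction of the cited formulation), the squared Frobenius reconstruction error of gLPCA is replaced by an L2,1 error:

\[
\min_{U,\,Q}\; \|X - U Q^{\mathsf{T}}\|_{2,1} \;+\; \alpha\,\operatorname{tr}\!\left(Q^{\mathsf{T}} L Q\right)
\qquad \text{s.t.}\;\; Q^{\mathsf{T}} Q = I.
\]

Because the per-sample residuals enter through unsquared 2-norms, a few heavily corrupted samples contribute far less to the objective than they would under the squared Frobenius norm, which is the stated motivation for reducing the influence of outliers and noise.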
“…Secondly, because it is unclear which feature is the best, many features could be extracted for the target and the drug at the same time [16–18], and then some dimensionality reduction methods have been proposed for DTI [19–23]. Ezzat et al. proposed a framework for DTI prediction by leveraging both feature dimensionality reduction and ensemble learning [19].…”
Section: Introduction (mentioning)
confidence: 99%
“…Mahmud et al. predicted DTI based on protein features with under-sampling and feature selection techniques with boosting [21]. Feng et al. proposed a supervised discriminative sparse principal component analysis [22] and a graph Laplacian sparse principal component analysis for dimensionality reduction [23]…”
Section: Introduction (mentioning)
confidence: 99%
“…Most studies (e.g., SENSE (Pruessmann et al. 1999), GRAPPA (Griswold et al. 2002), SPIRiT (Lustig and Pauly 2010)) take advantage of spatial sensitivity and gradient coding to reduce the amount of data required for reconstruction, thereby shortening the imaging time. Moreover, compressed sensing (CS) is an important technique for fast MR image reconstruction (Feng et al. 2019), which recovers the desired signal from k-space data sampled below the Nyquist rate. Typical CS-based approaches adopt a sparsity prior (Liu et al. 2019), low-rank sparse sampling (He et al. 2016; Haldar and Zhuo 2016), or manifold learning (Nakarmi et al. 2017) for reconstruction.…”
Section: Introduction (mentioning)
confidence: 99%
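The excerpt describes compressed sensing for MR reconstruction in general terms: the image is recovered from k-space data sampled below the Nyquist rate by exploiting a sparsity prior. As a generic illustration of that idea (not the specific model of any of the cited papers), the reconstruction is typically posed as

\[
\min_{x}\; \tfrac{1}{2}\,\|\mathcal{A}x - y\|_2^2 \;+\; \lambda\,\|\Psi x\|_1,
\]

where x is the image to recover, y the acquired (undersampled) k-space data, \(\mathcal{A}\) the undersampled Fourier encoding operator (sampling mask combined with the Fourier transform and, in parallel imaging, coil sensitivities), \(\Psi\) a sparsifying transform such as a wavelet frame, and \(\lambda\) a regularization weight enforcing the sparsity prior mentioned in the excerpt.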