2017
DOI: 10.1038/srep43946
Enlightening discriminative network functional modules behind Principal Component Analysis separation in differential-omic science studies

Abstract: Omic science is rapidly growing and one of the most employed techniques to explore differential patterns in omic datasets is principal component analysis (PCA). However, a method to enlighten the network of omic features that mostly contribute to the sample separation obtained by PCA is missing. An alternative is to build correlation networks between univariately-selected significant omic features, but this neglects the multivariate unsupervised feature compression responsible for the PCA sample segregation. […]

Cited by 50 publications (72 citation statements) | References 106 publications (147 reference statements)
“…Multivariate analysis, which considers together the features collected for describing the brain pathophysiological state of each mouse, was done using unsupervised and parameter‐free (the algorithms do not require the tuning of any internal parameters, and hence, they prevent bias and overfitting) machine learning algorithms for linear and nonlinear dimension reduction. We applied principal component analysis (PCA), which is a state‐of‐the‐art approach for linear dimension reduction (Ringner; Ciucci et al) and minimum curvilinear embedding (MCE), which is a nonlinear and parameter‐free kernel PCA whose efficacy was extensively tested in previous studies (Cannistraci et al; Ammirati et al; Alanis‐Lobato et al; Alessio & Cannistraci; Sales et al). We adopted both PCA and MCE to confirm that the displayed results were substantiated regardless of the type of linear or nonlinear transformation employed for their analysis.…”
Section: Methods
confidence: 99%
“…Furthermore, we investigated the effect of PPI on the microbiota of gastric fluid and gastric mucosa in dyspeptic patients, and the changes induced by H. pylori infection on the gastric mucosal microbiota, by means of the PC-corr approach 62 . PC-corr is a simple algorithm that associates a discriminative network of feature interactions with any PCA segregation 62 .…”
Section: Methods
confidence: 99%
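The PC-corr idea quoted above can be sketched in a few lines. This is an illustrative simplification, not the authors' implementation: edge weights combine the loadings of a discriminative principal component with pairwise Pearson correlations as w_ij = sign(c_ij) · min(|v_i|, |v_j|, |c_ij|); the function name `pc_corr_sketch` and the normalisation choice are assumptions of this sketch.

```python
import numpy as np

def pc_corr_sketch(X, pc=0):
    """Simplified PC-corr-style edge weights (illustrative only, not the
    published algorithm).

    X: samples x features matrix. Returns a feature x feature weight matrix
    combining the loadings v of the chosen PC with the feature-feature
    Pearson correlations C: w_ij = sign(c_ij) * min(|v_i|, |v_j|, |c_ij|).
    """
    Xc = X - X.mean(axis=0)                        # centre each feature
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    v = Vt[pc]                                     # loadings of the chosen PC
    v = v / np.max(np.abs(v))                      # normalise loadings to [-1, 1]
    C = np.corrcoef(Xc, rowvar=False)              # pairwise Pearson correlations
    absv = np.abs(v)
    W = np.sign(C) * np.minimum(np.minimum.outer(absv, absv), np.abs(C))
    np.fill_diagonal(W, 0.0)                       # no self-loops
    return W
```

Thresholding |W| then yields the discriminative network: an edge survives only if both features load strongly on the segregating component and are strongly correlated with each other.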
“…We adopted PCA, which is a machine learning method for unsupervised linear and parameter-free dimension reduction. We performed unsupervised analysis instead of a supervised one, because it is less prone to overfitting as shown in previous studies [12], [17], [18]. 10,000 resampled datasets were generated from the original training set (P1-Partition 1), each of which was obtained by randomly selecting 200 mRBCs and 200 RETs.…”
Section: Unsupervised Dimension Reduction Machine Learning Procedures
confidence: 99%
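The resampling scheme quoted above (class-balanced resamples of the training partition, each embedded by PCA without using the labels) can be sketched as follows; the function names and sizes here are hypothetical placeholders, not the study's code.

```python
import numpy as np

def pca_project(X, n_components=2):
    """Unsupervised, parameter-free linear dimension reduction via SVD-based PCA."""
    Xc = X - X.mean(axis=0)                        # centre each feature
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T                # scores on the leading PCs

def balanced_resample(X_a, X_b, n_per_class, rng):
    """Draw one class-balanced resample (with replacement) from two groups.

    The class labels are used only to balance the draw; the PCA embedding
    itself never sees them, which is what keeps the procedure unsupervised.
    """
    ia = rng.choice(len(X_a), size=n_per_class, replace=True)
    ib = rng.choice(len(X_b), size=n_per_class, replace=True)
    return np.vstack([X_a[ia], X_b[ib]])
```

Repeating `balanced_resample` followed by `pca_project` many times (10,000 in the quoted study) lets one assess how stable the unsupervised separation is under sampling variability.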
“…In fact, the area is by definition a function of x-size and y-size, and PC-corr successfully infers this from the data independently of the individual discriminative power of each feature. This result is possible because PC-corr is a multivariate approach and offers results different from univariate analysis approaches (which test single features), as extensively discussed in the article by Ciucci et al [12].…”
Section: Validation of the Designed Combinatorial Markers
confidence: 99%