2019 IEEE International Conference on Bioinformatics and Biomedicine (BIBM) 2019
DOI: 10.1109/bibm47256.2019.8983063
Supervised prediction of aging-related genes from a context-specific protein interaction subnetwork

Abstract: Human aging is linked to many prevalent diseases. The aging process is highly influenced by genetic factors. Hence, it is important to identify human aging-related genes. We focus on supervised prediction of such genes. Gene expression-based methods for this purpose study genes in isolation from each other. While protein-protein interaction (PPI) network-based methods for this purpose account for interactions between genes' protein products, current PPI network data are context-unspecific, spanning different bi…

Cited by 6 publications (14 citation statements)
References 36 publications (67 reference statements)
“…Some of our considered features have high dimensions, such as DGDV, GoT, and cSGDV. So, as in our previous work [17,18], for each feature, we consider: (i) the full feature (i.e., no dimensionality reduction), (ii) linear dimensionality reduction via principal component analysis (PCA) that considers as few principal components as needed to account for at least 90% of variation in the data corresponding to the given feature, and (iii)-(viii) nonlinear dimensionality reduction via t-distributed stochastic neighbor embedding (tSNE) under six different perplexity parameters (5, 13, 21, 30, 40, 50). This totals to 1 + 1 + 6 = 8 considered dimensionality reduction choices.…”
Section: Eight Considered Feature Dimensionality Reduction Choices (supporting)
confidence: 65%
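The excerpt above describes eight dimensionality-reduction choices per feature: the full feature, PCA retaining enough components for at least 90% of the variance, and t-SNE under six perplexities. A minimal scikit-learn sketch of that scheme follows; the data matrix `X` is a random placeholder, and all parameters other than the variance target and the six perplexities are illustrative assumptions, not the cited paper's exact settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))  # placeholder high-dimensional feature matrix

# (ii) Linear reduction: keep as few principal components as needed
# to account for >= 90% of the variance (sklearn accepts a float target).
pca = PCA(n_components=0.90)
X_pca = pca.fit_transform(X)

# (iii)-(viii) Nonlinear reduction: t-SNE under six perplexity settings.
tsne_variants = {
    p: TSNE(n_components=2, perplexity=p, random_state=0).fit_transform(X)
    for p in (5, 13, 21, 30, 40, 50)
}

# (i) full feature + (ii) PCA + six t-SNE variants = 8 choices in total.
n_choices = 1 + 1 + len(tsne_variants)
```

Note that `PCA(n_components=0.90)` selects the component count automatically, so different features can end up with different reduced dimensions under the same 90% criterion.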
“…• CentraMV, the centrality mean and variation [17,18] of a node v, measures, for each of the four considered centrality-based features (GDC, ECC, KC, and DegC), two quantities: the mean and variation over v's 37 centrality values. These two quantities for each of the four centrality-based features combined form an 8-dimensional CentraMV node feature.…”
Section: Seven Considered Dynamic Features (mentioning)
confidence: 99%
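The CentraMV construction above can be sketched as follows: for each of the four centrality-based features, take the mean and variation of a node's 37 per-snapshot centrality values and concatenate them into an 8-dimensional vector. This is a hypothetical sketch, not the cited implementation; in particular, it assumes "variation" means variance, and the random per-snapshot values stand in for real centrality series.

```python
import numpy as np

CENTRALITY_FEATURES = ("GDC", "ECC", "KC", "DegC")

def centramv(centrality_series):
    """Given, for one node, a dict mapping each centrality-based feature
    name to its sequence of 37 per-snapshot centrality values, return the
    mean and variation (here: variance) of each sequence, concatenated
    into a single 8-dimensional CentraMV vector."""
    parts = []
    for name in CENTRALITY_FEATURES:
        vals = np.asarray(centrality_series[name], dtype=float)
        parts.extend([vals.mean(), vals.var()])
    return np.array(parts)  # 2 quantities x 4 features = 8 dimensions

# Illustrative input: random stand-ins for a node's 37 centrality values.
rng = np.random.default_rng(1)
series = {name: rng.random(37) for name in CENTRALITY_FEATURES}
feat = centramv(series)
```

Summarizing each 37-value series by just two moments keeps the node feature compact regardless of how many network snapshots are observed.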