2022
DOI: 10.3390/molecules27144371
TLNPMD: Prediction of miRNA-Disease Associations Based on miRNA-Drug-Disease Three-Layer Heterogeneous Network

Abstract: Many microRNAs (miRNAs) have been confirmed to be associated with the generation of human diseases. Capturing miRNA–disease associations (M-DAs) provides an effective way to understand the etiology of diseases. Many models for predicting M-DAs have been constructed; nevertheless, several limitations remain, such as generally considering only direct information between miRNAs and diseases while ignoring potential knowledge hidden in isolated miRNAs or diseases. To overcome these limitations, in this stud…

Cited by 2 publications (1 citation statement)
References 34 publications (36 reference statements)

“…After obtaining all the similarity and association matrices, researchers implemented network-based algorithms such as Random Walk with Restart [16] or the KATZ method [19] to obtain predicted disease-related miRNAs/lncRNAs [20]. There are other recent deep-learning-based methods, such as TLNPMD [21] and MINIMIDA [22], that utilize convolutional networks for feature extraction and association inference, but they suffer from high computational cost due to their large number of trainable parameters.…”
Section: Introduction
confidence: 99%
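
As a rough illustration of the Random Walk with Restart scoring the quoted passage refers to, the sketch below runs RWR over a toy combined miRNA–disease network. The adjacency matrix, node indexing, and restart probability are illustrative assumptions only, not values or code from TLNPMD or the cited works.

```python
# Minimal Random Walk with Restart (RWR) sketch on a small miRNA-disease network.
# All numbers here are placeholders chosen for illustration.
import numpy as np

def random_walk_with_restart(adj, seed, restart_prob=0.7, tol=1e-10, max_iter=1000):
    """Iterate p <- (1 - r) * W p + r * p0 until convergence,
    where W is the column-normalized adjacency matrix.

    adj  : (n, n) non-negative adjacency/similarity matrix of the network
    seed : (n,) restart vector, e.g. a one-hot indicator of a query node
    """
    col_sums = adj.sum(axis=0)
    col_sums[col_sums == 0] = 1.0          # avoid division by zero for isolated nodes
    W = adj / col_sums                     # column-stochastic transition matrix

    p0 = seed / seed.sum()                 # normalized restart distribution
    p = p0.copy()
    for _ in range(max_iter):
        p_next = (1 - restart_prob) * W @ p + restart_prob * p0
        if np.abs(p_next - p).sum() < tol:
            break
        p = p_next
    return p                               # steady-state scores: higher = closer to the seed

# Toy example: 3 miRNA nodes and 2 disease nodes in one combined network.
A = np.array([
    [0, 1, 0, 1, 0],
    [1, 0, 1, 0, 1],
    [0, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 1, 1, 1, 0],
], dtype=float)
seed = np.zeros(5)
seed[3] = 1.0                              # query: the disease node at index 3
scores = random_walk_with_restart(A, seed)
print(scores)                              # miRNA nodes with high scores are candidate associations
```

Ranking the miRNA entries of the returned score vector gives the predicted disease-related miRNAs; KATZ-style scoring would instead sum weighted walks of all lengths from the closed-form series over the adjacency matrix.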