2021
DOI: 10.1186/s12859-021-04346-7

Mining microbe–disease interactions from literature via a transfer learning model

Abstract: Background Interactions of microbes and diseases are of great importance for biomedical research. However, a large number of microbe–disease interactions remain hidden in the biomedical literature, and structured databases for microbe–disease interactions are limited. In this paper, we aim to construct a large-scale database of microbe–disease interactions automatically. We attained this goal by applying text mining methods based on a deep learning model with a moderate curation cost. We…


Cited by 10 publications (12 citation statements)
References 50 publications
“…We also did not compare the quality of the networks with existing hand-curated manual efforts in this direction; it will be interesting future work to perform a data-driven comparison to assess the performance. Additionally, it would be interesting to use these embeddings to infer causality or to infer positive or negative correlations between concepts (such as in Wu et al. (2021)). Additionally, in this study we only established the quality of the learnt semantic relationships empirically, since our focus was only to establish the potential of text mining algorithms to derive such relationships with no pre-training or domain understanding; a more rigorous analysis of the networks is interesting future work.…”
Section: Discussion
confidence: 99%
“…Parameter transfer is already commonly used in NLP tasks; it assumes that some parameters are shared between source tasks and target tasks, or that the model hyperparameters share a prior distribution [10]. This also enables good accuracy when transferring the original model to the new domain [11,12]. However, there are also problems with negative transfer.…”
Section: Transfer Learning
confidence: 99%
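The parameter-transfer mechanism described in the excerpt above — sharing some parameters between a source task and a target task, then fine-tuning only the task-specific part — can be illustrated with a minimal sketch. This is not the cited papers' implementation; the layer names ("encoder", "classifier") and the dict-based model representation are purely illustrative assumptions.

```python
# Minimal sketch of parameter transfer between a source task and a target
# task. Models are simulated as dicts of layer-name -> parameter dict;
# real systems would use neural network weights instead.

def transfer_parameters(source_model, target_model, shared_layers):
    """Copy the shared layers' parameters from source to target and mark
    them as frozen, so that only the task-specific layers (e.g. the
    classifier head) would be updated during fine-tuning."""
    frozen = set()
    for layer in shared_layers:
        target_model[layer] = dict(source_model[layer])  # copy weights over
        frozen.add(layer)
    return target_model, frozen

# Source model trained on a large corpus; target model starts untrained.
source = {"encoder": {"w": 0.8}, "classifier": {"w": 0.1}}
target = {"encoder": {"w": 0.0}, "classifier": {"w": 0.0}}

# Share the encoder; the classifier stays trainable for the new domain.
target, frozen = transfer_parameters(source, target, ["encoder"])

print(target["encoder"]["w"])   # 0.8 — encoder now carries source weights
print("classifier" in frozen)   # False — the head remains trainable
```

The design choice this sketch mirrors is the one the excerpt attributes to NLP transfer learning: the expensive, general-purpose representation (the encoder) is reused, while only a small task-specific head is trained on the new domain, which is also why negative transfer can occur when the source and target domains diverge.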
“…Next is the work by Wu et al. [24], which focuses on a deep-learning strategy for solving this problem. Their approach first involves preparing training datasets.…”
Section: Introduction
confidence: 99%
“…Secondly, our approach highlights the relevance of deep learning and transformer models in reducing the requirement for large amounts of training data. In contrast to the study by [24] or other deep learning models, our method benefits from transfer learning and task-specific fine-tuning. This advantage enables us to achieve excellent results with a smaller amount of training data, making our approach efficient and practical.…”
Section: Introduction
confidence: 99%