2020
DOI: 10.2196/16948

Semantic Deep Learning: Prior Knowledge and a Type of Four-Term Embedding Analogy to Acquire Treatments for Well-Known Diseases

Abstract: Background How to treat a disease remains the most common type of clinical question. Obtaining evidence-based answers from biomedical literature is difficult. Analogical reasoning with embeddings from deep learning (embedding analogies) may extract such biomedical facts, although the state-of-the-art focuses on pair-based proportional (pairwise) analogies such as man:woman::king:queen (“queen = −man +king +woman”). Objective This study aimed to sy…
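The pairwise embedding analogy quoted in the abstract (“queen = −man +king +woman”) is standard word-vector arithmetic. As an illustration only, and not the paper's biomedical pipeline, the following Python sketch reproduces it with gensim's pretrained Google News word2vec vectors; the library call and the pretrained-model name are assumptions rather than details taken from the paper.

# Minimal sketch of a pair-based proportional (pairwise) analogy a:b::c:d,
# solved as d ≈ -a + b + c, e.g. queen ≈ -man + king + woman.
# Assumes gensim and its downloader are installed; these are generic English
# vectors, not the paper's biomedical embeddings.
import gensim.downloader as api

vectors = api.load("word2vec-google-news-300")  # pretrained word2vec vectors

# most_similar performs the vector arithmetic +king +woman -man
# and returns the nearest words to the resulting vector.
result = vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=3)
print(result)  # 'queen' is expected at or near the top

The same arithmetic underlies the disease-to-treatment analogies the paper investigates, with biomedical corpora supplying the embeddings.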

Cited by 3 publications (2 citation statements)
References 43 publications
“…The CBOW (Continuous Bag-Of-Words) model from the word-to-vector (word2vec) technique proposed by Mikolov et al [22] was used to generate sentence-to-vector (sent2vec) representations. CBOW predicts the probability of a word from the surrounding words within a window of a specific size [23]. This study used a window of four and 400 dimensions.…”
Section: Sentence To Vector (mentioning)
confidence: 99%
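As a rough illustration of the configuration described in this excerpt (CBOW word2vec, a window of four, 400 dimensions), the Python sketch below trains a CBOW model with gensim and approximates a sentence vector by averaging word vectors; the averaging step and the toy corpus are assumptions standing in for the citing study's sent2vec procedure, which is not reproduced here.

# Sketch: CBOW word2vec with window=4 and 400 dimensions (sg=0 selects CBOW).
# Sentence vectors are approximated by averaging word vectors; this is a
# simplified stand-in for the citing study's sent2vec step, not its exact method.
import numpy as np
from gensim.models import Word2Vec

sentences = [
    ["treatment", "of", "diabetes", "with", "metformin"],
    ["insulin", "is", "used", "to", "treat", "diabetes"],
]  # toy corpus for illustration only

model = Word2Vec(sentences, vector_size=400, window=4, sg=0, min_count=1)

def sentence_vector(tokens, model):
    # Average the CBOW vectors of the in-vocabulary tokens.
    vecs = [model.wv[t] for t in tokens if t in model.wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(model.vector_size)

print(sentence_vector(sentences[0], model).shape)  # (400,)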
“…For instance, understanding lexical relations is an important prerequisite for understanding the meaning of compound nouns. Moreover, the ability of word vectors to capture semantic relations has enabled a wide range of applications beyond NLP, including flexible querying of relational databases (Bordawekar and Shmueli, 2017), schema matching (Fernandez et al, 2018), completion and retrieval of Web tables (Zhang et al, 2019), ontology completion (Bouraoui and Schockaert, 2019) and information retrieval in the medical domain (Arguello Casteleiro et al, 2020). More generally, relational similarity (or analogy) plays a central role in computational creativity (Goel, 2019), legal reasoning (Ashley, 1988; Walton, 2010), ontology alignment (Raad and Evermann, 2015) and instance-based learning (Miclet et al, 2008).…”
Footnote from the citing paper: Source code to reproduce our experimental results and the model checkpoints are available in the following repository: https://github.com/asahi417/relbert
Section: Introduction (mentioning)
confidence: 99%