2019
DOI: 10.1016/j.engappai.2019.07.010

A reproducible survey on word embeddings and ontology-based methods for word similarity: Linear combinations outperform the state of the art

Cited by 76 publications (85 citation statements)
References 78 publications
“…Lastra‐Díaz et al. stated that the combination of word embeddings and ontologies yielded the best overall results on word similarity and relatedness according to their experiments (Lastra‐Díaz et al., ). Ali et al. proposed an ontology- and LDA-based topic modelling and word embedding system to enhance the performance of document representation and sentiment classification (Ali et al., ).…”
Section: Discussion
mentioning
confidence: 99%
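The combination highlighted in the statement above can be illustrated with a minimal sketch, not the survey's implementation: an embedding-based cosine similarity and an ontology-based (WordNet Wu-Palmer) similarity are blended by a linear combination, which is the general idea behind the linear combinations the survey evaluates. The vector file path, its word2vec-style text format, and the mixing weight alpha are illustrative assumptions.

# Minimal sketch, not the survey's implementation: blending an embedding-based
# similarity with an ontology-based (WordNet) similarity by a linear combination.
# The file path, vector format, and the weight alpha are illustrative assumptions.
import numpy as np
from nltk.corpus import wordnet as wn  # requires nltk.download('wordnet')

def load_vectors(path):
    # Read a word2vec-style text file: one "word v1 v2 ... vn" entry per line.
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.array(parts[1:], dtype=float)
    return vectors

def embedding_sim(vectors, w1, w2):
    # Cosine similarity between two word vectors; 0.0 when a word is out of vocabulary.
    if w1 not in vectors or w2 not in vectors:
        return 0.0
    v1, v2 = vectors[w1], vectors[w2]
    return float(np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2)))

def wordnet_sim(w1, w2):
    # Best Wu-Palmer similarity over all noun-sense pairs; 0.0 when undefined.
    scores = [s1.wup_similarity(s2) or 0.0
              for s1 in wn.synsets(w1, pos=wn.NOUN)
              for s2 in wn.synsets(w2, pos=wn.NOUN)]
    return max(scores, default=0.0)

def combined_sim(vectors, w1, w2, alpha=0.5):
    # Linear combination of the two scores; alpha is a tunable mixing weight.
    return alpha * embedding_sim(vectors, w1, w2) + (1.0 - alpha) * wordnet_sim(w1, w2)

For example, combined_sim(vectors, "car", "automobile", alpha=0.6) would weight the corpus-based evidence slightly above the ontology-based score; in practice the weight would be tuned on a development benchmark.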
“…Specialized clinical text embeddings have been used to improve clinical named entity recognition (75), resolve abbreviations in clinical text (76), expand a structured lexicon of radiology terms (77) and build a lexicon of dietary supplements (78). Second, an embedding can incorporate structured information beyond what is found in the text (79), and embeddings have been created to represent CUIs (80), documents (81,82), or entire patient records (83). Any task in which the notion of similarity is important, particularly when that similarity is based on patterns in text, can probably benefit from embeddings.…”
Section: Word, Phrase and Character Embeddings
mentioning
confidence: 99%
“…Indeed, for word embeddings there is an increase in methods for evaluating word representations intrinsically, such as (Schnabel et al., 2015). All these techniques are thoroughly summarized by the survey (Lastra-Díaz et al., 2019). This is a fruitful direction, because with such evaluation metrics we can validate the knowledge acquired by all trained embedding spaces.…”
Section: Introduction
mentioning
confidence: 99%
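As a companion to the intrinsic-evaluation point in the statement above, here is a minimal sketch of the usual protocol for word similarity: a similarity function's scores are rank-correlated with human ratings from a benchmark. The benchmark file name and its word1,word2,rating CSV layout are illustrative assumptions.

# Minimal sketch of an intrinsic word-similarity evaluation: rank-correlate a
# similarity function's scores with human ratings from a benchmark file.
# The file name and its "word1,word2,rating" CSV layout are illustrative assumptions.
import csv
from scipy.stats import spearmanr

def evaluate_intrinsic(sim_fn, benchmark_path):
    # Return the Spearman correlation between predicted scores and human ratings.
    human, predicted = [], []
    with open(benchmark_path, encoding="utf-8") as f:
        for word1, word2, rating in csv.reader(f):
            human.append(float(rating))
            predicted.append(sim_fn(word1, word2))
    rho, _ = spearmanr(human, predicted)
    return rho

# Hypothetical usage with the combined similarity sketched earlier:
# rho = evaluate_intrinsic(lambda a, b: combined_sim(vectors, a, b), "simlex999.csv")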