2016 IEEE International Conference on Big Data (Big Data)
DOI: 10.1109/bigdata.2016.7840675
Towards understanding word embeddings: Automatically explaining similarity of terms

Cited by 16 publications (8 citation statements)
References 25 publications
“…Given a keyword, for example, the name of a drug, this method formulated a feature vector that best predicts a window of surrounding words that occur in some meaningful context. Such semantic similarity also conforms to the important criteria for selecting good word pairs [23].…”
Section: Experimental Design (mentioning)
Confidence: 56%
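The statement above describes the skip-gram idea: each keyword's feature vector is trained to predict the words in a window surrounding it. A minimal sketch of how such (target, context) training pairs are generated, with an illustrative sentence and window size of my own choosing:

```python
# Sketch: generating (target, context) training pairs as in skip-gram,
# where a word's vector is trained to predict its surrounding window.
# The corpus and window size here are illustrative assumptions.

def skipgram_pairs(tokens, window=2):
    """Pair every token with each neighbour within `window` positions."""
    pairs = []
    for i, target in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

sentence = "the drug reduces blood pressure".split()
pairs = skipgram_pairs(sentence, window=1)
# With window=1, "drug" is paired with "the" and "reduces", etc.
```

These pairs are what the embedding model is actually fit on; words sharing many contexts end up with similar vectors, which is the semantic-similarity property the citing papers rely on.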
“…Given a keyword, for example, the name of a drug, this method formulated a feature vector that best predicts a window of surrounding words that occur in some meaningful context. Such semantic similarity also conforms to the important criteria for selecting good word pairs [34]. When training on the dataset, the parameters required by word2vec were the word frequency (the minimum number of times a word must appear in the corpus), the layer size (the number of desired features in the word vector) and the window size (the number of words before and after the target word to extract for the training sample).…”
Section: Word Embedding Model (mentioning)
Confidence: 86%
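The three parameters named in the quote (word frequency, layer size, window size) can be sketched in pure Python. The names `min_count` and `vector_size` follow gensim's conventions, which is an assumption — the cited work does not name a specific library:

```python
# Sketch of the word2vec parameters described above. Hypothetical names:
# min_count (word frequency cutoff), vector_size (layer size / number of
# features per word vector); the cited work names no specific library.
import random
from collections import Counter

def build_vocab(tokens, min_count=2):
    """Keep only words appearing at least min_count times in the corpus."""
    freq = Counter(tokens)
    return {w for w, c in freq.items() if c >= min_count}

def init_vectors(vocab, vector_size=4, seed=0):
    """Initialise one vector of vector_size features per vocabulary word."""
    rng = random.Random(seed)
    return {w: [rng.uniform(-0.5, 0.5) for _ in range(vector_size)]
            for w in vocab}

tokens = "the drug the dose the trial drug".split()
vocab = build_vocab(tokens, min_count=2)   # rare words are dropped
vectors = init_vectors(vocab, vector_size=4)
```

Training then adjusts these initial vectors so each word predicts its window, per the skip-gram objective; only the vocabulary filtering and vector shape are shown here.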
“…Recent work suggests that intrinsic and extrinsic measures correlate poorly with one another (Schnabel et al., 2015; Gladkova and Drozd, 2016; Zhang et al., 2016). In many cases we want an embedding not just to capture relationships within the data, but also to do so in a way which can be usefully applied.…”
Section: Embedding Evaluation (mentioning)
Confidence: 99%
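The intrinsic measures mentioned in this statement typically score embeddings by cosine similarity between word vectors on similarity benchmarks; extrinsic measures instead score a downstream task. A minimal sketch of the intrinsic side, with toy vectors of my own choosing:

```python
# Sketch: cosine similarity, the standard intrinsic comparison between
# word vectors. The vectors below are toy values, not trained embeddings.
import math

def cosine(u, v):
    """Cosine of the angle between vectors u and v."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

print(cosine([1.0, 0.0], [1.0, 0.0]))  # identical direction -> 1.0
print(cosine([1.0, 0.0], [0.0, 1.0]))  # orthogonal vectors -> 0.0
```

The poor correlation reported by the citing paper means a high score here need not translate into gains on a downstream (extrinsic) task.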