2019
DOI: 10.1177/1745691619861372

Vector-Space Models of Semantic Representation From a Cognitive Perspective: A Discussion of Common Misconceptions

Abstract: Models that represent meaning as high-dimensional numerical vectors—such as latent semantic analysis (LSA), hyperspace analogue to language (HAL), bound encoding of the aggregate language environment (BEAGLE), topic models, global vectors (GloVe), and word2vec—have been introduced as extremely powerful machine-learning proxies for human semantic representations and have seen an explosive rise in popularity over the past two decades. However, despite their considerable advancements and spread in the cognitive sci…

Cited by 212 publications (259 citation statements)
References 178 publications (394 reference statements)
“…Research in distributed representations, when combined with an approach to machine learning called artificial neural networks (ANNs), has yielded a strategy to model the meaning encoded in language, called word2vec. Word2vec also stands out from other computational methods for analyzing text, both for its performance on linguistic tasks, such as recognizing analogies and synonyms, and for mirroring aspects of human cognition (Günther et al., 2019).…”
Section: Modeling Meaning With Artificial Neural Networks
confidence: 99%
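The analogy and synonym performance mentioned in this statement can be probed directly once a word2vec model is available. The following is a minimal sketch, not code from the cited papers, using gensim's KeyedVectors interface; the pretrained vector file name is an assumption, and any embeddings in word2vec format would work:

```python
# Illustrative sketch (assumed setup, not the cited papers' code):
# probing word2vec embeddings on analogy and synonym tasks with gensim.
from gensim.models import KeyedVectors

# Load pretrained embeddings; the file name here is an assumption
# (e.g., the publicly distributed Google News vectors).
vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

# Analogy task: king - man + woman should land near "queen",
# solved by vector arithmetic plus cosine similarity over the vocabulary.
print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1))

# Synonym-style probe: nearest neighbors of a word in embedding space.
print(vectors.most_similar("happy", topn=3))
```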
“…Since word2vec with CBOW learns word-vectors by predicting words given their contexts, words that occur in similar contexts are represented with similar word-vectors.[8] Word2vec is theoretically motivated by the linguistic…”
Footnote 5: Equivalently, to the activation of the hidden layer triggered by a particular word (Günther et al., 2019).
Footnote 6: In a word2vec model, the N specific basis vectors (“axes”) picked out by the learning process are arbitrary, but the word embeddings encode a (latent) coordinate system in which dimensions correspond to salient dimensions of language.
Section: Word2vec
confidence: 99%
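To make the CBOW claim in this statement concrete, that words occurring in similar contexts receive similar vectors, here is a minimal sketch assuming gensim and a toy corpus of my own construction; the corpus and hyperparameters are illustrative, not taken from the cited papers:

```python
# Minimal sketch (assumed setup): training word2vec with CBOW in gensim
# on a toy corpus where "cat" and "dog" occur in interchangeable contexts,
# so their learned vectors should end up similar.
from gensim.models import Word2Vec

sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "mat"],
    ["the", "cat", "chased", "the", "ball"],
    ["the", "dog", "chased", "the", "ball"],
] * 50  # repeat so the toy model sees enough training examples

# sg=0 selects CBOW: the model predicts a word from its surrounding context.
model = Word2Vec(
    sentences, vector_size=32, window=2, min_count=1, sg=0, epochs=20
)

# Words sharing contexts should show high cosine similarity.
print(model.wv.similarity("cat", "dog"))   # relatively high
print(model.wv.similarity("cat", "ball"))  # lower
```

The interchangeable contexts are what drive the effect: because "cat" and "dog" are predicted from identical surrounding words, CBOW pushes their hidden-layer representations together.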