2010
DOI: 10.1111/j.1551-6709.2010.01106.x
Composition in Distributional Models of Semantics

Abstract: Distributional models of semantics have proven themselves invaluable both in cognitive modelling of semantic phenomena and in practical applications. For example, they have been used to model judgments of semantic similarity (McDonald, 2000) and association (Denhière and Lemaire, 2004; Griffiths et al., 2007) and have been shown to achieve human-level performance on synonymy tests (Landauer and Dumais, 1997; Griffiths et al., 2007) such as those included in the Test of English as a Foreign Language (TOEFL). Th…

Cited by 692 publications (782 citation statements)
References 162 publications (338 reference statements)
“…In that study, the authors obtained document vector representations by deploying simple composition functions (e.g. min, average, max) to construct vector representations of combinations of words, such as phrases or sentences, from term vector models [20]. They showed that these compositional document vectors could be effectively used as features to extend text classification and improve classification performance.…”
Section: Related Work
confidence: 99%
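The simple composition functions described in that excerpt (element-wise min, average, max over term vectors) can be sketched as follows. This is a minimal illustration with hypothetical toy vectors, not the model or data used in the cited study.

```python
import numpy as np

# Hypothetical toy term vectors for illustration only.
word_vecs = {
    "red":  np.array([0.9, 0.1, 0.3]),
    "moon": np.array([0.2, 0.8, 0.5]),
    "sky":  np.array([0.1, 0.7, 0.9]),
}

def compose(words, how="average"):
    """Build a phrase/document vector via element-wise min, average, or max."""
    vecs = np.stack([word_vecs[w] for w in words])
    if how == "min":
        return vecs.min(axis=0)
    if how == "max":
        return vecs.max(axis=0)
    return vecs.mean(axis=0)

# The resulting document vector can then serve as a feature vector
# for a downstream text classifier.
doc_vec = compose(["red", "moon", "sky"], how="max")
```

Each composition keeps the dimensionality of the word vectors, so documents of any length map to a fixed-size feature vector.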
“…They showed that these compositional document vectors could be effectively used as features to extend text classification and improve classification performance. In this work, we follow the methodology of [19,20] and compose document representations from word embeddings in the task of sensitivity classification. However, differently from Balikas and Amini [19], we show how these document representations combined with text features can be effective for discovering latent sensitivities.…”
Section: Related Work
confidence: 99%
“…Recently, researchers have begun to explore compositional distributional semantics, giving a distributional representation not only to words but also to phrases and even sentences (Mitchell and Lapata 2010; Coecke et al 2011; Socher et al 2012; Baroni et al 2014a; Pham 2016 among many others); the previous work we presented at the end of Sect. 3.2 falls into this line of research.…”
Section: Conceptually Afforded Composition With Distributional Semantics
confidence: 99%
“…A very simple but stubbornly effective method is to simply add up the word vectors, as in Fig. 4 (Mitchell and Lapata 2010), but more sophisticated methods have been designed that sometimes yield better results (Baroni and Zamparelli 2010). Nothing we say in this paper depends on the chosen method for composition, hence we will simply use comp(red⃗, moon⃗) for the distributional representation of the phrase red moon obtained by applying a composition function to its constituent word vectors, red⃗ and moon⃗ (we represent word vectors with an overhead arrow).…”
Section: Conceptually Afforded Composition With Distributional Semantics
confidence: 99%
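The additive method mentioned in that excerpt — summing the word vectors, as in Mitchell and Lapata (2010) — is a one-liner. The vectors below are hypothetical toy values used purely for illustration.

```python
import numpy as np

def comp(*vectors):
    """Additive composition: the phrase vector is the sum of its word vectors."""
    return np.sum(vectors, axis=0)

red = np.array([0.9, 0.1, 0.3])   # toy vector for "red"
moon = np.array([0.2, 0.8, 0.5])  # toy vector for "moon"

red_moon = comp(red, moon)        # distributional representation of "red moon"
```

Because addition is commutative, this model ignores word order ("red moon" and "moon red" get the same vector), which is one reason the more sophisticated methods cited above can do better.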
“…Co-occurrence statistics make it possible to quantify some of a word's semantics as grounded in users' categorization performance ([18]). Collocations reveal the most frequent (sometimes stereotypical, implicit) social categories in communication, but the research must be complemented with concordance analysis to capture semantic complexity.…”
Section: Collocations and Concordance Analysis
confidence: 99%
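Collocation strength of the kind discussed in that excerpt is commonly quantified with pointwise mutual information (PMI) over co-occurrence counts. A minimal sketch, using hypothetical counts rather than any corpus from the cited work:

```python
import math

# Hypothetical corpus counts for illustration.
N = 10_000        # total observation windows in the corpus
count_w1 = 120    # occurrences of word 1
count_w2 = 80     # occurrences of word 2
count_pair = 40   # co-occurrences of the pair

def pmi(n_pair, n1, n2, n_total):
    """Pointwise mutual information: log2 of p(w1, w2) / (p(w1) * p(w2))."""
    p_pair = n_pair / n_total
    p1, p2 = n1 / n_total, n2 / n_total
    return math.log2(p_pair / (p1 * p2))

score = pmi(count_pair, count_w1, count_w2, N)  # positive => the pair collocates
```

A PMI above zero means the two words co-occur more often than chance would predict; concordance analysis then inspects those co-occurrences in context.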