2015
DOI: 10.1016/j.lingua.2014.12.001
Monitoring polysemy: Word space models as a tool for large-scale lexical semantic analysis

Cited by 66 publications (37 citation statements)
References 11 publications
“…DS is an active and lively research area in semantics, addressing a wide range of topics related to meaning. In addition to those analyzed in this review, further research issues in which DS is producing very interesting results include (a) the development of multimodal DSMs (Feng & Lapata 2010, Bruni et al. 2014) that integrate corpus-derived features and features extracted from images, which are also used to explore the interplay between linguistic and experiential information; (b) the study of polysemy, which uses DSMs to induce and represent different senses from the distributional properties of lexical items (Schütze 1997, Heylen et al. 2015); and (c) the analysis of semantic change, which involves applying DS to diachronic corpora (Hamilton et al. 2016, Rodda et al. 2017) and investigating the change in productivity of syntactic constructions (Perek 2016). DS is a framework for semantic analysis that can provide new answers to classical semantic questions, as well as address problems that have often been ignored by other models of the lexicon.…”
Section: Conclusion and Future Challenges (mentioning)
confidence: 99%
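The count-based DSM pipeline that the excerpt above alludes to (co-occurrence counting over a corpus, association weighting, vector comparison) can be sketched in a few lines. This is a minimal illustration with a toy corpus and PPMI weighting, a common weighting choice in count-based models; the corpus, window size, and words compared are all invented for the example and are not from the cited work.

```python
import numpy as np

# Toy corpus; real analyses of this kind use corpora of millions of words.
corpus = [
    "the bank approved the loan".split(),
    "the bank raised interest rates".split(),
    "the river bank was muddy".split(),
    "the muddy river flooded".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a +/-2 word window.
counts = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i in range(len(sent)):
        for j in range(max(0, i - 2), min(len(sent), i + 3)):
            if i != j:
                counts[idx[sent[i]], idx[sent[j]]] += 1

# Positive pointwise mutual information (PPMI) weighting.
total = counts.sum()
row = counts.sum(axis=1, keepdims=True)
col = counts.sum(axis=0, keepdims=True)
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log((counts * total) / (row * col))
ppmi = np.where(np.isfinite(pmi) & (pmi > 0), pmi, 0.0)

def cosine(a, b):
    # Cosine similarity between two word vectors.
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

sim = cosine(ppmi[idx["loan"]], ppmi[idx["bank"]])
```

Because PPMI vectors are non-negative, the resulting cosine similarity always falls in [0, 1]; words sharing more contexts score higher.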
“…Thus, the (bold) claim of this paper is that no specialised lexical semantic rules are required to account for systematic polysemy; instead, the phenomenon has its origin in the operation of more general lexical pragmatic processes. It should be noted that recent computational semantic approaches, particularly work on word meaning in distributional semantics (for overviews, see Clark 2015, Erk 2012), abandon lexical rules in favour of a more empirically oriented approach where large-scale corpus analyses are used to create predictive models for the distribution of a word's (attested) senses (e.g., Boleda et al. 2012, Heylen et al. 2015). While this shift of focus from stipulated lexical rules to models based on actual word usage represents a promising development within computational lexical semantics, there are still many unresolved (and underexplored) issues when it comes to accounting for systematic polysemy.…”
Section: Introduction (mentioning)
confidence: 99%
“…This paper showcases token-based semantic vector spaces (Schütze 1998; Heylen et al. 2012, 2015) as a tool for corpus-linguistic analyses. More specifically, it is the aim of this paper to demonstrate how this technique can be applied to linguistic research questions that address theoretical claims.…”
Section: Introduction (mentioning)
confidence: 99%
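The token-based approach cited above represents each individual occurrence of a word as its own vector and then groups occurrences into candidate senses. A common way to do this is to cluster the token vectors; the sketch below uses synthetic token vectors and a hand-rolled two-cluster k-means purely for illustration. The two artificial clusters, the dimensionality, and the farthest-point initialisation are all assumptions of the example, not details of the cited method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical token vectors for 20 occurrences of one ambiguous word:
# two artificial "sense" clusters in a 5-dimensional space.
tokens = np.vstack([
    rng.normal(loc=0.0, scale=0.1, size=(10, 5)),
    rng.normal(loc=1.0, scale=0.1, size=(10, 5)),
])

def two_means(X, iters=20):
    # Two-cluster k-means. Initialise with the first token and the
    # token farthest from it (adequate for this toy example only).
    c0 = X[0]
    c1 = X[np.argmax(((X - c0) ** 2).sum(axis=1))]
    centers = np.stack([c0, c1])
    for _ in range(iters):
        # Assign each token to its nearest centroid.
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        # Recompute centroids from the assigned tokens.
        for c in range(2):
            if (labels == c).any():
                centers[c] = X[labels == c].mean(axis=0)
    return labels

labels = two_means(tokens)
```

Each cluster of token vectors is then read as one induced sense of the word; in real applications the token vectors come from corpus contexts rather than from a random generator.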