Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017)
DOI: 10.18653/v1/k17-1036

German in Flux: Detecting Metaphoric Change via Word Entropy

Abstract: This paper explores the information-theoretic measure entropy to detect metaphoric change, transferring ideas from hypernym detection to research on language change. We also build the first diachronic test set for German as a standard for metaphoric change annotation. Our model shows high performance, is unsupervised, language-independent, and generalizable to other processes of semantic change.

Cited by 21 publications (22 citation statements), published between 2019 and 2024. References 34 publications (42 reference statements).

“…Evaluation: Due to a lack of proper evaluation methods and datasets, all papers above have performed different, non-comparable evaluations. Previous evaluation procedures mainly tackle a few words: case studies of individual words (Wijaya and Yeniterzi, 2011; Jatowt and Duh, 2014; Hamilton et al., 2016a), or a comparison between a few changing and semantically stable words (Lau et al., 2012; Schlechtweg et al., 2017). Other works focus on the post hoc evaluation of their respective models (Kulkarni et al., 2015; Eger and Mehler, 2016).…”
Section: Related Work
confidence: 99%
“…VH is based on Shannon's entropy (Shannon, 1948), which measures the unpredictability of w's co-occurrences (Schlechtweg et al., 2017). HD is defined as…”
Section: Dispersion Measures
confidence: 99%
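
The entropy measure referenced in the statement above can be made concrete with a short sketch. The following Python snippet is illustrative only: the function and variable names are not from the paper, and the co-occurrence counts are invented. It computes Shannon entropy over a word's co-occurrence distribution, the quantity this citing paper attributes to Schlechtweg et al. (2017):

```python
import math
from collections import Counter

def word_entropy(cooccurrence_counts):
    """Shannon entropy (Shannon, 1948) of a word's co-occurrence
    distribution; higher entropy means the word's contexts are
    less predictable."""
    total = sum(cooccurrence_counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in cooccurrence_counts.values())

# Invented co-occurrence counts for one target word at two periods.
counts_t1 = Counter({"Wasser": 40, "Fluss": 35, "Ufer": 25})
counts_t2 = Counter({"Wasser": 20, "Fluss": 15, "Geld": 30,
                     "Zeit": 20, "Ideen": 15})

# A rise in entropy over time can indicate a broadening of contexts,
# e.g. through metaphoric change.
print(word_entropy(counts_t1), word_entropy(counts_t2))
```
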
“…In (i), semantic vector spaces, each word is represented as two vectors reflecting its co-occurrence statistics at different periods of time (Gulordava and Baroni, 2011; Kim et al., 2014; Xu and Kemp, 2015; Eger and Mehler, 2016; Hamilton et al., 2016a,b; Hellrich and Hahn, 2016; Rosenfeld and Erk, 2018). LSC is typically measured by the cosine distance (or some alternative metric) between the two vectors, or by differences in contextual dispersion between the two vectors (Kisselew et al., 2016; Schlechtweg et al., 2017). (ii) Diachronic topic models infer a probability distribution for each word over different word senses (or topics), which are in turn modeled as a distribution over words (Wang and McCallum, 2006; Bamman and Crane, 2011; Wijaya and Yeniterzi, 2011; Lau et al., 2012; Mihalcea and Nastase, 2012; Cook et al., 2014; Frermann and Lapata, 2016).…”
Section: Related Work
confidence: 99%
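
For the vector-space approach described in (i), here is a minimal sketch of the cosine-distance comparison, assuming two co-occurrence count vectors for the same word over a shared context vocabulary. The names and numbers are illustrative assumptions, not taken from any cited paper:

```python
import math

def cosine_distance(v1, v2):
    """1 minus cosine similarity of two count vectors over the same
    context vocabulary; values near 1 indicate strong contextual change."""
    dot = sum(a * b for a, b in zip(v1, v2))
    norm1 = math.sqrt(sum(a * a for a in v1))
    norm2 = math.sqrt(sum(b * b for b in v2))
    return 1.0 - dot / (norm1 * norm2)

# Invented co-occurrence counts for one target word over a fixed
# context vocabulary, at two time periods.
vec_t1 = [40, 35, 25, 0, 0]
vec_t2 = [20, 15, 5, 30, 20]

# A larger distance signals greater lexical semantic change (LSC).
print(cosine_distance(vec_t1, vec_t2))
```
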
“…This research broadly falls into two categories. On the one hand, there are methods proposals and critiques accompanied by exploratory results (Dubossarsky et al., 2017; Frermann and Lapata, 2016; Gulordava and Baroni, 2011; Hamilton et al., 2016b; Jatowt and Duh, 2014; Kulkarni et al., 2015; Sagi et al., 2011; Schlechtweg et al., 2017; Wijaya and Yeniterzi, 2011). On the other, there are applications of these methods, usually with more specific linguistic questions in mind (Dautriche et al., 2016; Dubossarsky et al., 2016; Hamilton et al., 2016a; Perek, 2016; Rodda et al., 2016; Xu and Kemp, 2015).…”
Section: Previous Research
confidence: 99%