2019
DOI: 10.1609/aaai.v33i01.33016212

AutoSense Model for Word Sense Induction

Abstract: Word sense induction (WSI), or the task of automatically discovering multiple senses or meanings of a word, has three main challenges: domain adaptability, novel sense detection, and sense granularity flexibility. While current latent variable models are known to solve the first two challenges, they are not flexible to different word sense granularities, which differ very much among words, from aardvark with one sense, to play with over 50 senses. Current models either require hyperparameter tuning or nonparam…

Cited by 8 publications (3 citation statements)
References 8 publications
“…The relevance of graphical approaches is proved by the fact that increasingly sophisticated graphical models dominated the state-of-the-art results of WSI up until recently (e.g. Lau, Cook & Baldwin 2013; Wang et al. 2015; Komninos & Manandhar 2016; Amplayo, Hwang & Song 2019; cf. Amrami & Goldberg 2019).…”
Section: International and Hungarian Background
Mentioning; confidence: 99%
“…Here, the definition of context may vary from window-based context to latent topic-alike context. Afterwards, the resulting clusters are either used as senses directly (Kutuzov, 2018), or employed further to learn sense embeddings via Chinese Restaurant Process algorithm (Li and Jurafsky, 2015), AdaGram, a Bayesian extension of the Skip-Gram model (Bartunov et al, 2016), AutoSense, an extension of the LDA topic model (Amplayo et al, 2019), and other techniques.…”
Section: Related Work
Mentioning; confidence: 99%
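
The statement above describes the common cluster-the-contexts route to word sense induction: represent each occurrence of an ambiguous word by its context, cluster the contexts, and treat each cluster as an induced sense. A minimal sketch of that general idea, assuming scikit-learn, toy contexts for the word "bank", and a hand-picked cluster count (exactly the granularity choice AutoSense tries to avoid fixing), might look like this; it illustrates the approach, not the AutoSense model itself:

```python
# Minimal sketch of clustering-based WSI: embed each context of an
# ambiguous word (here with simple TF-IDF bag-of-words vectors) and
# cluster the contexts; each cluster stands in for one induced sense.
# Toy data and k=2 are illustrative assumptions only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy contexts of the ambiguous word "bank".
contexts = [
    "the bank raised interest rates on savings accounts",
    "she deposited the cheque at the bank before noon",
    "the river bank was overgrown with reeds",
    "we had a picnic on the grassy bank of the stream",
]

# Represent each context as a TF-IDF vector over its words.
vectors = TfidfVectorizer(stop_words="english").fit_transform(contexts)

# Cluster the contexts; the fixed number of clusters is the sense
# granularity, which richer models infer rather than set by hand.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for label, context in zip(labels, contexts):
    print(f"sense {label}: {context}")
```

Richer systems swap in window-based or topic-like context representations and nonparametric clustering so the number of senses need not be fixed in advance.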
“…For instance, [13] relies on the Hierarchical Dirichlet Process and [6] employs the Stick-breaking Process. In [2] a rather complicated custom graphical model is proposed which aims at solving the sense granularity problem. Graph Clustering methods like [21,9] first build a graph with nodes corresponding to words, and weighted edges representing semantic similarity strength or co-occurrence frequency.…”
Section: Related Work
Mentioning; confidence: 99%
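
The graph-clustering family mentioned in the statement above builds a graph whose nodes are words and whose weighted edges reflect semantic similarity or co-occurrence, then reads senses off the graph's clusters. A minimal sketch of that approach, assuming networkx, toy co-occurrence data, and a modularity-based community detector (none of which come from the works cited as [21] or [9]):

```python
# Minimal sketch of graph-based WSI: build a co-occurrence graph over
# words appearing in contexts of the ambiguous target, weight edges by
# co-occurrence counts, and treat graph communities as induced senses.
from itertools import combinations
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy contexts of the ambiguous word "bank" (illustrative data only).
contexts = [
    ["interest", "rates", "savings", "account"],
    ["deposit", "cheque", "account", "interest"],
    ["river", "reeds", "water", "grass"],
    ["picnic", "grass", "river", "water"],
]

# Co-occurrence graph: nodes are context words, edge weights count how
# often two words appear in the same context.
graph = nx.Graph()
for words in contexts:
    for u, v in combinations(sorted(set(words)), 2):
        if graph.has_edge(u, v):
            graph[u][v]["weight"] += 1
        else:
            graph.add_edge(u, v, weight=1)

# Communities in the graph stand in for senses of the target word.
for i, community in enumerate(greedy_modularity_communities(graph, weight="weight")):
    print(f"sense {i}: {sorted(community)}")
```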