Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
DOI: 10.18653/v1/2020.emnlp-demos.23
WantWords: An Open-source Online Reverse Dictionary System

Abstract: A reverse dictionary takes descriptions of words as input and outputs words semantically matching the input descriptions. Reverse dictionaries have great practical value such as solving the tip-of-the-tongue problem and helping new language learners. There have been some online reverse dictionary systems, but they support English reverse dictionary queries only and their performance is far from perfect. In this paper, we present a new open-source online reverse dictionary system named WantWords (https://wantwo…
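The core idea described in the abstract (rank dictionary words by how well their glosses match an input description) can be sketched with a toy bag-of-words similarity. A minimal sketch, assuming a hand-rolled encoder, an invented stopword list, and a hypothetical three-entry dictionary; the actual WantWords system uses learned neural encoders, not this:

```python
from collections import Counter
import math

# Tiny stopword list: an assumption for this sketch, not from the paper.
STOP = {"a", "the", "that", "in", "of", "for", "as", "from", "you", "is"}

def embed(text):
    """Bag-of-words vector over lowercase, stopword-filtered tokens.

    A toy stand-in for the learned sentence encoder a real system would use.
    """
    return Counter(t for t in text.lower().split() if t not in STOP)

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def reverse_lookup(description, dictionary, top_k=3):
    """Reverse-dictionary core: rank words by gloss/description similarity."""
    q = embed(description)
    scored = sorted(
        ((cosine(q, embed(gloss)), word) for word, gloss in dictionary.items()),
        reverse=True,
    )
    return [word for score, word in scored[:top_k] if score > 0]

# Hypothetical dictionary entries for demonstration only.
toy_dict = {
    "umbrella": "a device used for protection against rain",
    "raincoat": "a waterproof coat worn as protection against rain",
    "sunglasses": "tinted glasses that protect the eyes from sunlight",
}
print(reverse_lookup("something that keeps you dry in the rain", toy_dict, top_k=2))
# → ['umbrella', 'raincoat']
```

The design choice this illustrates is that a reverse dictionary is, at bottom, a ranking problem over (word, gloss) pairs; the systems in the paper differ mainly in how the query and glosses are encoded.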

Cited by 15 publications (17 citation statements)
References 15 publications
“…Finally, the ranks of our team are given (and the number of teams competing in a subtask). [Table of per-subtask ranks garbled in extraction; omitted.] (Bevilacqua et al., 2020; Gadetsky et al., 2018; Mickus et al., 2019; Noraset et al., 2017; Yang et al., 2020; Zhu et al., 2019). Additionally, these approaches make use of the pre-trained word embeddings that carry the semantic information extracted from a huge corpus.…”
Section: Results
confidence: 99%
“…Subsequent work on Definition Modeling focused on variants of the problem of predicting a word gloss from the word sense. These approaches consider gloss prediction based on sense-specific word embeddings (Gadetsky et al., 2018; Kabiri and Cook, 2020; Zhu et al., 2019), and on a word-based context indicating the word sense (Bevilacqua et al., 2020; Gadetsky et al., 2018; Mickus et al., 2019; Yang et al., 2020). The proposed approaches are based either on RNNs (Gadetsky et al., 2018; Kabiri and Cook, 2020; Zhu et al., 2019) or Transformers (Bevilacqua et al., 2020; Mickus et al., 2019).…”
Section: Related Work
confidence: 99%
“…Since then, a number of works have attempted to implement reverse dictionaries using neural language models. The WantWords system (Zhang et al., 2020b; Qi et al., 2020) is based on a BiLSTM architecture and incorporates auxiliary tasks, such as part-of-speech prediction, to boost performance. Yan et al. (2020) seek to replace the learned neural language models in Hill et al. (2016) or WantWords with a pre-trained model such as BERT (Devlin et al., 2019) and its multilingual variants, which allows them to use their system in a cross-lingual setting: querying in one language to obtain an answer in another.…”
Section: Track 2: Reverse Dictionary
confidence: 99%
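The part-of-speech auxiliary signal mentioned in the citation statement above can be illustrated schematically as a re-ranking bonus for candidates whose dictionary POS tag matches the part of speech the query asks for. This is purely an illustration under invented assumptions (the `boost` constant, the candidate scores, and the tag table are all hypothetical); in the actual WantWords model the POS predictor is learned jointly with the ranker rather than applied as a hand-set bonus:

```python
def rerank_with_pos(candidates, target_pos, pos_tags, boost=0.2):
    """Hypothetical re-ranking step: add a fixed bonus to the score of each
    candidate whose dictionary POS tag matches the requested part of speech,
    loosely mirroring the idea of POS as an auxiliary ranking signal."""
    return sorted(
        candidates,
        key=lambda wc: wc[1] + (boost if pos_tags.get(wc[0]) == target_pos else 0.0),
        reverse=True,
    )

# Illustrative (word, semantic-score) candidates and a toy POS table.
cands = [("run", 0.61), ("sprint", 0.58), ("quickly", 0.60)]
tags = {"run": "VERB", "sprint": "VERB", "quickly": "ADV"}
print(rerank_with_pos(cands, "VERB", tags))
# → [('run', 0.61), ('sprint', 0.58), ('quickly', 0.60)]
```

With `target_pos="VERB"`, the adverb "quickly" drops below both verbs despite its higher raw semantic score, which is the behavior an auxiliary POS signal is meant to produce.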