2021
DOI: 10.48550/arxiv.2105.00309
Preprint
PREDICT: Persian Reverse Dictionary

Cited by 5 publications (5 citation statements)
References 0 publications
“…(Yan et al., 2020) endeavored to integrate pre-trained models like BERT for cross-lingual capabilities. The most recent advancements, such as the Persian reverse dictionary by (Malekzadeh et al., 2021), maintain the momentum of NLP innovations in this realm. This evolution culminates in the CODWOE shared task's interest, which emphasizes the reconstruction of word embeddings from their definitions, a premise intimately linked to prior works.…”
Section: Reverse Dictionary
confidence: 99%
“…In (Hedderich et al., 2019), they used attention mechanisms to integrate multi-sense embeddings using LSTM and contextual word embeddings (Bidirectional Encoder Representations from Transformers, BERT) to enhance performance in the reverse dictionary task. As for (Malekzadeh et al., 2021), they utilised different models to simulate the functionality of a reverse dictionary. These included a Bag of Words (BOW) model, an RNN model with additive attention, and a BiLSTM model.…”
Section: Related Work
confidence: 99%
“…Recent REVDICT approaches utilize deep learning (DL) to map arbitrary-length definition phrases to the vector representation of the target word (Hill et al., 2016; Malekzadeh et al., 2021; Qi et al., 2020; Yan et al., 2020). The success of DL approaches indicates that REVDICT can be solved implicitly, i.e.…”
Section: Related Work
confidence: 99%