Proceedings of the 28th International Conference on Computational Linguistics 2020
DOI: 10.18653/v1/2020.coling-main.106
Learning Efficient Task-Specific Meta-Embeddings with Word Prisms

Abstract: Word embeddings are trained to predict word co-occurrence statistics, which leads them to possess different lexical properties (syntactic, semantic, etc.) depending on the notion of context defined at training time. These properties manifest when querying the embedding space for the most similar vectors, and when used at the input layer of deep neural networks trained to solve downstream NLP problems. Meta-embeddings combine multiple sets of differently trained word embeddings, and have been shown to successful…
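As a rough illustration of what "combining multiple sets of differently trained word embeddings" means in practice, the sketch below concatenates a word's vectors from two hypothetical source embedding sets into a single meta-embedding. This is a generic baseline, not the word-prism method proposed in the paper, and the source names and values are made up.

```python
# Minimal sketch of a generic meta-embedding baseline (concatenation),
# not the word-prism method of the paper. The source embedding sets and
# their values below are hypothetical toy data.
import numpy as np

semantic_source = {"bank": np.array([0.1, 0.3, -0.2])}  # e.g. a window-based source
syntactic_source = {"bank": np.array([0.5, -0.1])}      # e.g. a dependency-based source

def concat_meta_embedding(word, sources):
    """Concatenate the word's vectors from every source embedding set."""
    return np.concatenate([src[word] for src in sources])

meta_vec = concat_meta_embedding("bank", [semantic_source, syntactic_source])
print(meta_vec.shape)  # (5,): the meta-embedding dimensionality is the sum of the sources'
```

A vector built this way could then be fed to the input layer of a downstream model, which is the extrinsic setting the abstract refers to.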

Cited by 5 publications (11 citation statements). References 35 publications.
“…Intrinsic Tasks Extrinsic Tasks (Kiela et al, 2018) SST, SNLI, Image Caption (1) (He et al, 2020) SST2, SNLI, NER (1), POS(1), Semcor (Bollegala and Bao, 2018) Sim. (4), An.…”
Section: Paper (mentioning)
confidence: 99%
“…This requires minimal bilingual supervision while still leveraging large amounts of monolingual corpora with very competitive results (Artetxe et al., 2016, 2018). These techniques are used by Doval et al. (2018); García-Ferrero et al. (2020); Jawanpuria et al. (2020); He et al. (2020) to generate meta-embeddings. This usually involves mapping all the source embeddings to a common vector space followed by averaging.…”
Section: Paper (mentioning)
confidence: 99%
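The "map all source embeddings to a common vector space followed by averaging" recipe quoted above can be sketched as an orthogonal (Procrustes) alignment followed by a mean. The matrices below are random placeholders with row-aligned vocabularies; this is an illustrative assumption, not the cited papers' implementations.

```python
# Sketch: align source space A to source space B with an orthogonal map,
# then average. Random placeholder matrices; not code from the cited works.
import numpy as np

def orthogonal_map(X, Y):
    """Orthogonal matrix W minimising ||X @ W - Y||_F (Procrustes solution via SVD)."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 300))  # source embeddings A (rows = shared vocabulary)
Y = rng.normal(size=(1000, 300))  # source embeddings B, used as the common space

W = orthogonal_map(X, Y)
meta_embeddings = (X @ W + Y) / 2.0  # average the aligned sources
```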
“…Given multiple independently trained word representations (aka embeddings) learnt using diverse algorithms and lexical resources, word meta-embedding (ME) learning methods [Yin and Schütze, 2016; Bao and Bollegala, 2018; Bollegala et al., 2018a; Wu et al., 2020; He et al., 2020; Jawanpuria et al., 2020; Coates and Bollegala, 2018] attempt to learn more accurate and wide-coverage word embeddings. The input and output word embeddings of the ME algorithm are referred to, respectively, as the source and meta-embeddings.…”
Section: Introduction (mentioning)
confidence: 99%
“…Prior work studying word embeddings [Yin and Shen, 2018; Levy et al., 2015] has shown that the performance of a static word embedding is directly influenced by its dimensionality. To handle source embeddings with different dimensionalities, ME learning methods use techniques such as concatenation [Yin and Schütze, 2016], orthogonal projections [Jawanpuria et al., 2020; He et al., 2020], and averaging [Coates and Bollegala, 2018], applying zero-padding to the sources with smaller dimensionalities as necessary.…”
Section: Introduction (mentioning)
confidence: 99%
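The zero-padding-then-averaging step described in the last excerpt can be illustrated with a few lines of toy code. The vectors and dimensionalities below are made up, and this shows only one of the options (averaging) the excerpt lists.

```python
# Sketch of averaging source embeddings of different dimensionalities by
# zero-padding the smaller ones. Toy random vectors; not the cited methods' code.
import numpy as np

def pad_to(vec, dim):
    """Right-pad a vector with zeros up to length `dim`."""
    return np.concatenate([vec, np.zeros(dim - len(vec))])

def average_meta_embedding(vectors):
    d_max = max(len(v) for v in vectors)
    return np.mean([pad_to(v, d_max) for v in vectors], axis=0)

v_a = np.random.randn(300)  # e.g. a 300-dimensional source embedding
v_b = np.random.randn(100)  # e.g. a 100-dimensional source embedding
meta = average_meta_embedding([v_a, v_b])
print(meta.shape)  # (300,)
```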