2021
DOI: 10.5715/jnlp.28.235

Compact Word Embeddings Based on Global Similarity

Abstract: We reduce the model size of word embeddings while preserving their quality. Previous studies composed word embeddings from subword embeddings and mimicked pre-trained word embeddings. Although these methods can reduce the vocabulary size, it is difficult to reduce the model size drastically while preserving quality. Inspired by the observation that words with similar meanings have similar embeddings, we propose a multitask learning method that mimics not only the pre-trained word embeddings but also the similarity…
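The abstract describes composing word embeddings from subword embeddings under a multitask objective: mimic the pre-trained vectors and also their similarity structure. The PyTorch sketch below illustrates that idea under stated assumptions; the averaging composer, the batch-level cosine-similarity target, and the weight `alpha` are hypothetical choices for illustration, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SubwordComposer(nn.Module):
    """Composes a word embedding by averaging learned subword embeddings.
    num_subwords, dim, and the averaging scheme are illustrative choices,
    not taken from the paper."""
    def __init__(self, num_subwords: int, dim: int):
        super().__init__()
        self.subword_emb = nn.Embedding(num_subwords, dim)

    def forward(self, subword_ids: torch.Tensor) -> torch.Tensor:
        # subword_ids: (batch, max_subwords); average the subword vectors
        return self.subword_emb(subword_ids).mean(dim=1)

def multitask_loss(composed: torch.Tensor,
                   pretrained: torch.Tensor,
                   alpha: float = 0.5) -> torch.Tensor:
    """Mimic the pre-trained embeddings and their pairwise similarity
    structure; alpha is a hypothetical task weight."""
    # Task 1: mimic each pre-trained vector directly.
    mimic = F.mse_loss(composed, pretrained)
    # Task 2: mimic pairwise cosine similarities within the batch.
    sim_c = F.normalize(composed, dim=1) @ F.normalize(composed, dim=1).T
    sim_p = F.normalize(pretrained, dim=1) @ F.normalize(pretrained, dim=1).T
    similarity = F.mse_loss(sim_c, sim_p)
    return mimic + alpha * similarity

# Toy usage: 4 words, each split into up to 3 subwords, 8-dim embeddings.
model = SubwordComposer(num_subwords=100, dim=8)
subword_ids = torch.randint(0, 100, (4, 3))
pretrained = torch.randn(4, 8)  # stand-in for real pre-trained vectors
loss = multitask_loss(model(subword_ids), pretrained)
loss.backward()
```

In this toy setup the model stores only 100 subword vectors rather than one vector per word, which is where the compactness comes from; the similarity term keeps composed vectors of semantically close words close to each other, mirroring the "global similarity" idea named in the title.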

Cited by 0 publications
References 22 publications