2020
DOI: 10.1609/aaai.v34i05.6350
Embedding Compression with Isotropic Iterative Quantization

Abstract: Continuous representation of words is a standard component in deep learning-based NLP models. However, representing a large vocabulary requires significant memory, which can cause problems, particularly on resource-constrained platforms. Therefore, in this paper we propose an isotropic iterative quantization (IIQ) approach for compressing embedding vectors into binary ones, leveraging the iterative quantization technique well established for image retrieval, while satisfying the desired isotropic property of pointwise mutual information (PMI)-based approaches.
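The compression described in the abstract builds on iterative quantization (ITQ): alternately binarize rotated embeddings and update the rotation by solving an orthogonal Procrustes problem. The sketch below shows plain ITQ only; it does not include the paper's isotropy constraint, and the function name and parameters are illustrative, not from the paper.

```python
import numpy as np

def itq_binarize(X, n_iter=50, seed=0):
    """Sketch of iterative quantization (ITQ) for embedding compression.

    X : (n, d) zero-centered embedding matrix.
    Alternates two steps:
      1. B = sign(X R)                      (binarize rotated embeddings)
      2. R = argmin_R ||B - X R||_F          (orthogonal Procrustes update)
    Returns the binary codes and the learned rotation.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Random orthogonal initialization of the rotation.
    R, _ = np.linalg.qr(rng.standard_normal((d, d)))
    for _ in range(n_iter):
        V = X @ R
        B = np.sign(V)
        B[B == 0] = 1  # avoid zero codes
        # Procrustes solution: for X^T B = U S V^T, the optimal R is U V^T.
        U, _, Vt = np.linalg.svd(X.T @ B)
        R = U @ Vt
    B = np.sign(X @ R)
    B[B == 0] = 1
    return B, R
```

Each word is then stored as d bits instead of d floats, a 32x memory reduction for float32 embeddings, with similarity computed via Hamming distance.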

Cited by 3 publications (3 citation statements). References 17 publications.
“…One could examine the approach by Kim et al (2020), where they learn NLP task-specific features and compress embeddings accordingly. A different approach by Liao et al (2020) used quantization and dimensionality reduction to optimize the encoding of word embeddings, while preserving relevant information.…”
Section: Discussion (mentioning) — confidence: 99%
“…Word embeddings are an integral part of deep learning PoS taggers, but they are large. Several works have explored embedding compression (Chen et al, 2016; Tissier et al, 2019; Kim et al, 2020; Melas-Kyriazi et al, 2020; Liao et al, 2020).…”
Section: Related Work (mentioning) — confidence: 99%
“…An isotropic iterative quantization (IIQ) method is suggested for compacting embedding feature vectors into binary ones to satisfy the required isotropic property of pointwise mutual information (PMI)-based approaches. This approach uses the iterative quantization technique, which is well-established for image retrieval (Liao et al 2020). A method for obtaining vector representations of noun phrases is suggested.…”
Section: Importance of Word Embedding (mentioning) — confidence: 99%