Interspeech 2021
DOI: 10.21437/interspeech.2021-1204
Coded Speech Enhancement Using Neural Network-Based Vector-Quantized Residual Features

Cited by 4 publications (1 citation statement)
References 13 publications

“…In this work, we combine the merit from both FP- and BP-QAT and propose General Quantizer (GQ) that navigates weights to quantization centroids without introducing augmented regularizers but via feedforward-only operators. Our work is inspired by a continuous relaxation of quantization [25] also used for speech representation learning [26,27,28,29,30,31,32], and µ-Law algorithm for 8-bit pulse-code modulation (PCM) digital telecommunication [33].…”
Section: Related QAT Approaches (mentioning)
Confidence: 99%
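The µ-Law algorithm mentioned in the citation statement ([33]) is the standard companding nonlinearity behind 8-bit PCM telephony (ITU-T G.711). As background for that reference, here is a minimal NumPy sketch of the µ-law compress/expand pair together with an illustrative uniform 256-level quantization step; the function names and the rounding scheme are assumptions for illustration, not the citing paper's GQ operator.

```python
import numpy as np

MU = 255.0  # mu for standard 8-bit mu-law PCM (ITU-T G.711)

def mu_law_compress(x, mu=MU):
    """Compand a signal in [-1, 1] with the mu-law curve."""
    return np.sign(x) * np.log1p(mu * np.abs(x)) / np.log1p(mu)

def mu_law_expand(y, mu=MU):
    """Invert the companding back to the linear amplitude domain."""
    return np.sign(y) * ((1.0 + mu) ** np.abs(y) - 1.0) / mu

# Illustrative 8-bit quantization: round the companded signal
# to 256 uniform levels, then expand back (assumption, not G.711's
# exact segmented encoding).
x = np.linspace(-1.0, 1.0, 5)
levels = np.round((mu_law_compress(x) + 1.0) / 2.0 * 255.0)
x_hat = mu_law_expand(levels / 255.0 * 2.0 - 1.0)
```

Because the companding curve is steep near zero, quantization levels are denser for small amplitudes, which is the property that makes µ-law a natural reference point for non-uniform weight quantization schemes.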