2020
DOI: 10.48550/arxiv.2009.04972
ICASSP 2021 Acoustic Echo Cancellation Challenge: Datasets, Testing Framework, and Results

Cited by 8 publications (22 citation statements) · References 0 publications
“…The RES thus requires 4.6% of an x86 mobile CPU core (Intel i7-8565U) to operate in real-time. When combined with the AEC, the total complexity of the proposed 16 kHz echo control solution as submitted to the AEC challenge [17] is 5.5% CPU (0.55 ms per 10-ms frame). Since the RES is already designed to operate at 48 kHz, the total cost of fullband echo control only increases to 6.6%, with the difference due to the increased AEC sampling rate.…”
Section: Experiments and Results
confidence: 99%
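The complexity figures in the quote above can be sanity-checked directly: a real-time factor is the per-frame processing time divided by the frame duration, so 0.55 ms of work per 10-ms frame is 5.5% of one CPU core. A minimal check of that arithmetic (the frame hop and timings are those reported in the citation):

```python
# Real-time factor = processing time per frame / frame duration.
frame_ms = 10.0   # frame hop reported in the citation
proc_ms = 0.55    # per-frame AEC + RES processing time at 16 kHz

cpu_fraction = proc_ms / frame_ms
print(f"{cpu_fraction:.1%}")  # 5.5% of one CPU core, matching the quote
```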
“…Our model is trained (Section 5) to enhance the speech from the AEC using the far-end signal as side information to help remove the far-end signal while denoising the near-end speech. Results from our experiments and from the Acoustic Echo Cancellation Challenge [17] show that the proposed algorithm outperforms both traditional and other neural approaches to residual echo suppression, taking first place in the challenge (Section 6).…”
Section: Introduction
confidence: 86%
“…We train and test our optimizers using the synthetic portion of the Microsoft AEC Challenge dataset [30]. This dataset includes far-end noise, near-end noise, and far-end nonlinearities.…”
Section: Dataset
confidence: 99%
“…To demonstrate our approach, we learn to optimize an AEC task as shown in Fig. 1 for a single-talk in noise scenario. We use the Microsoft AEC Challenge dataset [30] to learn update rules for a variety of common linear and nonlinear multidelayed block frequency domain filters (MDF) [31]. We compare our results to hand-engineered, grid-search-tuned block NLMS and RMSprop [32] optimizers, as well as the open-source Speex AEC [8, 33].…”
confidence: 99%
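The NLMS baseline mentioned in the quote above is a standard adaptive filter: it predicts the echo from recent far-end samples and updates its coefficients by the error, normalized by the input energy. The sketch below is a plain time-domain NLMS echo canceller for illustration only — not the block/MDF variants or tuned step sizes the citing paper actually compares; the function name and parameters are our own.

```python
import numpy as np

def nlms_aec(far_end, mic, filter_len=64, mu=0.5, eps=1e-8):
    """Minimal time-domain NLMS echo canceller (illustrative sketch)."""
    w = np.zeros(filter_len)    # adaptive estimate of the echo path
    buf = np.zeros(filter_len)  # most recent far-end samples, newest first
    out = np.zeros_like(mic)
    for n in range(len(mic)):
        buf = np.roll(buf, 1)
        buf[0] = far_end[n]
        echo_hat = w @ buf               # predicted echo
        e = mic[n] - echo_hat            # error = near-end estimate
        w += mu * e * buf / (buf @ buf + eps)  # normalized LMS update
        out[n] = e
    return out

# Toy single-talk scenario: mic signal is far-end through an echo path.
rng = np.random.default_rng(0)
far = rng.standard_normal(20000)
h = rng.standard_normal(16) * 0.5        # synthetic 16-tap echo path
mic = np.convolve(far, h)[:20000]
residual = nlms_aec(far, mic, filter_len=32)
```

After convergence, the residual energy is far below the microphone energy; block/frequency-domain variants like MDF reach the same fixed point with much lower per-sample cost, which is why they serve as the hand-tuned baselines in the citation.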