ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp40776.2020.9053671

Data Augmentation Using Empirical Mode Decomposition on Neural Networks to Classify Impact Noise in Vehicle

Cited by 9 publications (3 citation statements)
References 12 publications
“[Fig. 2 of the citing survey: time-series data augmentation methods, including Guided Warping [48], SFM [56], Hybrid GAN [97], EMD [103], Flipping [41], Random Warping [21], Time Stretching [54], STFT [52], FT Perturbation [61], Linear Model [82], Markov Chain [83], Autoencoder [88], and Time Aligned Averaging.]”
Section: Interpolation (mentioning, confidence: 99%)
“…These features can be used independently, recombined, or perturbed to generate new data for augmentation. Empirical Mode Decomposition (EMD) [102] is a method for decomposing nonlinear and non-stationary signals, and it has been shown to improve classification when used as a decomposition step for data augmentation of noisy automobile sensor data in a CNN-LSTM [103]. Another example of a decomposition method used for data augmentation was proposed in [27], where Independent Component Analysis (ICA) [104] was combined with a dynamical-functional artificial neural network (D-FANN) to fill gaps in time series.…”
Section: Time Series Decomposition (mentioning, confidence: 99%)
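
To make the EMD-based augmentation idea in the statement above concrete, the following is a minimal Python sketch. It assumes the PyEMD package (installed as EMD-signal) and NumPy; the emd_augment helper, the per-IMF random gains, and all parameter values are illustrative assumptions, not the exact procedure of the cited paper [103].

import numpy as np
from PyEMD import EMD   # assumed dependency: pip install EMD-signal

def emd_augment(signal, n_new=5, gain_range=(0.8, 1.2), seed=None):
    # Generate augmented copies of a 1-D signal by decomposing it into
    # intrinsic mode functions (IMFs) and re-summing them with small
    # random per-IMF gains. Illustrative only, not the paper's recipe.
    rng = np.random.default_rng(seed)
    imfs = EMD()(signal)                 # array of shape (n_imfs, len(signal))
    new_signals = []
    for _ in range(n_new):
        gains = rng.uniform(*gain_range, size=(imfs.shape[0], 1))
        new_signals.append((gains * imfs).sum(axis=0))
    return np.stack(new_signals)

# Toy usage: a noisy sinusoid standing in for an impact-noise recording.
t = np.linspace(0.0, 1.0, 2000)
x = np.sin(2 * np.pi * 30 * t) + 0.2 * np.random.randn(t.size)
print(emd_augment(x, n_new=3).shape)     # expected: (3, 2000)

In such a setup the augmented signals would be fed to the classifier (e.g., a CNN-LSTM) alongside the originals; in practice one would also tune the gain range and decide how to treat the residue term.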
“…In summary, obtaining high-quality audio data to satisfy data-driven supervised or semi-supervised learning is particularly challenging, as it requires careful control of multiple factors to ensure data quality and prevent model overfitting. Therefore, data augmentation plays a crucial role in optimizing the dataset and reducing model overfitting [20].…”
Section: Introduction (mentioning, confidence: 99%)