ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp49357.2023.10095526
Extended Expectation Maximization for Under-Fitted Models

Abstract: In this paper, we generalize the well-known Expectation Maximization (EM) algorithm using the α-divergence for the Gaussian Mixture Model (GMM). This approach is used in robust subspace detection when the number of parameters is kept small to avoid overfitting and large estimation variances. The level of robustness can be tuned by the parameter α. When α → 1, our method is equivalent to the standard EM approach, and for α < 1 the method is robust against potential outliers. Simulation results show that the method o…
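The abstract presents the proposed method as a generalization of standard EM, which is recovered in the limit α → 1. Below is a minimal sketch of that α → 1 baseline, i.e., standard EM for a GMM; the α-divergence generalization itself is not reproduced here, and the function name `em_gmm`, the iteration count, and the covariance regularization constant are illustrative assumptions.

```python
# Minimal sketch of standard EM for a Gaussian Mixture Model (the alpha -> 1
# limit referred to in the abstract). The paper's alpha-divergence updates are
# not reproduced; names and constants here are illustrative only.
import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, K, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Initialize mixing weights, means, and covariances.
    pi = np.full(K, 1.0 / K)
    mu = X[rng.choice(n, K, replace=False)]
    Sigma = np.stack([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(K)])

    for _ in range(n_iter):
        # E-step: responsibilities gamma[i, k] = p(z_i = k | x_i).
        dens = np.stack(
            [pi[k] * multivariate_normal.pdf(X, mu[k], Sigma[k]) for k in range(K)],
            axis=1,
        )
        gamma = dens / dens.sum(axis=1, keepdims=True)

        # M-step: re-estimate weights, means, and covariances.
        Nk = gamma.sum(axis=0)
        pi = Nk / n
        mu = (gamma.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - mu[k]
            Sigma[k] = (gamma[:, k, None] * diff).T @ diff / Nk[k] + 1e-6 * np.eye(d)
    return pi, mu, Sigma
```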

Cited by 3 publications (1 citation statement) · References: 24 publications
“…The number of radial neurons then needs to be established. This task can be performed, e.g., through the use of clustering algorithms, such as Growing Neural Gas (GNG) [57] or Expectation Maximization (EM) [58] algorithms. The output of each of these neurons is dependent on two parameters: the Euclidean distance between the input variable and its center vector, and the width of the function.…”
Section: Principles of Operation of Radial Basis Function Neural Network
Mentioning (confidence: 99%)
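The quoted passage states that each radial neuron's output depends on the Euclidean distance between the input and the neuron's center vector and on the function's width. A minimal sketch of such a neuron is given below, assuming the common Gaussian form; the Gaussian choice and the names `rbf_neuron`, `center`, and `width` are illustrative assumptions, not taken from the cited work.

```python
# Sketch of a radial basis neuron whose activation depends on the Euclidean
# distance from the input to the center vector and on the width parameter.
# The Gaussian form is an assumption for illustration.
import numpy as np

def rbf_neuron(x, center, width):
    """Gaussian radial basis function activation."""
    dist_sq = np.sum((np.asarray(x) - np.asarray(center)) ** 2)
    return np.exp(-dist_sq / (2.0 * width ** 2))
```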