2019
DOI: 10.3934/ipi.2019030
A dual EM algorithm for TV regularized Gaussian mixture model in image segmentation

Abstract: A dual expectation-maximization (EM) algorithm for the total variation (TV) regularized Gaussian mixture model (GMM) is proposed in this paper. The algorithm is built upon the EM algorithm with TV regularization (EM-TV) model, which combines statistical and variational methods for image segmentation. Inspired by the projection algorithm proposed by Chambolle, we give a dual algorithm for the EM-TV model. The related dual problem is smooth and can be easily solved by a projection gradient method, which …
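The dual step the abstract describes follows Chambolle's projection idea: the nonsmooth TV term in the primal problem is traded for a smooth dual problem with a pointwise unit-ball constraint, which a projected gradient method handles directly. Below is a rough, minimal sketch of that style of iteration for plain TV denoising, not the paper's full EM-TV segmentation model; the parameters lam, tau, and n_iter are illustrative assumptions.

```python
import numpy as np

def grad(u):
    """Forward-difference gradient with Neumann boundary; returns shape (2, H, W)."""
    gx = np.zeros_like(u)
    gy = np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return np.stack([gx, gy])

def div(p):
    """Discrete divergence, the negative adjoint of grad."""
    px, py = p
    dx = np.zeros_like(px)
    dy = np.zeros_like(py)
    dx[0, :] = px[0, :]
    dx[1:-1, :] = px[1:-1, :] - px[:-2, :]
    dx[-1, :] = -px[-2, :]
    dy[:, 0] = py[:, 0]
    dy[:, 1:-1] = py[:, 1:-1] - py[:, :-2]
    dy[:, -1] = -py[:, -2]
    return dx + dy

def tv_dual_projection(f, lam=0.1, tau=0.125, n_iter=200):
    """Projected gradient iteration on the dual of min_u ||u - f||^2 / (2*lam) + TV(u).

    The dual variable p is constrained to the pointwise unit ball |p| <= 1;
    each step moves along the smooth dual gradient and projects back.
    """
    p = np.zeros((2, *f.shape))
    for _ in range(n_iter):
        q = p + tau * grad(div(p) - f / lam)
        norm = np.maximum(1.0, np.sqrt(q[0] ** 2 + q[1] ** 2))
        p = q / norm  # projection onto the constraint set {|p| <= 1}
    return f - lam * div(p)  # primal solution recovered from the dual variable
```

In the EM-TV setting, a projection step of this kind would presumably be applied inside the maximization step, regularizing the pixel membership functions rather than the image itself.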

Cited by 8 publications (2 citation statements). References 36 publications (62 reference statements).
“…The previous models are tested on gray and color images. We make comparisons with the weighted bounded Hessian variational (WBH) model [38], the iterative convolution-thresholding method (ICTM) [33], the dual expectation-maximization TV (EMTV) algorithm [37], the continuous max-flow (CMF) method [41], ρ1(s) = s (TV), ρ2(s) = s^p with 0 < p < 1 (TVp) [20], ρ3(s) = ln(θs + 1) (LN), and ρ4(s) = θs/(θs + 1) (FRAC), together with the corresponding truncated versions TR-TV, TR-TVp, TR-LN, and TR-FRAC. We first give the following remarks regarding the experiments:…”
Section: 3 (mentioning, confidence: 99%)
“…The Gaussian mixture model (GMM) is a powerful framework for representing a dataset and estimating its probability density function (PDF). Examples of its use include image processing [33], emotion recognition [34], and load forecasting [35]. A GMM is a mixture of components, and when used for clustering a dataset …”
Section: Feature Extraction by Gaussian Mixture Data Clustering (mentioning, confidence: 99%)
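As a self-contained illustration of the clustering use described in this excerpt (not code from any of the cited works; the toy data and hyperparameters are invented for the example), a GMM can be fitted by EM and used for soft clustering and density estimation with scikit-learn:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Toy dataset: two well-separated 2-D Gaussian blobs (invented for illustration).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, size=(200, 2)),
               rng.normal(5.0, 1.0, size=(200, 2))])

# Fit the mixture weights, means, and covariances by the EM algorithm.
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(X)

labels = gmm.predict(X)             # hard cluster assignment per sample
resp = gmm.predict_proba(X)         # soft (posterior) component memberships
log_density = gmm.score_samples(X)  # log of the estimated mixture PDF
```

Each component contributes one weighted Gaussian to the density; predict_proba returns the per-component posterior responsibilities that EM itself computes in its E-step.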