2012
DOI: 10.1007/s11434-012-5485-4

An improved EM algorithm for remote sensing classification

Abstract: The use of a general EM (expectation-maximization) algorithm in multi-spectral image classification is known to cause two problems: singularity of the variance-covariance matrix and sensitivity to randomly selected initial values. The former causes computation failure; the latter produces unstable classification results. This paper proposes a modified approach to resolve these defects. First, a modification is proposed to determine reliable parameters for the EM algorithm based on a k-means algorithm with init…
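As a rough illustration of the two fixes summarized in the abstract, the sketch below derives the EM starting parameters from a k-means partition (instead of random draws) and ridge-regularises each class covariance so it cannot become singular. The function name, the regularisation constant, and the use of scikit-learn's KMeans are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

def kmeans_initial_params(X, n_classes, reg=1e-6, seed=0):
    """Derive EM starting parameters from a k-means partition and
    ridge-regularise each class covariance so it stays invertible.
    (Illustrative sketch; `reg` is not a value from the paper.)"""
    km = KMeans(n_clusters=n_classes, n_init=10, random_state=seed).fit(X)
    d = X.shape[1]
    weights, means, covs = [], [], []
    for k in range(n_classes):
        Xk = X[km.labels_ == k]
        weights.append(len(Xk) / len(X))
        means.append(Xk.mean(axis=0))
        # Adding reg * I keeps the covariance non-singular even for
        # small or nearly collinear clusters.
        covs.append(np.cov(Xk, rowvar=False) + reg * np.eye(d))
    return np.array(weights), np.array(means), np.array(covs)
```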

Cited by 10 publications (5 citation statements) · References 30 publications
“…β is initialized with a value of approximately 0. The parameter β can roughly be interpreted as the inverse of temperature [13]. At each β value, the algorithm iterates the E step and M step until convergence.…”
Section: Methods
confidence: 99%
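The tempered E step described in this statement can be sketched as a deterministic-annealing EM loop for a Gaussian mixture: at each inverse temperature β the component posteriors are raised to the power β before normalising, then the usual M step runs. The β schedule, the regularisation constant, and the random initialisation below are illustrative assumptions, not the cited paper's exact settings.

```python
import numpy as np
from scipy.stats import multivariate_normal

def daem_gmm(X, n_classes, betas=(0.1, 0.25, 0.5, 0.75, 1.0),
             n_iter=30, reg=1e-6, seed=0):
    """Deterministic-annealing EM for a Gaussian mixture (sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Crude initialisation: random data points as means, shared covariance.
    mu = X[rng.choice(n, n_classes, replace=False)].copy()
    cov = np.stack([np.cov(X.T) + reg * np.eye(d)] * n_classes)
    pi = np.full(n_classes, 1.0 / n_classes)

    for beta in betas:                      # anneal from "hot" to beta = 1
        for _ in range(n_iter):
            # E step: tempered responsibilities r_ik ∝ (pi_k N(x_i))**beta.
            logp = np.stack([np.log(pi[k]) +
                             multivariate_normal.logpdf(X, mu[k], cov[k])
                             for k in range(n_classes)], axis=1)
            logp *= beta
            logp -= logp.max(axis=1, keepdims=True)
            resp = np.exp(logp)
            resp /= resp.sum(axis=1, keepdims=True)

            # M step: weighted parameter updates.
            nk = resp.sum(axis=0) + 1e-12
            pi = nk / n
            mu = (resp.T @ X) / nk[:, None]
            for k in range(n_classes):
                diff = X - mu[k]
                cov[k] = (resp[:, k, None] * diff).T @ diff / nk[k] + reg * np.eye(d)
    return pi, mu, cov, resp
```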
“…To address one of the EM algorithm's disadvantages, the author improved the EM algorithm by dividing the point cloud vertically into smaller point clouds, using the pPCA (Probabilistic Principal Component Analysis) model to estimate the GMM parameters (the model's dimensionality is reduced via PCA), applying EM to classify each point cloud part, and assessing the accuracy. To improve the algorithm's convergence time, the scheduling parameter β is used, where β is initialized with a very small value (approximately 0) [13].…”
Section: Introduction
confidence: 99%
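A minimal sketch of the workflow described above, substituting ordinary PCA followed by a full-covariance GMM for the paper's pPCA mixture; the slab count, reduced dimensionality, class count, and function name are hypothetical choices.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

def classify_slabs(points, n_slabs=4, n_dims=2, n_classes=3):
    """Split a point cloud into vertical slabs by height, reduce each slab
    with PCA, and label its points with an EM-fitted Gaussian mixture."""
    z = points[:, -1]                       # last column taken as height
    edges = np.quantile(z, np.linspace(0, 1, n_slabs + 1))
    labels = np.full(len(points), -1, dtype=int)

    for i in range(n_slabs):
        upper = z <= edges[i + 1] if i == n_slabs - 1 else z < edges[i + 1]
        mask = (z >= edges[i]) & upper
        if mask.sum() < max(n_classes, n_dims) + 1:
            continue                        # too few points for a stable fit
        reduced = PCA(n_components=n_dims).fit_transform(points[mask])
        gmm = GaussianMixture(n_components=n_classes, covariance_type="full",
                              reg_covar=1e-6, random_state=0).fit(reduced)
        labels[mask] = gmm.predict(reduced)
    return labels
```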
“…In image classification, this technique achieves an accuracy of 84%. Similarly, in [16], the authors discussed an introduction to classification and regression trees (CART). They explained how to improve the classification accuracy of AVIRIS and Landsat digital images.…”
Section: Related Work
confidence: 99%
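For reference, a CART classifier on synthetic "multispectral pixel" features might look like the sketch below, with scikit-learn's DecisionTreeClassifier standing in for CART; the band count, labels, and tree depth are made up, and the 84% figure from the cited work is not reproduced here.

```python
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for labelled multispectral pixels: each row holds one
# pixel's band values, y its land-cover class.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 6))                 # 6 "spectral bands"
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # toy two-class labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
cart = DecisionTreeClassifier(criterion="gini", max_depth=8, random_state=0)
cart.fit(X_tr, y_tr)
print("CART accuracy:", accuracy_score(y_te, cart.predict(X_te)))
```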
“…Kearns and Valiant were the first to consider transforming a PAC-model weak learning algorithm, which performs only slightly better than random guessing, into a strong and accurate learning algorithm, but these early algorithms had practical difficulties [23]. Freund and Schapire proposed the AdaBoost algorithm in 1995 to address the practical shortcomings of many early boosting algorithms [16]. The AdaBoost algorithm is a member of the boosting family of algorithms.…”
Section: Boosting Algorithm
confidence: 99%
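A minimal AdaBoost sketch with decision stumps as the weak learners, in the spirit of the algorithm discussed above; the dataset and hyperparameters are illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# AdaBoost with decision stumps (depth-1 trees) as the weak learners.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
stump = DecisionTreeClassifier(max_depth=1)
boosted = AdaBoostClassifier(estimator=stump,   # `base_estimator=` on older scikit-learn
                             n_estimators=100, random_state=0)
print("5-fold CV accuracy:", cross_val_score(boosted, X, y, cv=5).mean())
```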
“…Yang HongLei et al. [12] focused on remote sensing image classification using the EM (Expectation-Maximization) algorithm. The EM algorithm has been used for many tasks in remote sensing image classification [5][6][7][8][9][10][11].…”
Section: Related Work
confidence: 99%
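A sketch of EM-based unsupervised classification of a multispectral image, assuming a Gaussian mixture fitted to the pixels' band vectors; the synthetic image, class count, and function name are illustrative and not drawn from the cited works.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def em_classify_image(image, n_classes=5):
    """Fit a Gaussian mixture to the pixels' band vectors with EM, then
    assign each pixel to the component of highest posterior probability."""
    h, w, bands = image.shape
    pixels = image.reshape(-1, bands).astype(float)
    gmm = GaussianMixture(n_components=n_classes, covariance_type="full",
                          init_params="kmeans", reg_covar=1e-6,
                          random_state=0).fit(pixels)
    return gmm.predict(pixels).reshape(h, w)

# Toy usage on a random 3-band "image"; real data would come from a sensor.
class_map = em_classify_image(
    np.random.default_rng(0).normal(size=(64, 64, 3)), n_classes=4)
```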