1996
DOI: 10.1006/gmip.1996.0002
A Hierarchical Markov Random Field Model and Multitemperature Annealing for Parallel Image Classification

Abstract: In this paper, we are interested in massively parallel multiscale relaxation algorithms applied to image classification [6, 7, 14, 15]. It is well known that multigrid methods can improve significantly the convergence rate and the quality of the final results of iterative relaxation techniques. There are many approaches in multigrid image segmentation. A well known approach is the…

Cited by 71 publications (57 citation statements)
References 17 publications
“…We construct a pyramidal graphical model shown in Figure 1 (top) by placing the original field at the bottom of the hierarchy and introducing hidden variables at coarser scales. Unlike multigrid methods and the models considered in [2], the measurements are not replicated at coarser scales. We denote the coarsest scale in our pyramidal graph as Scale 1 and the finest scale as Scale M.…”
Section: Pyramidal Graphs
confidence: 99%
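The pyramidal construction in the statement above — original field at the bottom, hidden variables at coarser scales, Scale 1 coarsest and Scale M finest — can be sketched in a few lines. The names `build_pyramid` and `parent`, and the dyadic (side-halving) decimation between scales, are illustrative assumptions rather than details of the cited model:

```python
# Sketch of the pyramidal label hierarchy: Scale 1 is the coarsest
# layer, Scale M the finest (the original field). Only the finest
# scale carries measurements; coarser scales hold hidden variables.

def build_pyramid(finest_size, num_scales):
    """Map each scale to its grid side length (dyadic decimation)."""
    sizes = {}
    size = finest_size
    for scale in range(num_scales, 0, -1):  # M (finest) down to 1
        sizes[scale] = size
        size = max(1, size // 2)
    return sizes

def parent(scale, i, j):
    """Hidden parent of site (i, j): one scale coarser, indices halved."""
    return (scale - 1, i // 2, j // 2)

pyramid = build_pyramid(finest_size=8, num_scales=4)
# pyramid == {4: 8, 3: 4, 2: 2, 1: 1}: an 8x8 field under three
# hidden layers, with a single root site at Scale 1.
```

Because measurements are not replicated at coarser scales, only the Scale-M sites would carry data terms in such a model.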
“…Other approaches, motivated by multigrid methods, use multiple-scale algorithms for computational efficiency but do not have consistent stochastic structures between different scales. These limitations have been recognized by a number of researchers, who consider models that incorporate both intra- and interscale interactions [1], [2]. However, due to the resulting model complexity, they either allow only a limited extension of multiscale trees or use computationally expensive methods such as simulated annealing to get solutions.…”
Section: Introductionmentioning
confidence: 99%
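Simulated annealing, the expensive optimizer this statement refers to, can be summarized in a generic sketch; the function names, the proposal interface, and the geometric cooling schedule here are illustrative assumptions (the paper itself proposes a multitemperature annealing scheme, which is not reproduced here):

```python
import math
import random

def simulated_annealing(energy, propose, x0, t0=1.0, alpha=0.95,
                        steps=1000, seed=0):
    """Minimize `energy` by accepting uphill moves with probability
    exp(-dE / T) under a geometric cooling schedule T <- alpha * T."""
    rng = random.Random(seed)
    x, e, t = x0, energy(x0), t0
    for _ in range(steps):
        y = propose(x, rng)
        de = energy(y) - e
        # Always accept downhill moves; accept uphill ones with
        # probability exp(-de / t), which shrinks as t cools.
        if de <= 0 or rng.random() < math.exp(-de / t):
            x, e = y, e + de
        t *= alpha
    return x

# Toy usage: minimize (x - 3)^2 over the integers.
best = simulated_annealing(
    energy=lambda x: (x - 3) ** 2,
    propose=lambda x, rng: x + rng.choice([-1, 1]),
    x0=0,
)
```

The cost the statement alludes to comes from the many sweeps required while the temperature is still high; on image-sized label fields each sweep touches every site.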
“…To address the problems associated with nonhierarchical models, multiscale MRF models were formulated and have been extensively discussed in the image processing literature (Bouman and Shapiro, 1994; Kato et al., 1996, 1999; Laferté et al., 2000; Liang and Tjahjadi, 2006; Mignotte et al., 2000; Wilson and Li, 2003). In those hierarchical MRF models, there is a series of random fields at a range of scales or resolutions, and the random field at each scale depends only on the next coarser random field above it.…”
Section: Related Work
confidence: 99%
“…In our multiscale MRF model, the value of a site at a given scale depends not only on its parent in the layer above but also on its neighbors at the same scale. In this respect, our model is closely related to the models presented in (Kato et al., 1996, 1999; Mignotte et al., 2000; Wilson and Li, 2003). However, unlike the models described by these authors, we solve the statistical inference problem by means of a sequence of related multi-resolution problems rather than as a single problem representing the entire quadtree.…”
Section: A Multiscale MRF Model
confidence: 99%
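The dependence described in this statement — on same-scale neighbours plus a parent one scale coarser — can be sketched as a Potts-style local energy. The weights `beta` and `gamma`, the 4-neighbourhood, and the data layout are illustrative assumptions, not the cited authors' model:

```python
def local_energy(labels, scale, i, j, beta=1.0, gamma=0.5):
    """Potts-style local energy of site (i, j): beta per disagreeing
    same-scale neighbour, plus gamma if the label disagrees with the
    parent one scale coarser.  `labels` maps scale -> 2-D label grid,
    where scale - 1 is the coarser level.
    """
    grid = labels[scale]
    h, w = len(grid), len(grid[0])
    e = 0.0
    # Intra-scale term: 4-neighbourhood Potts interactions.
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < h and 0 <= nj < w and grid[ni][nj] != grid[i][j]:
            e += beta
    # Inter-scale term: disagreement with the parent label.
    if scale - 1 in labels:
        if labels[scale - 1][i // 2][j // 2] != grid[i][j]:
            e += gamma
    return e

# Toy example: a 2x2 field under a single coarse site.  Site (1, 1)
# disagrees with two neighbours and with its parent: 2*beta + gamma.
labels = {1: [[0]], 2: [[0, 0], [0, 1]]}
e = local_energy(labels, 2, 1, 1)  # 2.5
```

Summing such local terms over all sites and scales gives the kind of joint energy that intra- plus inter-scale models minimize.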
“…These models can be applied in a non-iterative way on simple regular structures of quadtrees [26][27][28][29][30][31], or on more complex, though still regular, trees that try to overcome the blockiness of the classification result related to the nonstationarity of MRFs on quadtrees [26,32].…”
confidence: 99%
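The quadtree structure underlying the non-iterative models cited above can be sketched via its index maps: each site has one parent at the next coarser level and four children at the next finer level, with level 0 the root (the naming here is illustrative):

```python
def quadtree_children(level, i, j):
    """The four children of site (i, j), one level finer."""
    return [(level + 1, 2 * i + di, 2 * j + dj)
            for di in (0, 1) for dj in (0, 1)]

def quadtree_parent(level, i, j):
    """The parent of site (i, j), one level coarser (None at the root)."""
    return None if level == 0 else (level - 1, i // 2, j // 2)
```

Because every site below the root has exactly one parent and no same-level links, inference can proceed in single upward and downward tree sweeps, at the cost of the blocky artefacts the statement mentions.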