Proceedings of the British Machine Vision Conference 1990
DOI: 10.5244/c.4.57

Symbolic image matching by simulated annealing

Abstract: In this paper we suggest an optimization approach to visual matching. We assume that the information available in an image may be conveniently represented symbolically in a relational graph. We concentrate on the problem of matching two such graphs. First, we derive a cost function associated with graph matching, and more precisely with relational subgraph isomorphism and with maximum relational subgraph matching. This cost function is well suited for optimization methods such as simulated annealing. …
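The abstract describes casting graph matching as minimisation of a relational cost function by simulated annealing. The sketch below is a minimal, hypothetical illustration of that idea rather than the authors' formulation: the cost simply counts relations of the first graph that are not preserved under the current assignment, and the move set, initial temperature `t0`, cooling factor `alpha`, and step count are arbitrary assumed parameters.

```python
import math
import random

def match_cost(mapping, g1_edges, g2_edges):
    """Count relational inconsistencies: edges of graph 1 whose images are
    not edges of graph 2 (a simplified, assumed cost, not the paper's)."""
    return sum(1 for (a, b) in g1_edges
               if (mapping[a], mapping[b]) not in g2_edges)

def anneal_match(g1_nodes, g2_nodes, g1_edges, g2_edges,
                 t0=5.0, alpha=0.95, steps=2000):
    """Simulated annealing over injective node assignments (|V1| <= |V2|)."""
    current = dict(zip(g1_nodes, random.sample(g2_nodes, len(g1_nodes))))
    cost = match_cost(current, g1_edges, g2_edges)
    best, best_cost, t = dict(current), cost, t0
    for _ in range(steps):
        # Propose a move: reassign one node to an unused target,
        # or swap two assignments if every target is already used.
        free = [v for v in g2_nodes if v not in current.values()]
        candidate = dict(current)
        if free:
            candidate[random.choice(g1_nodes)] = random.choice(free)
        else:
            a, b = random.sample(g1_nodes, 2)
            candidate[a], candidate[b] = candidate[b], candidate[a]
        c = match_cost(candidate, g1_edges, g2_edges)
        # Metropolis acceptance: always accept improvements,
        # accept uphill moves with probability exp(-delta / t).
        if c <= cost or random.random() < math.exp((cost - c) / t):
            current, cost = candidate, c
            if cost < best_cost:
                best, best_cost = dict(current), cost
        t *= alpha  # geometric cooling
    return best, best_cost
```

For example, `anneal_match(['a', 'b', 'c'], [0, 1, 2, 3], {('a', 'b'), ('b', 'c')}, {(0, 1), (1, 2), (2, 3)})` would search for an assignment of the three symbolic nodes onto the four targets that preserves both relations.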

Cited by 35 publications (17 citation statements)
References 15 publications
“…To this end, various types of heuristic functions have been developed to prune the A* search space [Tsai and Fu 1979; Shapiro and Haralick 1981; Bunke and Allermann 1983; Sanfeliu and Fu 1983; Wong et al. 1990]. Other methods such as simulated annealing [Herault et al. 1990], neural networks [Feng et al. 1994], probabilistic relaxation [Christmas et al. 1995], genetic algorithms [Wang et al. 1997], and graph decomposition [Messmer and Bunke 1998] can also be used to reduce the computational cost. Observe that all of these optimization methods are developed for unconstrained matching, where the matched subgraphs can assume any topology.…”
Section: Graph Matching (mentioning, confidence: 99%)
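As a concrete illustration of the tree-search family referred to in this quotation, the following sketch performs exhaustive assignment with branch-and-bound pruning: a partial match is abandoned as soon as its accumulated cost already equals or exceeds the best complete match found so far. The A*-based methods cited differ in adding an informed heuristic estimate of the remaining cost; the zero-future-cost bound used here is an assumption made to keep the example short.

```python
def subgraph_match(g1_nodes, g2_nodes, g1_edges, g2_edges):
    """Branch-and-bound search for the assignment of g1 nodes to distinct
    g2 nodes with the fewest violated relations."""
    best = {"cost": float("inf"), "map": None}

    def partial_cost(mapping):
        # Violations among already-assigned node pairs only.
        return sum(1 for (a, b) in g1_edges
                   if a in mapping and b in mapping
                   and (mapping[a], mapping[b]) not in g2_edges)

    def expand(i, mapping, used):
        cost_so_far = partial_cost(mapping)
        if cost_so_far >= best["cost"]:
            return  # prune: this branch cannot beat the best complete match
        if i == len(g1_nodes):
            best["cost"], best["map"] = cost_so_far, dict(mapping)
            return
        node = g1_nodes[i]
        for target in g2_nodes:
            if target in used:
                continue
            mapping[node] = target
            expand(i + 1, mapping, used | {target})
            del mapping[node]

    expand(0, {}, frozenset())
    return best["map"], best["cost"]
```

On small symbolic graphs this is exact; the cited heuristics matter when the size of the assignment tree makes uninformed search intractable.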
“…A summary of the matching statistics is given in the first row of Table 1. Initial probabilities have been computed using equation (22). The effectiveness of matching is not critically sensitive to either the mean orientation parameter or its variance.…”
Section: Matching Experiments (mentioning, confidence: 99%)
“…Algorithms falling into this category can be divided into those that realise the matching process via discrete or configurational optimisation and those that adopt a continuous evidence combining framework. The former category includes the stochastic relaxation method of Herault et al. (22) and the iterative discrete relaxation method of Wilson and Hancock (16,17). The evidence combining approach is exemplified by probabilistic relaxation.…”
Section: Introduction (mentioning, confidence: 99%)
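The "continuous evidence combining" route mentioned above can be illustrated with a standard relaxation-labelling update, in which each node's label probabilities are rescaled by the support received from its neighbours and then renormalised. The data layout and the uniform neighbour weighting below are assumptions for the sketch; they follow the generic Rosenfeld-Hummel-Zucker form rather than the specific probabilistic relaxation scheme of any paper cited here.

```python
def relaxation_step(p, neighbours, compat):
    """One probabilistic-relaxation update.

    p[i][l]       -- current probability that node i takes label l
    neighbours[i] -- nodes adjacent to i
    compat[(l, m)] -- compatibility of label l at i with label m at a
                      neighbour (assumed symmetric, in [0, 1])
    """
    new_p = {}
    for i, probs in p.items():
        support = {}
        for l in probs:
            # Evidence for label l at node i, averaged over neighbours.
            s = 0.0
            for j in neighbours[i]:
                s += sum(compat[(l, m)] * p[j][m] for m in p[j])
            support[l] = probs[l] * (s / max(len(neighbours[i]), 1))
        z = sum(support.values()) or 1.0  # renormalise
        new_p[i] = {l: v / z for l, v in support.items()}
    return new_p
```

Iterating `relaxation_step` until the probabilities stop changing, then taking each node's most probable label, turns the soft evidence into a discrete match.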
“…In [27], the advantages and disadvantages of continuous optimization methods, such as neural networks, are examined in comparison with optimal backtracking methods. Other continuous optimization approaches include the application of simulated annealing [28], genetic algorithms [29], [30], [31], and probabilistic relaxation [32].…”
Section: Introduction (mentioning, confidence: 99%)
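Finally, the genetic-algorithm route listed in this quotation can be sketched in the same toy setting: individuals are injective node assignments, fitness counts preserved relations, and variation is by swap mutation. The population size, generation count, mutation rate, and the absence of a crossover operator are all arbitrary simplifications for illustration, not choices taken from [29], [30], or [31].

```python
import random

def ga_match(g1_nodes, g2_nodes, g1_edges, g2_edges,
             pop_size=40, generations=200, mutation_rate=0.2):
    """Toy evolutionary search over node assignments."""
    def fitness(mapping):
        # Number of relations of graph 1 preserved in graph 2.
        return sum(1 for (a, b) in g1_edges
                   if (mapping[a], mapping[b]) in g2_edges)

    def random_individual():
        return dict(zip(g1_nodes, random.sample(g2_nodes, len(g1_nodes))))

    def mutate(mapping):
        # Swap-based mutation keeps the mapping injective.
        child = dict(mapping)
        if random.random() < mutation_rate and len(g1_nodes) >= 2:
            a, b = random.sample(g1_nodes, 2)
            child[a], child[b] = child[b], child[a]
        return child

    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:pop_size // 2]
        # Reproduce by copying a parent and mutating (no crossover here,
        # so assignments stay injective without a repair step).
        population = parents + [mutate(random.choice(parents))
                                for _ in range(pop_size - len(parents))]
    return max(population, key=fitness)
```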