1996
DOI: 10.1117/12.242043

Unified approach to feature extraction for model-based ATR


Cited by 10 publications (4 citation statements)
References 0 publications
“…Automated real-time identification of identical features in two separate images is the subject of much investigation [6], [7], and although much progress has been made, the general capability allowing operation in the general terrain environments encountered in military scenarios does not exist. At the current state of the art, a human is still required to look at two photographs and select common points.…”
Section: Color Difference Image Control Point Entry
confidence: 99%
“…The |X(i,j) − X̂_n(i,j)| term in equation (2) can rapidly determine the spatial locations of poor template fits. The hypotheses which provide X̂_n′(i,j) over the local area are now searched.…”
Section: MBV for Template Perturbation
confidence: 99%
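The excerpt above locates poor template fits from the pixelwise residual |X(i,j) − X̂_n(i,j)|. Below is a minimal sketch of that idea only; the names X, X_hat_n, and the threshold value are illustrative assumptions, not taken from the cited paper or its equation (2).

```python
import numpy as np

def poor_fit_locations(X, X_hat_n, threshold):
    """Boolean mask of pixels where the hypothesized template fits poorly.

    X        : 2-D observed image.
    X_hat_n  : 2-D image predicted under hypothesis n (same shape as X).
    threshold: residual magnitude above which a pixel counts as a poor fit.
    """
    # Pixelwise residual |X(i,j) - X_hat_n(i,j)|
    residual = np.abs(X.astype(float) - X_hat_n.astype(float))
    return residual > threshold

# Toy usage: a flat image versus a template that disagrees in one corner.
X = np.zeros((8, 8))
X_hat = np.zeros((8, 8))
X_hat[:3, :3] = 10.0                      # hypothesized bright patch absent from X
mask = poor_fit_locations(X, X_hat, 5.0)
print(np.argwhere(mask))                  # spatial locations (i, j) of poor fits
```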
“…Some techniques segment images to obtain targets, which are then modeled [1]-[3]. Others implicitly recognize targets by employing feature extraction [4], [5]. Furthermore, neural networks, statistical methods, etc., or a combination thereof have also been used [6]-[9].…”
Section: Introduction
confidence: 99%