2020
DOI: 10.1109/tifs.2019.2935871

Target-Specific Siamese Attention Network for Real-Time Object Tracking

Cited by 12 publications (3 citation statements)
References: 57 publications
“…Huang et al. [15] proposed an attentional online update paradigm for Siamese visual tracking that improves tracker performance by exploiting knowledge extracted from prior tracking tasks. In [32], residual attention modules are introduced into similarity tracking at multiple levels of feature representation, improving the discrimination quality of the similarity search. Zhang et al. [40] created an attention retrieval network that uses learned masks to impose soft spatial constraints on features from the tracking backbone network, mitigating the impact of background clutter.…”
Section: Attention Mechanisms in Object Tracking
confidence: 99%
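To make the ideas in the statement above concrete, the following is a minimal sketch of a residual spatial attention block applied to Siamese branch features before a depthwise cross-correlation. It is my own illustration under assumed PyTorch conventions; the module layout, channel sizes, and names are hypothetical and do not reproduce the architectures of [15], [32], or [40].

```python
# Illustrative sketch only: residual spatial attention plus depthwise
# cross-correlation for a Siamese tracker, assuming PyTorch. Shapes and names
# are hypothetical, not taken from the cited works.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualSpatialAttention(nn.Module):
    """Predict a soft spatial mask over a feature map and apply it residually,
    so salient regions are emphasized without discarding original features."""
    def __init__(self, channels):
        super().__init__()
        self.mask_head = nn.Sequential(
            nn.Conv2d(channels, channels // 4, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 4, 1, kernel_size=1),
        )

    def forward(self, feat):
        mask = torch.sigmoid(self.mask_head(feat))  # (B, 1, H, W), values in (0, 1)
        return feat + feat * mask                   # residual form: f + f * mask

def depthwise_xcorr(template_feat, search_feat):
    """Correlate template features with search-region features channel by channel."""
    b, c, h, w = template_feat.shape
    search = search_feat.reshape(1, b * c, *search_feat.shape[2:])
    kernel = template_feat.reshape(b * c, 1, h, w)
    response = F.conv2d(search, kernel, groups=b * c)
    return response.reshape(b, c, *response.shape[2:])

# Example: attend to both branches, then correlate to get a response map.
attn = ResidualSpatialAttention(channels=256)
z = attn(torch.randn(1, 256, 6, 6))    # template (exemplar) features
x = attn(torch.randn(1, 256, 22, 22))  # search-region features
score = depthwise_xcorr(z, x)          # (1, 256, 17, 17) response map
```

The residual form (feature plus masked feature) is what lets such a module suppress background clutter softly rather than hard-gating the backbone output.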
“…The attention mechanism resembles human visual attention: we always focus on the most important part of what we see. It has been widely utilized in many other fields, such as NLP [32,33] and CV [34,35]. Specifically, the attention mechanism makes it easy to model long-range dependencies or to focus on the important parts of the input.…”
Section: Attention Mechanism
confidence: 99%
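As a concrete reference point for the generic mechanism described above, here is a minimal sketch of scaled dot-product attention. This is a textbook formulation written as an assumption-level PyTorch illustration, not code from the cited references [32-35].

```python
# Illustrative sketch of generic scaled dot-product attention, assuming PyTorch.
import math
import torch

def scaled_dot_product_attention(q, k, v):
    """q, k, v: (batch, seq_len, dim). Every query attends to every key, which
    is why attention can capture long-range dependencies in a single step."""
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))  # query-key similarities
    weights = torch.softmax(scores, dim=-1)                   # focus on the most relevant parts
    return weights @ v                                        # weighted sum of the values

out = scaled_dot_product_attention(torch.randn(2, 10, 64),
                                   torch.randn(2, 10, 64),
                                   torch.randn(2, 10, 64))    # (2, 10, 64)
```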
“…Compared to the recent Siamese-based tracker [44] (we use its reported results for a fair comparison), our tracker is superior in terms of distance precision (DP: 91.1% vs. 87.7%) and overlap success rate (OS AUC: 66.6% vs. 66.4%).…”
Section: OTB-2015 Dataset
confidence: 99%