2020
DOI: 10.1109/tcyb.2018.2879859
Embedding Attention and Residual Network for Accurate Salient Object Detection

Cited by 60 publications (11 citation statements) · References 59 publications
“…Top-down Guidance. The deep layer contains high-level semantic information, which can be used as guidance to help shallow layers filter out noisy distractions [6]. Such top-down guidance is also widely applied in existing methods.…”
Section: RGB Salient Object Detection
confidence: 99%
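The top-down guidance idea in this excerpt can be made concrete with a small sketch. The module below is a hypothetical illustration, not the cited paper's exact architecture (the names `TopDownGuidance`, `deep_ch`, and `shallow_ch` are my own): a semantically rich deep feature map is reduced to a single-channel gate, upsampled, and used to reweight a high-resolution shallow feature map so that background activations are suppressed.

```python
# Hypothetical sketch of top-down guidance; not the paper's exact design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopDownGuidance(nn.Module):
    def __init__(self, deep_ch, shallow_ch):
        super().__init__()
        # Project deep features to a one-channel guidance map.
        self.gate = nn.Conv2d(deep_ch, 1, kernel_size=1)
        self.fuse = nn.Conv2d(shallow_ch, shallow_ch, kernel_size=3, padding=1)

    def forward(self, deep_feat, shallow_feat):
        # Upsample the guidance map to the shallow resolution.
        g = F.interpolate(self.gate(deep_feat),
                          size=shallow_feat.shape[-2:],
                          mode='bilinear', align_corners=False)
        g = torch.sigmoid(g)                 # per-pixel weights in [0, 1]
        return self.fuse(shallow_feat * g)   # damp noisy shallow activations

# Usage: guide 64-channel shallow features with 512-channel deep features.
m = TopDownGuidance(deep_ch=512, shallow_ch=64)
out = m(torch.randn(1, 512, 14, 14), torch.randn(1, 64, 112, 112))
print(out.shape)  # torch.Size([1, 64, 112, 112])
```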
“…The proposed method (LGFAN) is compared with eight state-of-the-art methods: ASNet [40], BASNet [20], CPD [27], DSS [16], EAR [41], GateNet [42], PoolNet [19], and R3Net [43]. To ensure the validity of the conclusions, all prediction results used for performance evaluation were saliency maps either published by the authors or generated with the original code.…”
Section: Results
confidence: 99%
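Scoring published saliency maps against ground truth, as described in this excerpt, typically comes down to loading both as grayscale images and computing a standard metric. The sketch below uses mean absolute error (MAE), one common salient-object-detection metric; the excerpt does not say which metrics the cited comparison used, and the directory layout here is hypothetical.

```python
# Hypothetical evaluation sketch: MAE between saliency maps and GT masks.
import numpy as np
from PIL import Image
from pathlib import Path

def mae(pred_dir, gt_dir):
    """Mean absolute error over all ground-truth masks in gt_dir."""
    errors = []
    for gt_path in sorted(Path(gt_dir).glob('*.png')):
        gt = np.asarray(Image.open(gt_path).convert('L'), dtype=np.float64) / 255.0
        pred_img = Image.open(Path(pred_dir) / gt_path.name).convert('L')
        # Resize to the ground-truth resolution in case the sizes differ.
        pred_img = pred_img.resize(gt.shape[::-1], Image.BILINEAR)
        pred = np.asarray(pred_img, dtype=np.float64) / 255.0
        errors.append(np.abs(pred - gt).mean())
    return float(np.mean(errors))

# Usage (hypothetical paths): score = mae('maps/EAR', 'datasets/ECSSD/gt')
```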
“…Attention [41], [42] is inspired by the visual attention of human beings, who tend to be drawn to the more important parts of a target object. Attention is widely used in many fields, including object detection [43], [44], prediction [45], query suggestion [46], and recommendation [4]. In brief, attention can increase the interpretability and adaptivity of complex models such as neural networks by automatically calculating the weights of different data/information.…”
Section: Attention Mechanism
confidence: 99%
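The core point of this excerpt, that attention computes input-dependent weights automatically and that those weights are inspectable, is easy to demonstrate. The sketch below uses the scaled dot-product form as one common variant; the works cited above use various task-specific formulations, so this is an illustrative assumption, not their method.

```python
# Minimal sketch of scaled dot-product attention: weights over the inputs
# are computed from the data itself and can be inspected directly.
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (batch, seq_len, dim)
    scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)
    weights = F.softmax(scores, dim=-1)  # input-dependent weights, rows sum to 1
    return weights @ v, weights          # returning weights aids interpretability

q = k = v = torch.randn(1, 5, 16)
out, w = scaled_dot_product_attention(q, k, v)
print(w[0].sum(dim=-1))  # each output is a convex combination of the inputs
```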