2020
DOI: 10.1016/j.neucom.2019.12.109

Recurrent reverse attention guided residual learning for saliency object detection

Cited by 16 publications (4 citation statements)
References 18 publications
“…It is possible that the training data is not abundant enough or the structure of FCUN needs further improvement. It could be hypothesized that the performance of the deep learning method for SpecR can be further promoted by increasing the network layers and using a more efficient network structure, such as attention and transformer [38].…”
Section: Discussion
confidence: 99%
“…However, this strategy does not effectively use other information from the prediction map. More recently, Li et al [26] introduced a cyclic reverse attention module into the SOD network. The cyclic reverse attention module focuses on salient areas and residual areas outside the salient areas.…”
Section: Attention-based Methods
confidence: 99%
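The reverse-attention idea described in the statement above — weighting the regions outside the current saliency prediction so the network attends to residual areas it has not yet marked salient — can be sketched as follows. This is a minimal illustration of the general mechanism (attention = 1 − prediction), not the cited paper's implementation; the array shapes and function name are assumptions.

```python
import numpy as np

def reverse_attention(features, prediction):
    """Weight a feature map by the complement of the current saliency
    prediction, so residual (not-yet-salient) regions are emphasized.
    features:   (C, H, W) feature map
    prediction: (H, W) saliency map with values in [0, 1]
    """
    attn = 1.0 - prediction              # high where the map is NOT salient
    return features * attn[None, :, :]   # broadcast over channels

# Toy example: one confidently salient pixel is suppressed,
# everything else passes through unchanged.
feats = np.ones((2, 3, 3))
pred = np.zeros((3, 3))
pred[1, 1] = 1.0
out = reverse_attention(feats, pred)
```

In practice such a module is applied recurrently: the refined features produce a new prediction, whose complement re-weights the next stage.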
“…In the aspect of residual learning, the concept of Residual Network (ResNet) was proposed by He et al [15] from Microsoft Research Institute to deal with the performance degradation problem as the number of layers grows. ResNets have outperformed common CNNs in image classification and object recognition [16][17][18]. Because of residual learning, the depth of ResNets is deeper than that of the traditional CNNs.…”
Section: Related Work
confidence: 99%
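The residual-learning idea quoted above — a block computes output = x + F(x), so with small weights it defaults to the identity and deep stacks remain trainable — can be sketched minimally as below. This is a toy illustration of the shortcut connection, not ResNet itself; the single linear-plus-ReLU choice for F is an assumption.

```python
import numpy as np

def residual_block(x, weight):
    """Minimal residual connection: output = x + F(x).
    Here F is one linear map followed by ReLU; when the weights are
    near zero, F(x) ~ 0 and the block reduces to the identity, which
    is what eases optimization as depth grows."""
    fx = np.maximum(weight @ x, 0.0)  # F(x): linear + ReLU
    return x + fx                     # identity shortcut

# With zero weights the block is exactly the identity mapping.
x = np.array([1.0, -2.0, 3.0])
w = np.zeros((3, 3))
y = residual_block(x, w)
```

A full ResNet stacks many such blocks, with convolutions in place of the linear map and a projection on the shortcut when shapes differ.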