Semantic-spatial fusion network for human parsing
2020. DOI: 10.1016/j.neucom.2020.03.096

Cited by 13 publications (5 citation statements). References 7 publications.
“…Ref. [20] innovatively proposed a fusion framework (SSFNet), which effectively narrows the gap between features by means of a semantic modulation model and a resolution-aware model. Many other deep learning-based models have been applied to different tasks.…”
Section: Related Work
confidence: 99%
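The quoted description of SSFNet suggests a general fusion pattern: high-level semantic features are upsampled, used to modulate the low-level spatial features, and the two streams are then merged. The PyTorch sketch below illustrates that pattern only; the module name, channel sizes, and the sigmoid scale/shift form of the modulation are assumptions for illustration, not the paper's actual design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SemanticModulationFusion(nn.Module):
    """Illustrative fusion block: high-level semantic features modulate
    low-level spatial features before the two streams are merged."""
    def __init__(self, low_ch, high_ch, out_ch):
        super().__init__()
        # Project high-level features to a per-pixel scale/shift for the low-level map
        self.to_scale = nn.Conv2d(high_ch, low_ch, kernel_size=1)
        self.to_shift = nn.Conv2d(high_ch, low_ch, kernel_size=1)
        self.merge = nn.Conv2d(low_ch + high_ch, out_ch, kernel_size=3, padding=1)

    def forward(self, low, high):
        # Resolution-aware step: upsample coarse semantic features to the spatial resolution
        high_up = F.interpolate(high, size=low.shape[-2:], mode="bilinear", align_corners=False)
        # Semantic modulation: scale and shift the spatial features
        low_mod = low * torch.sigmoid(self.to_scale(high_up)) + self.to_shift(high_up)
        return self.merge(torch.cat([low_mod, high_up], dim=1))

# Fuse a 1/4-resolution spatial map with a 1/16-resolution semantic map
low = torch.randn(1, 64, 56, 56)
high = torch.randn(1, 256, 14, 14)
print(SemanticModulationFusion(64, 256, 128)(low, high).shape)  # torch.Size([1, 128, 56, 56])
```

Modulating before concatenation lets the semantic stream gate low-level noise rather than merging the two feature maps blindly, which is one plausible reading of how such a framework narrows the gap between low-level and high-level features.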
“…For the transformer, we adopt slot attention [22] with learnable slots as the encoder and an MLP as the decoder, with the number of slot-attention iterations set to 3. We extract the foreground mask with SSFNet [43] to concentrate on part discovery. For fair comparison, we apply the same operation to the other unsupervised face-segmentation methods that require foreground masks.…”
Section: Implementation Details
confidence: 99%
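The implementation details above (slot attention with learnable slots as the encoder, an MLP decoder, three iterations) can be made concrete with a minimal sketch. It follows the published slot-attention update (Locatello et al., 2020) but simplifies it: learnable slot initializations replace the usual random sampling, as the quote describes; the residual MLP after the GRU update is omitted; and all dimensions are placeholders.

```python
import torch
import torch.nn as nn

class SlotAttention(nn.Module):
    """Minimal slot attention with learnable slot initializations."""
    def __init__(self, num_slots, dim, iters=3):
        super().__init__()
        self.iters = iters                 # the quoted setup uses 3 iterations
        self.scale = dim ** -0.5
        self.slots = nn.Parameter(torch.randn(1, num_slots, dim))  # learnable slots
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)
        self.gru = nn.GRUCell(dim, dim)
        self.norm_in = nn.LayerNorm(dim)
        self.norm_slots = nn.LayerNorm(dim)

    def forward(self, x):                  # x: (batch, num_inputs, dim)
        x = self.norm_in(x)
        k, v = self.to_k(x), self.to_v(x)
        slots = self.slots.expand(x.size(0), -1, -1)
        for _ in range(self.iters):
            q = self.to_q(self.norm_slots(slots))
            # Slots compete for inputs: softmax over the slot axis
            attn = (q @ k.transpose(1, 2) * self.scale).softmax(dim=1)
            attn = attn / (attn.sum(dim=-1, keepdim=True) + 1e-8)
            updates = attn @ v             # weighted mean of input values per slot
            slots = self.gru(updates.reshape(-1, updates.size(-1)),
                             slots.reshape(-1, updates.size(-1))).view_as(updates)
        return slots

feats = torch.randn(2, 196, 64)            # e.g. a 14x14 feature map as 196 tokens
parts = SlotAttention(num_slots=6, dim=64)(feats)  # -> (2, 6, 64) part slots
```

An MLP decoder in this setting would map each slot back to per-pixel part masks; that stage is task-specific and is left out here.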
“…To address underwater environmental interference and real-time constraints, Cai et al. [14] proposed a collaborative multi-AUV target-recognition method based on transfer reinforcement learning. Zhang et al. [15] proposed a semantic-spatial fusion network (SSFNet) to bridge the gap between low-level and high-level features. Moniruzzaman et al. [16] proposed a Faster R-CNN algorithm using the Inception V2 network.…”
Section: Related Work
confidence: 99%
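As a point of reference for the detection pipeline mentioned last, a Faster R-CNN model can be exercised end to end with torchvision's off-the-shelf implementation. One caveat: torchvision does not ship an Inception V2 backbone, so the ResNet-50-FPN variant below stands in purely to illustrate the two-stage detector; it is not the configuration of ref. [16].

```python
import torch
import torchvision

# Pretrained two-stage detector (ResNet-50-FPN backbone as a stand-in)
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = torch.rand(3, 480, 640)            # dummy RGB image, values in [0, 1]
with torch.no_grad():
    (pred,) = model([image])               # one dict of boxes/labels/scores per image

keep = pred["scores"] > 0.5                # confidence threshold
print(pred["boxes"][keep], pred["labels"][keep])
```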