Few-shot object segmentation with a new feature aggregation module
2023
DOI: 10.1016/j.displa.2023.102459

Cited by 4 publications (1 citation statement)
References 47 publications
“…The process of fine-tuning is as follows: (1) extraction of pertinent features from the support set and query set using the pre-trained neural network; (2) derivation of class prototypes, which are representative examples for each category in the support set; (3) computation of similarity scores between query features and prototypes; and (4) assignment of logits, indicating the likelihood of each query example belonging to each support-set category. Because a small amount of data can destabilize model training, we also explored feature aggregation [67, 68, 69] and data augmentation [70], applied separately to the multiple support sets of each category, to build more robust category prototypes. In addition, we apply feature regularization techniques to reduce overfitting.…”
Section: Methods
Mentioning confidence: 99%
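The four-step fine-tuning recipe quoted above maps directly onto a standard prototype classifier. The sketch below illustrates steps (2)-(4) in PyTorch; it is a minimal illustration, not the cited paper's actual code. Step (1)'s backbone features are stubbed with random tensors, the function names, the feature dimension (512), and the temperature value are assumptions, and mean-pooling stands in as the simplest instance of the feature aggregation the statement cites.

```python
import torch
import torch.nn.functional as F


def compute_prototypes(support_feats, support_labels, num_classes):
    # Step (2): aggregate each class's support features into one prototype.
    # Mean-pooling is the simplest aggregation choice; refs [67-69] in the
    # statement propose more elaborate aggregation modules.
    protos = [support_feats[support_labels == c].mean(dim=0)
              for c in range(num_classes)]
    return torch.stack(protos)  # (num_classes, D)


def prototype_logits(query_feats, prototypes, temperature=10.0):
    # Steps (3)-(4): cosine similarity between queries and prototypes,
    # scaled by a temperature so softmax yields per-class likelihoods.
    q = F.normalize(query_feats, dim=-1)
    p = F.normalize(prototypes, dim=-1)
    return temperature * q @ p.t()  # (num_queries, num_classes)


if __name__ == "__main__":
    torch.manual_seed(0)
    # Step (1) stub: random tensors stand in for backbone features of a
    # hypothetical 5-way 5-shot support set and 10 query images (D = 512).
    support = torch.randn(25, 512)
    labels = torch.arange(5).repeat_interleave(5)  # 5 shots per class
    queries = torch.randn(10, 512)

    protos = compute_prototypes(support, labels, num_classes=5)
    probs = prototype_logits(queries, protos).softmax(dim=-1)
    print(probs.argmax(dim=-1))  # predicted class per query
```

The temperature scaling is a common design choice with cosine similarities: without it, logits are confined to [-1, 1] and the softmax stays nearly uniform.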