MgSvF: Multi-Grained Slow vs. Fast Framework for Few-Shot Class-Incremental Learning

2020 · Preprint
DOI: 10.48550/arxiv.2006.15524

Cited by 1 publication (2 citation statements). References 0 publications.
“…Incremental few-shot learning [47,41,60,9,8] aims to incrementally learn from very few samples. TOPCI [47] proposes a neural gas network to learn and preserve the topology of the feature manifold formed by different classes.…”
Section: Related Work
confidence: 99%
“…Current research. The study of incremental few-shot learning has just started [47,41,60,9,8,34,59]. Current works mainly borrow ideas from research in incremental learning to overcome the forgetting problem, by enforcing strong constraints on model parameters to penalize the changes of parameters [34,28,56], or by saving a small amount of exemplars from old classes and adding constraints on the exemplars to avoid forgetting [40,20,4].…”
Section: Introduction
confidence: 99%