SS-IL: Separated Softmax for Incremental Learning
2020 · Preprint
DOI: 10.48550/arxiv.2003.13947

Cited by 1 publication (3 citation statements); references 0 publications. All citing statements were published in 2022.
“…We also show that mitigating the drift of the old representations does not hinder the ability to learn the new classes and discriminate them from the old ones. This property emerges from learning the incoming data in isolation; as we will see, isolating the rehearsal step as well (as in Ahn et al. (2020)) leads to poor knowledge acquisition on the current task. Furthermore, we show our ER-ACE objective can be combined with existing methods, leading to additional gains.…”
Section: Introduction
confidence: 90%
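The asymmetry the quoted passage argues for can be made concrete. Below is a minimal PyTorch sketch of such an objective in the spirit of ER-ACE: incoming samples compete only among the classes present in the incoming batch, while rehearsal samples keep the full softmax. The function name, tensor layout, and masking detail are assumptions for illustration, not the authors' published code.

```python
import torch
import torch.nn.functional as F

def asymmetric_ce(logits_in, y_in, logits_re, y_re):
    """Sketch of an ER-ACE-style asymmetric cross-entropy.

    Incoming samples only compete among classes present in the incoming
    mini-batch (other logits are masked out), limiting drift of the old
    representations; rehearsal samples keep the full softmax, so the
    rehearsal step is not isolated and knowledge of the current task is
    still acquired.
    """
    # Boolean mask over classes that appear in the incoming mini-batch.
    present = torch.zeros(logits_in.size(1), dtype=torch.bool,
                          device=logits_in.device)
    present[y_in.unique()] = True
    # Incoming data: mask absent (mostly old) classes out of the softmax.
    loss_in = F.cross_entropy(
        logits_in.masked_fill(~present, float('-inf')), y_in)
    # Rehearsal data: ordinary cross-entropy over all seen classes.
    loss_re = F.cross_entropy(logits_re, y_re)
    return loss_in + loss_re
```

Because the target class of each incoming sample is always unmasked, the masked cross-entropy stays finite; the masked (mostly old) logits simply receive no gradient from the incoming data, which is the drift mitigation the passage describes.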
“…Hou et al. (2019) consider addressing this imbalance by applying a cosine-similarity-based loss in place of the typical cross-entropy loss, together with a distillation loss and a margin-based loss with negative mining to preserve the features of previous classes. Recently, Ahn et al. (2020) proposed learning the incoming tasks and the previous tasks separately. They use a masked softmax loss for the incoming and rehearsal data to counter the class imbalance.…”
Section: Related Work
confidence: 99%
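To illustrate the separated softmax this statement describes, here is a minimal sketch assuming the common contiguous layout where class indices 0..n_old-1 belong to previous tasks and the remainder to the current task. SS-IL additionally pairs this loss with task-wise knowledge distillation, omitted here; the function name and class layout are illustrative assumptions, not the authors' code.

```python
import torch.nn.functional as F

def separated_softmax_ce(logits_cur, y_cur, logits_exm, y_exm, n_old):
    """Sketch of a separated softmax in the spirit of Ahn et al. (2020).

    Incoming samples are normalized only over the new-class logits and
    exemplars only over the old-class logits, so the imbalanced current
    and rehearsal data never compete inside a single softmax normalizer.
    """
    # Current-task samples: cross-entropy restricted to new-class logits.
    # Labels are shifted so they index into the sliced logits.
    loss_cur = F.cross_entropy(logits_cur[:, n_old:], y_cur - n_old)
    # Exemplar (rehearsal) samples: restricted to old-class logits.
    loss_exm = F.cross_entropy(logits_exm[:, :n_old], y_exm)
    return loss_cur + loss_exm
```

Under this split, fitting the abundant current-task data never pushes old-class logits down through the softmax normalizer (and vice versa), which counters the class imbalance between the large incoming batch and the small exemplar batch.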