2018
DOI: 10.1609/aaai.v32i1.11769

Less-Forgetful Learning for Domain Expansion in Deep Neural Networks

Abstract: Expanding the domain that a deep neural network has already learned, without accessing old-domain data, is a challenging task because deep neural networks forget previously learned information when learning new data from a new domain. In this paper, we propose a less-forgetful learning method for the domain expansion scenario. While existing domain adaptation techniques focus solely on adapting to new domains, the proposed technique focuses on working well with both old and new domains without needing to know wh…
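
The abstract is truncated, but the mechanism it describes can be outlined. Below is a minimal PyTorch sketch of a less-forgetful training step, assuming (as the abstract suggests but does not spell out) that the objective combines cross-entropy on new-domain data with a penalty keeping the network's feature representation close to that of a frozen copy of the old network. The names `less_forgetful_step`, `net.features`, and `net.classifier` are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn.functional as F

# Sketch of one less-forgetful training step (hypothetical API).
# `old_net` is a frozen copy of the network saved before training on the
# new domain; `lambda_e` weights the feature-preservation term.
def less_forgetful_step(net, old_net, x_new, y_new, optimizer, lambda_e=1.0):
    old_net.eval()
    with torch.no_grad():
        old_feat = old_net.features(x_new)    # target representation

    feat = net.features(x_new)
    logits = net.classifier(feat)

    loss_ce = F.cross_entropy(logits, y_new)  # fit the new domain
    loss_e = F.mse_loss(feat, old_feat)       # stay close to old features
    loss = loss_ce + lambda_e * loss_e

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

A typical setup would save `old_net = copy.deepcopy(net)` before new-domain training and build the optimizer over the feature extractor only, e.g. `torch.optim.SGD(net.features.parameters(), lr=1e-3)`, so the final softmax layer stays fixed.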

Cited by 50 publications (15 citation statements)
References 15 publications
“…Prior-Focused • Requires explicit storage for previous tasks' models Less-Forgetting Learning (LFL) [131] Parameter-Constraint…”
Section: Parameter-Isolation
Mentioning confidence: 99%
“…These methods can further be divided into data-focused and prior-focused methods. Knowledge distillation from a previous model to the trained model on the new data is the primary building block in data-focused methods [36], [37], [38], [39]. Prior-focused methods mitigate forgetting by estimating a distribution over the model parameters, used as a prior when learning from new data [40], [41], [42], [43], [44], [45].…”
Section: B. Current State of the Art and Challenges
Mentioning confidence: 99%
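
Since the statement above names knowledge distillation as the primary building block of data-focused methods, here is a minimal sketch of the distillation term such methods (e.g. LwF) typically add to the new-task loss. The function name and temperature default are illustrative, not taken from any cited paper.

```python
import torch.nn.functional as F

# Distillation loss between the frozen old model and the model being
# trained, both evaluated on the *new* data. T softens the distributions
# so non-target class scores also get matched.
def distillation_loss(new_logits, old_logits, T=2.0):
    log_p_new = F.log_softmax(new_logits / T, dim=1)
    p_old = F.softmax(old_logits / T, dim=1)
    # KL divergence between softened distributions, scaled by T^2 as usual
    return F.kl_div(log_p_new, p_old, reduction="batchmean") * (T * T)
```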
“…Regularisation-based methods [10, 16, 18–22] add more constraints to the cost function to maintain performance on the old data set. Typical methods include LwF [10] and elastic weight consolidation (EWC) [16].…”
Section: Related Work
Mentioning confidence: 99%
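
As a concrete example of the extra constraint such regularisation-based methods add to the cost function, below is a sketch of the quadratic EWC penalty. It assumes a diagonal Fisher-information estimate computed on old-task data; `ewc_penalty`, `old_params`, and `fisher` are illustrative names, not a library API.

```python
import torch

# EWC regularizer: penalize each parameter for moving away from its
# old-task value, weighted by its diagonal Fisher importance.
# `old_params` and `fisher` map parameter names to tensors saved after
# training on the old task.
def ewc_penalty(model, old_params, fisher, lam=100.0):
    penalty = torch.zeros((), device=next(model.parameters()).device)
    for name, p in model.named_parameters():
        if name in fisher:
            penalty = penalty + (fisher[name] * (p - old_params[name]) ** 2).sum()
    return 0.5 * lam * penalty
```

The total objective is then the new-task loss plus this penalty, so parameters important for the old task stay near their previous values while unimportant ones remain free to change.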