2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC)
DOI: 10.1109/itsc45102.2020.9294483
Class-Incremental Learning for Semantic Segmentation Re-Using Neither Old Data Nor Old Labels

Cited by 37 publications (51 citation statements)
References 21 publications
“…Continual Learning (CL) is a more mature field where previous knowledge acquired on a subset of data is further expanded to learn new tasks (e.g., class labels) [19,20,21]. Continual learning has recently been investigated also in semantic segmentation [3,22,23,24,25,26]. The most popular strategy to retain knowledge from the previous learning step is the usage of knowledge distillation constraints at either the output [3,23,24,22] or feature level [3,25].…”
Section: Related Work (mentioning)
confidence: 99%
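The output-level distillation strategy referred to in the excerpt above can be illustrated with a minimal PyTorch sketch. This is a generic formulation rather than the exact loss of any of the cited works; the model names, the temperature T, and the loss weights are illustrative assumptions.

    # Minimal sketch (assumed setup): output-level knowledge distillation for
    # class-incremental semantic segmentation with a frozen old model as teacher.
    import torch
    import torch.nn.functional as F

    def distillation_loss(new_logits, old_logits, T=2.0):
        """Per-pixel KL divergence between the frozen old model's soft predictions
        and the new model's predictions restricted to the old classes."""
        n_old = old_logits.shape[1]                      # number of previously learned classes
        log_p_new = F.log_softmax(new_logits[:, :n_old] / T, dim=1)
        p_old = F.softmax(old_logits / T, dim=1)
        # scaled by T^2, as is standard for temperature-based distillation
        return F.kl_div(log_p_new, p_old, reduction="batchmean") * (T * T)

    def training_step(new_model, old_model, images, labels, ce_weight=1.0, kd_weight=1.0):
        new_logits = new_model(images)                   # (B, n_old + n_new, H, W)
        with torch.no_grad():
            old_logits = old_model(images)               # (B, n_old, H, W), frozen teacher
        loss_ce = F.cross_entropy(new_logits, labels, ignore_index=255)
        loss_kd = distillation_loss(new_logits, old_logits)
        return ce_weight * loss_ce + kd_weight * loss_kd

Feature-level distillation, also mentioned in the excerpt, would instead apply an analogous constraint to intermediate decoder features rather than to the output logits.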
“…Continual learning has recently been investigated also in semantic segmentation [3,22,23,24,25,26]. The most popular strategy to retain knowledge from the previous learning step is the usage of knowledge distillation constraints at either the output [3,23,24,22] or feature level [3,25]. Another promising research direction involves feature-level regularization to increase separation among features from different classes [23].…”
Section: Related Work (mentioning)
confidence: 99%
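The feature-level regularization mentioned in this excerpt (increasing separation among features of different classes) can be sketched with a simple prototype attraction/repulsion term. This is a generic illustration under assumed tensor shapes, not the exact regularizer of the cited work; the margin value and function name are hypothetical.

    # Illustrative sketch: pull pixel embeddings toward their class prototype and
    # push prototypes of different classes at least `margin` apart.
    import torch
    import torch.nn.functional as F

    def prototype_separation_loss(features, labels, num_classes, margin=1.0):
        """features: (B, D, H, W) decoder embeddings; labels: (B, H, W) class ids."""
        B, D, H, W = features.shape
        feats = features.permute(0, 2, 3, 1).reshape(-1, D)   # (B*H*W, D)
        labs = labels.reshape(-1)
        prototypes, attract = [], 0.0
        for c in range(num_classes):
            mask = labs == c
            if mask.any():
                proto = feats[mask].mean(dim=0)
                prototypes.append(proto)
                # attraction: features of class c stay close to their prototype
                attract = attract + ((feats[mask] - proto) ** 2).sum(dim=1).mean()
        # repulsion: prototypes of different classes kept apart by a margin
        repel = 0.0
        for i in range(len(prototypes)):
            for j in range(i + 1, len(prototypes)):
                d = torch.norm(prototypes[i] - prototypes[j])
                repel = repel + F.relu(margin - d) ** 2
        return attract + repel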
“…Klingner et al [114] note that the existing approaches are either restricted to settings in which the additional classes have no overlap with the old ones or rely on labels for both old and new classes. The authors introduce a generally applicable technique that learns new data solely from labels for the new classes and outputs of a pretrained teacher model.…”
Section: B. Image Segmentation (mentioning)
confidence: 99%
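The approach attributed to Klingner et al. in this excerpt, supervising the new model only with labels for the new classes plus the pixel-wise outputs of a pretrained teacher, can be sketched as follows. The label-fusion rule and all names here are illustrative assumptions; the paper's exact loss formulation may differ.

    # Rough sketch (assumed setup): old classes are retained via the frozen
    # teacher's predictions, new classes are learned from the new labels only.
    import torch
    import torch.nn.functional as F

    def fused_target(teacher_logits, new_labels, n_old, ignore_index=255):
        """Per-pixel targets: the new-class label where one is annotated,
        otherwise the teacher's argmax prediction over the old classes."""
        teacher_pred = teacher_logits.argmax(dim=1)        # old-class pseudo-labels
        target = teacher_pred.clone()
        new_mask = (new_labels != ignore_index)
        # new classes are appended after the old ones in the extended class space
        target[new_mask] = new_labels[new_mask] + n_old
        return target

    def incremental_step(student, teacher, images, new_labels, n_old):
        with torch.no_grad():
            teacher_logits = teacher(images)               # (B, n_old, H, W)
        student_logits = student(images)                   # (B, n_old + n_new, H, W)
        target = fused_target(teacher_logits, new_labels, n_old)
        return F.cross_entropy(student_logits, target)

The key property this sketch captures is that neither old images nor old labels are revisited: all supervision for the old classes comes from the teacher's outputs on the new data.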
“…only recently been explored and the first experimental studies show that catastrophic forgetting is even more severe than on the classification task [29,31]. Current approaches for class-incremental semantic segmentation re-frame knowledge distillation strategies inspired by previous works on image classification [29,5,22,31]. Although they partially alleviate forgetting, they often fail when multiple incremental steps are performed or when background shift [5] (i.e., change of statistics of the background across learning steps, as it incorporates old or future classes) occurs.…”
Section: Replay Images (mentioning)
confidence: 99%
“…Deep neural networks witnessed remarkable improvements in many fields; however, such models are prone to catastrophic forgetting when they are trained to continuously improve the learned knowledge (e.g., new categories) from progressively provided data [15]. Catastrophic forgetting is a long-standing problem [36,13] which has been recently tackled in a variety of visual tasks such as image classification [21,35,25,43,33], object detection [40,24] and semantic segmentation [29,31,5,22].…”
Section: Related Work (mentioning)
confidence: 99%