2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC)
DOI: 10.1109/itsc45102.2020.9294483
“…Continual Learning (CL) is a more mature field where previous knowledge acquired on a subset of data is further expanded to learn new tasks (e.g., class labels) [19,20,21]. Continual learning has recently been investigated also in semantic segmentation [3,22,23,24,25,26]. The most popular strategy to retain knowledge from the previous learning step is the usage of knowledge distillation constraints at either the output [3,23,24,22] or feature level [3,25].…”
Section: Related Work
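The excerpt above describes output-level knowledge distillation as the most popular strategy for retaining knowledge across learning steps: the new model's predictions are constrained to stay close to the softened outputs of the model from the previous step. A minimal NumPy sketch of such an output-level distillation loss follows; the function name `output_distillation_loss` and the temperature parameter `T` are illustrative assumptions, not names from the cited works.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the class axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def output_distillation_loss(student_logits, teacher_logits, T=2.0):
    """Cross-entropy between the teacher's softened per-pixel class
    distribution and the student's, averaged over all pixels.
    Logits have shape (batch, classes, height, width)."""
    p_t = softmax(teacher_logits / T, axis=1)               # teacher soft targets
    log_p_s = np.log(softmax(student_logits / T, axis=1) + 1e-12)
    return float(-(p_t * log_p_s).sum(axis=1).mean())

# toy example: 1 image, 3 old classes, 2x2 spatial map
rng = np.random.default_rng(0)
teacher = rng.normal(size=(1, 3, 2, 2))
student = teacher.copy()  # identical outputs minimize the loss
```

When the student reproduces the teacher exactly, the loss reduces to the entropy of the teacher's soft distribution; any divergence from the teacher's predictions on old classes increases it, which is what discourages forgetting.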
“…Continual learning has recently been investigated also in semantic segmentation [3,22,23,24,25,26]. The most popular strategy to retain knowledge from the previous learning step is the usage of knowledge distillation constraints at either the output [3,23,24,22] or feature level [3,25]. Another promising research direction involves feature-level regularization to increase separation among features from different classes [23].…”
Section: Related Work
“…Klingner et al [114] note that the existing approaches are either restricted to settings in which the additional classes have no overlap with the old ones or rely on labels for both old and new classes. The authors introduce a generally applicable technique that learns new data solely from labels for the new classes and outputs of a pretrained teacher model.…”
Section: B. Image Segmentation
“…only recently been explored and the first experimental studies show that catastrophic forgetting is even more severe than on the classification task [29,31]. Current approaches for class-incremental semantic segmentation re-frame knowledge distillation strategies inspired by previous works on image classification [29,5,22,31]. Although they partially alleviate forgetting, they often fail when multiple incremental steps are performed or when background shift [5] (i.e., change of statistics of the background across learning steps, as it incorporates old or future classes) occurs.…”
Section: Replay Images
“…Deep neural networks witnessed remarkable improvements in many fields; however, such models are prone to catastrophic forgetting when they are trained to continuously improve the learned knowledge (e.g., new categories) from progressively provided data [15]. Catastrophic forgetting is a long-standing problem [36,13] which has been recently tackled in a variety of visual tasks such as image classification [21,35,25,43,33], object detection [40,24] and semantic segmentation [29,31,5,22].…”
Section: Related Work