2021
DOI: 10.48550/arxiv.2112.06511
Preprint

Ex-Model: Continual Learning from a Stream of Trained Models

Abstract: Learning continually from non-stationary data streams is a challenging research topic that has grown in popularity over the last few years. Being able to learn, adapt, and generalize continually in an efficient, effective, and scalable way is fundamental for the sustainable development of Artificial Intelligence systems. However, an agent-centric view of continual learning requires learning directly from raw data, which limits the interaction between independent agents, the efficiency, and the privacy of current approache…

Cited by 1 publication (2 citation statements)
References 22 publications
“…Continual learning algorithms are evaluated through benchmarks: a benchmark specifies how the stream of data is created by defining the originating dataset(s), the number of samples, the criteria used to split the data into different tasks or experiences [6], and so on. In the literature, different benchmarks are used to evaluate results.…”
Section: Benchmarks and Models
confidence: 99%
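The quoted statement describes how a continual learning benchmark turns a dataset into a stream of experiences. As a minimal sketch (the function name and class-incremental split criterion are illustrative assumptions, not the cited paper's API), a labeled dataset can be partitioned into experiences by assigning a disjoint subset of classes to each one:

```python
def split_into_experiences(samples, labels, n_experiences):
    """Split a labeled dataset into class-incremental experiences:
    each experience holds the samples of a disjoint subset of classes.
    This is one common split criterion; others (task-based, domain-based)
    exist, as the citing paper notes."""
    classes = sorted(set(labels))
    assert len(classes) % n_experiences == 0, "classes must divide evenly"
    per_exp = len(classes) // n_experiences
    experiences = []
    for i in range(n_experiences):
        # Classes assigned to experience i.
        chosen = set(classes[i * per_exp:(i + 1) * per_exp])
        exp = [(x, y) for x, y in zip(samples, labels) if y in chosen]
        experiences.append(exp)
    return experiences
```

Libraries such as Avalanche provide ready-made benchmarks built on this idea, so a sketch like this is mainly useful to make the "split criteria" notion concrete.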
“…The second direction (Sec. 6) relates to balancing the memory buffer. In the literature, the replay buffer is usually balanced to hold an equal number of samples from each past task or class.…”
Section: Introduction
confidence: 99%
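The balancing scheme described above can be sketched in a few lines (the class name and eviction policy below are illustrative assumptions, not the cited paper's implementation): the buffer keeps a per-class store and, after each update, trims every class to an equal share of the total capacity.

```python
from collections import defaultdict


class ClassBalancedBuffer:
    """Replay buffer that keeps (roughly) the same number of
    samples for every class seen so far."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.slots = defaultdict(list)  # class id -> stored samples

    def update(self, samples, labels):
        for x, y in zip(samples, labels):
            self.slots[y].append(x)
        self._rebalance()

    def _rebalance(self):
        # Equal quota per class; truncation is the simplest eviction
        # policy, reservoir sampling is a common alternative.
        quota = self.capacity // len(self.slots)
        for y in self.slots:
            self.slots[y] = self.slots[y][:quota]

    def all_samples(self):
        return [(x, y) for y, xs in self.slots.items() for x in xs]
```

For example, with a capacity of 20 and two classes, each class ends up holding 10 samples after rebalancing, regardless of how many arrived.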