2021
DOI: 10.48550/arxiv.2108.10781
Preprint

Adaptive Explainable Continual Learning Framework for Regression Problems with Focus on Power Forecasts

Abstract: Compared with traditional deep learning techniques, continual learning enables deep neural networks to learn continually and adaptively. As the amount of data in applications keeps increasing, deep neural networks must learn new tasks while overcoming forgetting of the knowledge obtained from old tasks. In this article, two continual learning scenarios are proposed to describe the potential challenges in this context. In addition, based on our previous work regarding the CLeaR framework, which is short for cont…

Cited by 1 publication (4 citation statements)
References 27 publications (29 reference statements)
“…The results show that CLOPS performs better in three scenarios than the two other frontier methods—GEM [ 134 ] and MIR [ 135 ]. He [ 111 ] also focused on the target-domain incremental application scenario and data-domain incremental application scenario of continual learning and described how their previous framework, CLeaR, can be extended to learn inputs successively. The framework utilizes the storage of buffered data by a novelty detector.…”
Section: Advances In Continual Learning Methods For Time Series Modeling
confidence: 99%
“…This survey will follow the taxonomy of Lange et al [17] and Mundt et al [103]. [93, 94, 106-116] (note: if a paper uses two methods separately with similar satisfactory results, the paper will be listed under both groups).…”
Section: Advances In Continual Learning Methods For Time Series Modeling
confidence: 99%