2022
DOI: 10.1007/978-3-031-19775-8_6

Theoretical Understanding of the Information Flow on Continual Learning Performance

Abstract: Continual Learning (CL) has attracted attention as a way to avoid Catastrophic Forgetting (CF) in the sequential training of neural networks, improving a network's efficiency and its adaptability to different tasks. CL also serves as an ideal setting for studying network behavior and Forward Knowledge Transfer (FKT) between tasks. Pruning methods for CL train subnetworks to handle the sequential tasks, which allows for a structured approach to investigating FKT. Sharing prior subnetworks' weights …
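To make the pruning-based setup in the abstract concrete, here is a minimal PyTorch-style sketch of the general idea behind mask-based subnetwork CL (in the spirit of PackNet-like methods, not the paper's actual implementation). All names here (`MaskedLinear`, `claim`, `keep_fraction`) are hypothetical: each task claims a subset of weights, and weights owned by earlier tasks are frozen but remain readable by later tasks, which is one mechanism for forward knowledge transfer.

```python
import torch
import torch.nn as nn

class MaskedLinear(nn.Module):
    """Hypothetical sketch of a pruning-based CL layer (PackNet-style).

    Each task owns a subset of weights. Weights owned by earlier tasks
    are never updated but stay visible to later tasks, so prior
    knowledge can be reused (forward knowledge transfer).
    """

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.weight = nn.Parameter(0.01 * torch.randn(out_features, in_features))
        # owner[i, j] == 0 means "free"; owner[i, j] == t means task t owns it.
        self.register_buffer(
            "owner", torch.zeros(out_features, in_features, dtype=torch.long)
        )

    def forward(self, x: torch.Tensor, task_id: int) -> torch.Tensor:
        # Task t sees free weights plus all weights owned by tasks <= t.
        visible = self.owner <= task_id  # owner == 0 (free) is included
        return x @ (self.weight * visible).t()

    def freeze_prior_gradients(self, task_id: int) -> None:
        # Call after loss.backward(): zero gradients on weights owned by
        # earlier tasks so they are shared (read) but never overwritten.
        if self.weight.grad is not None:
            frozen = (self.owner != 0) & (self.owner != task_id)
            self.weight.grad[frozen] = 0.0

    def claim(self, task_id: int, keep_fraction: float = 0.5) -> None:
        # After training task t, prune: the largest-magnitude *free*
        # weights become task t's subnetwork; the rest stay free for
        # future tasks.
        free = self.owner == 0
        k = int(keep_fraction * int(free.sum()))
        scores = (self.weight.detach().abs() * free).flatten()
        idx = scores.topk(k).indices
        self.owner.view(-1)[idx] = task_id
```

Training task t would alternate optimizer steps with `freeze_prior_gradients(t)` and end with `claim(t)`; at test time one would additionally mask the still-free weights, a detail omitted here for brevity.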

Cited by 2 publications (1 citation statement); references 20 publications.
“…Other previous works which touched on adaptive CL include (Nguyen et al., 2019; Ramasesh et al., 2021; Doan et al., 2021; Andle et al., 2023; Lin et al., 2023). Veniat et al. (2021) used modules and leveraged a task-driven prior over the exponential search space to decide which modules to reuse.…”
Section: Related Work
Confidence: 99%