Fourteenth ACM Conference on Recommender Systems 2020
DOI: 10.1145/3383313.3412218
ADER: Adaptively Distilled Exemplar Replay Towards Continual Learning for Session-based Recommendation

Abstract: Session-based recommendation has received growing attention recently due to increasing privacy concerns. Despite the recent success of neural session-based recommenders, they are typically developed in an offline manner using a static dataset. However, recommendation requires continual adaptation to take new and obsolete items and users into account, i.e., "continual learning" in real-life applications. In this case, the recommender is updated continually and periodically with new data that arrives i…
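The title's "Adaptively Distilled Exemplar Replay" combines two standard continual-learning ingredients: replaying a small memory of past sessions (exemplars) and distilling the previous model's predictions on them while fitting newly arrived data. Below is a minimal PyTorch sketch of one periodic update cycle under those assumptions; the names update_cycle, new_batches, exemplars, and the fixed lambda_kd (a stand-in for the paper's adaptive distillation weight) are illustrative, not taken from the paper's code.

```python
# Hedged sketch: one incremental update with exemplar replay + distillation.
# Assumes `model` is a PyTorch session recommender whose forward pass maps
# a batch of sessions to item logits; all other names are illustrative.
import copy
import torch
import torch.nn.functional as F

def update_cycle(model, optimizer, new_batches, exemplars,
                 lambda_kd=0.5, temp=2.0):
    """Fit newly arrived sessions while replaying stored exemplars,
    distilled against a frozen snapshot of the previous model."""
    teacher = copy.deepcopy(model).eval()  # previous model, frozen
    for sessions, targets in new_batches:
        optimizer.zero_grad()
        loss = F.cross_entropy(model(sessions), targets)  # new-data loss
        if exemplars is not None:
            ex_sessions, _ = exemplars
            with torch.no_grad():
                soft = F.softmax(teacher(ex_sessions) / temp, dim=-1)
            log_p = F.log_softmax(model(ex_sessions) / temp, dim=-1)
            # Distillation keeps predictions on replayed sessions close to
            # the previous model's, mitigating catastrophic forgetting.
            loss = loss + lambda_kd * F.kl_div(log_p, soft,
                                               reduction="batchmean")
        loss.backward()
        optimizer.step()
    return model
```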

Cited by 38 publications (28 citation statements: 0 supporting, 28 mentioning, 0 contrasting) | References 34 publications | Citing publications span 2021–2024
“…Though these methods have shown promising results in CV and NLP, how to adapt them to incrementally update RS models is non-trivial, as they lack a mechanism to explicitly optimize for future performance. To overcome the specific forgetting problem in the incremental update of RSs, two lines of research have been developed in recent years, which are closely related to some of the methods in continual learning [13].…”
Section: Related Work 2.1 Continual Learning (mentioning)
confidence: 99%
“…There is very little work on CTR prediction addressing the non-stationary and drifting data problem. A recent work studied the session-based recommendation task in a continual learning setting by utilizing a memory-based method to mitigate catastrophic forgetting (Mi, Lin, and Faltings 2020). However, it gave no formal treatment to the non-stationary and drifting pattern of real-world data and relied on heuristics to populate memory, with no emphasis on positive knowledge transfer, in contrast to our method.…”
Section: CTR Prediction (mentioning)
confidence: 98%
“…ADER is a continual learning algorithm targeting the session-based recommendation task (Mi, Lin, and Faltings 2020). It updates memory according to items' historical frequency only, and thus may suffer from insufficient positive transfer.…”
Section: Continual Learning Algorithms for Comparison (mentioning)
confidence: 99%
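The statement above summarizes ADER's memory update as frequency-driven. A hedged sketch of that idea, assuming sessions are (sequence, target_item) pairs: each item receives an exemplar quota proportional to its historical frequency. The function and variable names are illustrative, and the plain prefix slice stands in for the paper's herding-style selection within each quota.

```python
# Hedged sketch: frequency-proportional exemplar memory, as described above.
# `sessions` is a list of (sequence, target_item) pairs; names are illustrative.
from collections import Counter

def allocate_exemplars(sessions, budget):
    """Keep up to `budget` sessions, giving each target item a quota
    proportional to how often it appeared historically."""
    counts = Counter(target for _, target in sessions)
    total = sum(counts.values())
    memory = []
    for item, freq in counts.most_common():
        quota = max(1, round(budget * freq / total))  # proportional share
        # Prefix slice as a stand-in for herding-based selection.
        memory.extend([s for s in sessions if s[1] == item][:quota])
        if len(memory) >= budget:
            break
    return memory[:budget]
```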
“…In fact, it is largely unknown whether the learning paradigms, frameworks, and methodologies for other domains are useful or not to address our problem. Meanwhile, there is also some recent work in [23,29,30] claiming that RS models should have the so-called 'lifelong' learning capacity. However, their methodologies are designed only to model long-term user behaviors or new training data from the…”
Section: Continual Learning (mentioning)
confidence: 99%
“…Note that recent work in [23,30] also introduced a 'lifelong' learning solution for RS; the main difference between our paper and theirs is described in Section 2.3.…”
(mentioning)
confidence: 99%