Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence 2021
DOI: 10.24963/ijcai.2021/197
Exploring Periodicity and Interactivity in Multi-Interest Framework for Sequential Recommendation

Abstract: Sequential recommendation systems alleviate the problem of information overload and have attracted increasing attention in the literature. Most prior works obtain an overall representation from the user's behavior sequence, which cannot sufficiently reflect the user's multiple interests. To this end, we propose a novel method called PIMI to mitigate this issue. PIMI can model the user's multi-interest representation effectively by considering both the periodicity and interactivity in the ite…

Cited by 27 publications (30 citation statements) · References 15 publications
“…Subsequently, Cen et al. [1] proposed a multi-interest extraction method based on the self-attention mechanism (ComiRec-SA) to replace the capsule network. The latest method, PIMI [2], introduced the time and periodic information in users' recent interaction sequences into the model. The self-attention mechanism was used to capture users' multiple interests and achieved state-of-the-art performance.…”
Section: Related Work
confidence: 99%
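The self-attention multi-interest extraction mentioned above can be sketched roughly as follows. This is a minimal illustration in the spirit of ComiRec-SA, not the papers' exact formulation: K attention heads score each item in the behavior sequence, and each head's weighted sum of item embeddings yields one interest vector. All matrix names and shapes here are illustrative assumptions.

```python
import numpy as np

def multi_interest_sa(H, W1, W2):
    """Sketch of self-attention multi-interest extraction.

    H:  (n, d)  item embeddings of one user's behavior sequence
    W1: (da, d) and W2: (K, da) are projections (trainable in a real
    model; random here purely for illustration).
    Returns K interest vectors of shape (K, d).
    """
    scores = W2 @ np.tanh(W1 @ H.T)                    # (K, n) attention logits
    scores -= scores.max(axis=1, keepdims=True)        # numerical stability
    A = np.exp(scores)
    A = A / A.sum(axis=1, keepdims=True)               # softmax over the n items
    return A @ H                                       # (K, d) interests

# Toy usage: 6 interactions, embedding dim 8, 4 interest heads
rng = np.random.default_rng(0)
n, d, da, K = 6, 8, 16, 4
H = rng.standard_normal((n, d))
interests = multi_interest_sa(H,
                              rng.standard_normal((da, d)),
                              rng.standard_normal((K, da)))
print(interests.shape)  # (4, 8)
```

Each of the K rows can then be matched against candidate item embeddings independently, which is what lets a single user retrieve items from several distinct interest clusters.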
“…First, the sequence-length threshold 𝐿𝑟𝑒𝑐 is set to intercept each user's most recent interactions, which do not include the target item used for training. If there are fewer than 𝐿𝑟𝑒𝑐 items, the padding strategy [2] is used. Eventually, each user 𝑢 has a fixed-length…”
Section: Recent Time Interval Representation Module
confidence: 99%
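The truncate-or-pad step described above can be sketched as follows. This is a hedged illustration, not the cited papers' exact implementation: the function name and the padding token id `pad_id=0` are hypothetical choices, since the quote does not specify them.

```python
def to_fixed_length(seq, l_rec, pad_id=0):
    """Return exactly l_rec items: keep the most recent interactions,
    left-padding shorter sequences with a padding token.

    `pad_id` is a hypothetical padding token id; the cited papers do
    not specify the exact value here.
    """
    if len(seq) >= l_rec:
        return seq[-l_rec:]                          # most recent l_rec items
    return [pad_id] * (l_rec - len(seq)) + seq       # pad up to fixed length

# Example: a user with 3 interactions, threshold L_rec = 5
print(to_fixed_length([11, 42, 7], 5))  # -> [0, 0, 11, 42, 7]
```

A fixed length lets the sequences of all users be batched into one tensor, with the padding positions masked out during attention.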