2020
DOI: 10.1109/access.2020.2984287

A Joint Neural Network for Session-Aware Recommendation

Abstract: Session-aware recommendation is a special form of sequential recommendation in which a user's interactions before the current session are available. Recently, Recurrent Neural Network (RNN) based models have been widely and successfully used in sequential recommendation tasks. Previous works mainly focus on the interaction sequence of the current session without analyzing a user's long-term preferences. In this paper, we propose a joint neural network (JNN) for session-aware recommendation, which employs a Convo…
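The abstract above is truncated, so the paper's exact architecture is not fully visible here. As a purely illustrative sketch of the session-aware setting it describes — jointly using the current session and a long-term preference signal — the toy recommender below combines a simple recurrent encoding of the session with a mean-pooled history vector. All names, dimensions, and the scoring head are hypothetical and not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N_ITEMS, DIM = 100, 16
item_emb = rng.normal(size=(N_ITEMS, DIM))          # hypothetical item embeddings
W = rng.normal(size=(DIM, DIM)) * 0.1               # input-to-hidden weights
U = rng.normal(size=(DIM, DIM)) * 0.1               # hidden-to-hidden weights
head = rng.normal(size=(2 * DIM, N_ITEMS)) * 0.1    # joint scoring head

def session_state(session_items):
    """Short-term state: a plain tanh recurrence over the session's items."""
    h = np.zeros(DIM)
    for i in session_items:
        h = np.tanh(item_emb[i] @ W + h @ U)
    return h

def long_term_profile(history_items):
    """Long-term preference vector: mean-pool the pre-session history."""
    return item_emb[history_items].mean(axis=0)

def recommend(session_items, history_items, k=5):
    # Joint representation: concatenate short- and long-term states,
    # then score every catalogue item with the linear head.
    joint = np.concatenate([session_state(session_items),
                            long_term_profile(history_items)])
    scores = joint @ head
    return np.argsort(-scores)[:k]

top5 = recommend(session_items=[3, 7, 42], history_items=[1, 2, 5, 8])
```

This only conveys the shape of the problem (two preference signals feeding one ranker), not the paper's actual JNN.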

Cited by 14 publications (8 citation statements)
References 33 publications (44 reference statements)
“…• Extend and Boost: We only report the combined effects of these extensions, and we denote the extended method by appending the postfix "_eb" (extend and boost) to the algorithm name, e.g., vsknn_eb. The reason is that the effectiveness of the individual extensions varied across algorithms, but combining the methods in almost all cases led to the highest performance improvements 9 . For the neural methods, gru4rec and narm, these extensions did not lead to positive effects in our initial experiments, which is why we do not report the results here.…”
Section: Results (mentioning; confidence: 99%)
“…Since we are given user-IDs for the sessions, we are able to apply a user-wise data splitting approach. Specifically, like in [31, 33, 30, 9, 47],…”
(mentioning; confidence: 99%)
“…We randomly collect 500,000 recommendation sessions (with 19,667,665 items) in temporal order, and use the first 80% of the sessions as the training/validation datasets and the last 20% as the (offline) test dataset. For a new session, the initial state s_1 consists of the N = 50 previously clicked or purchased items obtained from the user's previous sessions [14]. The immediate reward r_t for click/skip/leave behavior is empirically set to 1, 0, and -2, respectively.…”
Section: Experimental Settings (mentioning; confidence: 99%)
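The reward scheme quoted above (click = 1, skip = 0, leave = -2, with an initial state of the user's last N = 50 clicked/purchased items) can be sketched as a simple mapping. The state handling below is a hypothetical simplification for illustration, not the citing paper's implementation:

```python
from collections import deque

# Immediate rewards as described in the quoted experimental setting
REWARD = {"click": 1, "skip": 0, "leave": -2}

N = 50  # the initial state holds the user's last N clicked/purchased items

def initial_state(previous_interactions):
    """Build s_1 from up to N of the user's most recent clicked/purchased items."""
    return deque(previous_interactions[-N:], maxlen=N)

def step(state, item_id, behavior):
    """Apply one interaction: return the immediate reward and the updated state."""
    r = REWARD[behavior]
    if behavior == "click":
        state.append(item_id)  # clicked items extend the (bounded) state
    return r, state

s = initial_state(list(range(60)))        # 60 past items -> keep the last 50
r, s = step(s, item_id=999, behavior="click")
```

The `deque(maxlen=N)` keeps the state at a fixed length, mirroring the fixed-size N = 50 initial state.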
“…However, when the time series is long, the RNN model suffers from the exploding-gradient and vanishing-gradient problems. To address these problems in long-time-series prediction, the Long Short-Term Memory (LSTM) model introduces a gating mechanism that compensates, to some extent, for the RNN model's shortcomings in processing long time series [28, 29].…”
Section: Introduction (mentioning; confidence: 99%)
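The gating mechanism mentioned in the statement above can be illustrated with a minimal, self-contained LSTM step. This is a generic textbook formulation (weight shapes and names are my own choices, not from the cited works); the point is that the forget/input gates give the cell state an additive update path, which is what mitigates vanishing gradients over long sequences:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step with stacked gate pre-activations of size 4*H."""
    z = W @ x + U @ h + b
    H = h.shape[0]
    f = sigmoid(z[0:H])        # forget gate: how much old memory survives
    i = sigmoid(z[H:2*H])      # input gate: how much new information enters
    o = sigmoid(z[2*H:3*H])    # output gate: how much memory is exposed
    g = np.tanh(z[3*H:4*H])    # candidate cell update
    c_new = f * c + i * g      # additive memory path (the "gradient highway")
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(1)
D, H = 8, 4
W = rng.normal(size=(4 * H, D)) * 0.1
U = rng.normal(size=(4 * H, H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(100):           # a long sequence that would trouble a plain RNN
    h, c = lstm_step(rng.normal(size=D), h, c, W, U, b)
```

Because `h = o * tanh(c)` with `o` in (0, 1), the hidden state stays bounded even over long sequences, unlike an ungated recurrence whose activations can blow up.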