2018 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2018.8489591
TA4REC: Recurrent Neural Networks with Time Attention Factors for Session-based Recommendations

Cited by 8 publications (3 citation statements)
References 21 publications
“…Li et al. used a hybrid global-local encoder with attention mechanisms to capture the main purpose of the session [12]. The dwell time on an item has also been used to adjust the attention weights of previous clicks in the session [11]. Combination-based models have also been proposed to improve performance [10, 37].…”
Section: Session-based Recommendation
confidence: 99%
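The dwell-time idea cited above [11] can be illustrated with a minimal sketch: attention scores over previous clicks are rescaled by how long the user viewed each item. This is an assumption-laden illustration, not the cited paper's exact formulation; using the last click as the query and log-scaling the dwell time are choices made here for brevity.

```python
import numpy as np

def dwell_time_attention(click_embeddings, dwell_times):
    """Illustrative sketch (not the cited method's exact form):
    scale attention scores over previous clicks by log-scaled
    dwell time, so items viewed longer receive more weight."""
    # Base scores: similarity of each click to the most recent click.
    query = click_embeddings[-1]
    scores = click_embeddings @ query
    # Hypothetical adjustment: weight scores by log(1 + dwell seconds).
    adjusted = scores * np.log1p(dwell_times)
    # Softmax over the adjusted scores (numerically stabilized).
    weights = np.exp(adjusted - adjusted.max())
    weights /= weights.sum()
    # Session representation: attention-weighted sum of click embeddings.
    return weights @ click_embeddings

emb = np.random.rand(5, 8)                       # 5 clicks, 8-dim embeddings
dwell = np.array([2.0, 30.0, 5.0, 60.0, 10.0])   # seconds spent per item
session_vec = dwell_time_attention(emb, dwell)
```

A click the user lingered on (e.g., 60 s) thus contributes more to the session vector than a quick bounce, which is the intuition behind dwell-time-adjusted attention.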
“…In recent years, with the development of deep learning-based technologies, SBR has achieved satisfying prediction accuracy. In general, methods that rely on Recurrent Neural Networks (RNNs) [7, 10-13] and Graph Neural Networks (GNNs) [14-18] mainly contribute to the progress in SBR.…”
Section: Introduction
confidence: 99%
“…Tan et al. [80] process content information in a preprocessing step to reduce the effect of outdated features, and the resulting subset is processed by a second model. Session-based recommender systems have been implemented effectively with GRUs [85]. Recently, contexts have also been considered as input along with the sequence data [86-89].…”
Section: GRU for RecSys Models
confidence: 99%
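The GRU-based session modeling mentioned above (e.g., [85]) can be sketched in bare NumPy: the hidden state is updated click by click, and the final state serves as the session representation for scoring candidate next items. The weight shapes and random initialization here are hypothetical; a real system (e.g., GRU4Rec-style) uses trained parameters.

```python
import numpy as np

def gru_step(h, x, params):
    """One standard GRU step: consume one click embedding x and
    update the session hidden state h."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    z = sigmoid(x @ Wz + h @ Uz)               # update gate
    r = sigmoid(x @ Wr + h @ Ur)               # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)   # candidate state
    return (1 - z) * h + z * h_tilde

rng = np.random.default_rng(0)
d, k = 8, 16                                   # embedding dim, hidden dim
# Hypothetical small random weights (trained in practice).
params = tuple(rng.standard_normal(s) * 0.1
               for s in [(d, k), (k, k)] * 3)

h = np.zeros(k)
for x in rng.standard_normal((5, d)):          # 5 clicked items in a session
    h = gru_step(h, x, params)
# h is the session representation used to rank candidate next items.
```

Context-aware variants [86-89] extend this by concatenating context features (time of day, device, etc.) to each click embedding `x` before the GRU step.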