Proceedings of the Eleventh ACM Conference on Recommender Systems 2017
DOI: 10.1145/3109859.3109917
Modeling User Session and Intent with an Attention-based Encoder-Decoder Architecture

Cited by 58 publications (30 citation statements)
References 13 publications
“…Given the classical collaborative filtering scenario with user-item interaction behavior, NAIS extended classical item-based recommendation models by distinguishing the importance of different historical items in a user's profile [19]. With users' temporal behavior, attention networks were proposed to learn which historical behavior is more important for the user's current temporal decision [31], [32]. Many attention-based recommendation models have been developed to better exploit auxiliary information and improve recommendation performance.…”
Section: Related Work
confidence: 99%
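The NAIS-style weighting described in the statement above — scoring each historical item's importance for the current prediction — can be sketched as a softmax attention over a user's historical item embeddings. This is a minimal illustrative sketch, not the paper's actual model; the function name `attention_scores` and the dot-product scoring are assumptions.

```python
import numpy as np

def attention_scores(history, target):
    """Softmax attention over a user's historical items.

    history: (n_items, d) array of historical item embeddings
    target:  (d,) embedding of the candidate item
    Returns the attention weights and the weighted preference score.
    """
    logits = history @ target              # relevance of each historical item
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()               # softmax: weights sum to 1
    profile = weights @ history            # attention-weighted user profile
    return weights, float(profile @ target)
```

Items more similar to the candidate receive larger weights, so recent or topically related interactions dominate the user profile instead of being averaged uniformly.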
“…In the social embedding process with DeepWalk [37], we set the parameters as follows: window size w = 10 and walks per vertex ρ = 80. The social embedding size d is chosen from {32, 64, 128}. We find that the social embedding reaches its best performance when d = 128.…”
Section: Evaluation Metrics
confidence: 99%
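The DeepWalk setup quoted above generates ρ = 80 uniform random walks per vertex, whose node sequences then feed a skip-gram model with window size w = 10. A minimal sketch of the walk-generation step follows; the `walk_length` value and the adjacency-list input format are assumptions not stated in the excerpt.

```python
import random

def random_walks(adj, walks_per_vertex=80, walk_length=40):
    """Generate uniform random walks for DeepWalk-style embedding.

    adj: dict mapping each node to a list of its neighbors
    Returns a list of walks; each walk is a list of node ids.
    """
    walks = []
    for _ in range(walks_per_vertex):
        nodes = list(adj)
        random.shuffle(nodes)              # new vertex order each pass
        for start in nodes:
            walk = [start]
            while len(walk) < walk_length and adj[walk[-1]]:
                walk.append(random.choice(adj[walk[-1]]))  # uniform step
            walks.append(walk)
    return walks
```

The resulting walks are treated as "sentences" by the downstream skip-gram model (window size 10 in the quoted setup), which learns one embedding per node.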
“…Clearly, sessions without a search belong to the informational (browsing) or transactional (buying) session types [90]. When comparing search and non-search sessions, researchers report that 46.7% of actions belong to user transactions (e.g., mailing), 19.9% to Web browsing (e.g., news reading), 16.3% to fact searching (e.g., looking for relevant statements), 13.5% to information gathering (e.g., looking up bus departures), and 1.7% to other, non-classified actions [74].…”
Section: Session Intent Identification
confidence: 99%
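The taxonomy quoted above (sessions without a search are informational or transactional) suggests a simple rule-based session classifier. The sketch below is purely illustrative: the action labels and the tie-breaking rules are assumptions, not part of the cited studies.

```python
def classify_session(actions):
    """Rule-of-thumb session intent from a list of action-type strings.

    Illustrative only: per the quoted taxonomy, sessions with no search
    action are either transactional (buying) or informational (browsing).
    """
    if "search" in actions:
        return "search"
    if "buy" in actions:          # assumed marker of a transactional session
        return "transactional"
    return "informational"
```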
“…The fusion of deep learning-based models and the attention mechanism can help models emphasize informative features and suppress useless ones. At present, attention has been widely applied in speech recognition [28], answer selection [29], and session prediction [30]. Although the use of attention mechanism is not uncommon in speech enhancement, there are three reasons why we think it can play a role: First, in a noisy environment, the human auditory system can selectively focus on speech while suppressing noise through the attention mechanism [31].…”
Section: Introduction
confidence: 99%