2021
DOI: 10.48550/arxiv.2105.14428
Preprint

DAGNN: Demand-aware Graph Neural Networks for Session-based Recommendation

Abstract: Session-based recommendation has been widely adopted by online video and e-commerce websites. Most existing approaches aim to discover underlying interests or preferences from anonymous session data. This ignores the fact that these sequential behaviors usually reflect a session user's underlying demand, i.e., a semantic-level factor, and estimating such demands from a session is therefore challenging. To address the aforementioned issue, this paper proposes a de…

Cited by 2 publications (2 citation statements)
References 33 publications
“…The model CoHHN proposed by Zhang et al. (2022) introduces price information. Yang et al. (2021) proposed a demand-aware graph neural network (DAGNN). In particular, a demand modeling component is proposed, which first extracts the session demand and then uses a global demand matrix to estimate multiple underlying demands for each session.…”
Section: Session-based Recommendation
confidence: 99%
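The demand-modeling step described in the citation above can be sketched roughly as follows. This is an illustrative assumption, not the authors' actual implementation: a session is pooled into a single demand vector and softly matched against a learnable global demand matrix to produce a mixture over K basic demands. All names and dimensions (`session_embeddings`, `global_demand_matrix`, `K`, `d`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

d, K = 16, 4                                    # embedding size, number of basic demands
session_embeddings = rng.normal(size=(5, d))    # 5 item embeddings in one session (assumed)
global_demand_matrix = rng.normal(size=(K, d))  # K basic-demand prototypes (assumed learnable)

# 1) Extract a session-level demand vector (here: mean pooling of the items).
session_demand = session_embeddings.mean(axis=0)

# 2) Score the session demand against each basic demand and normalize
#    (softmax), giving a distribution over the K basic demands.
scores = global_demand_matrix @ session_demand
weights = np.exp(scores - scores.max())
weights /= weights.sum()

# 3) The estimated session demand is a weighted mixture of the basic demands.
estimated_demand = weights @ global_demand_matrix

print(weights.shape, estimated_demand.shape)  # (4,) (16,)
```

In a trained model the pooling and the global demand matrix would be learned end-to-end; mean pooling and random prototypes here only illustrate the data flow.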
“…For our evaluation on Cora, Citeseer, Pubmed, and OGBN-ArXiv, we have closely followed the data split settings and metrics reported by the recent benchmark [49]. See details in Appendix C. For comparison with SOTA models, we have used JKNet [50], InceptionGCN [51], SGC [52], GAT [3], GCNII [24], and DAGNN [53]. We use Adam optimizer for our experiments and performed a grid search to tune hyperparameters for our proposed methods and reported our settings in Table 1.…”
Section: Dataset and Experimental Setup
confidence: 99%