2023
DOI: 10.1080/09540091.2023.2212883
CGRS: Collaborative Knowledge Propagation Graph Attention Network for Recipes Recommendation

Cited by 2 publications (2 citation statements)
References 44 publications
“…Other researchers have utilized graph attention networks (GATs) to aggregate important features from neighboring entities [26,[37][38][39][40][41][42]. For example, the KGAT [26] proposes using the knowledge graph attention network to obtain higher-order relationships in KG and to learn users' historical interests through attention mechanisms.…”
Section: Related Work
confidence: 99%
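The citation statement above describes GAT-style aggregation: each node attends over its neighbours and takes an attention-weighted sum of their projected features. A minimal NumPy sketch of one such layer is given below; the graph, weight shapes, and variable names are illustrative assumptions, not taken from KGAT or CGRS.

```python
import numpy as np

def gat_layer(h, adj, W, a, alpha=0.2):
    """One illustrative graph-attention layer: every node aggregates
    its neighbours' features, weighted by learned attention scores."""
    z = h @ W                                   # (N, F') projected node features
    n = z.shape[0]
    # attention logits e_ij = LeakyReLU(a^T [z_i || z_j])
    e = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            s = a @ np.concatenate([z[i], z[j]])
            e[i, j] = s if s > 0 else alpha * s  # LeakyReLU
    e = np.where(adj > 0, e, -1e9)              # mask out non-neighbours
    att = np.exp(e - e.max(axis=1, keepdims=True))
    att = att / att.sum(axis=1, keepdims=True)  # softmax over each node's neighbours
    return att @ z                              # attention-weighted aggregation

# Toy 3-node graph with self-loops (all values are placeholders).
rng = np.random.default_rng(0)
h = rng.normal(size=(3, 4))       # input features, F = 4
W = rng.normal(size=(4, 2))       # projection to F' = 2
a = rng.normal(size=(4,))         # attention vector over [z_i || z_j]
adj = np.array([[1, 1, 0],
                [1, 1, 1],
                [0, 1, 1]])
out = gat_layer(h, adj, W, a)
print(out.shape)  # (3, 2)
```

Stacking such layers is what lets KGAT-style models reach higher-order relations in the knowledge graph: each additional layer pulls in features one hop further out.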
“…The KDGN [40] proposes the MedRec model, which introduces a medical knowledge graph and a medicine attribute graph to learn embedding representations of symptoms and medicines for medicine recommendation. CGRS [41] and CRKG [42] propose a collaborative knowledge propagation graph attention network for recipe recommendation, which employs a knowledge-aware attention graph convolutional network to capture the semantic associations between users and recipes on the collaborative knowledge graph, and learns users' requirements in both preference and health by fusing the losses of these two learning tasks.…”
Section: Related Work
confidence: 99%