2022
DOI: 10.48550/arxiv.2207.06680
Preprint

Equivariant Hypergraph Diffusion Neural Operators

Abstract: Hypergraph neural networks (HNNs), which use neural networks to encode hypergraphs, provide a promising way to model higher-order relations in data and to solve prediction tasks built upon such higher-order relations. However, higher-order relations in practice contain complex patterns and are often highly irregular, so it is challenging to design an HNN that is expressive enough for those relations while remaining computationally efficient. Inspired by hypergraph diffusion algorithms, this work proposes…

Cited by 2 publications (2 citation statements). References 61 publications.
“…In addition, the model should also have good extrapolation ability for hyperedges of unseen orders. Inspired by recent works about hypergraph diffusion algorithms,46,47 the molecular hypergraph is initially transformed into an equivalent bipartite graph [Fig. 2(b)], wherein two distinct sets of vertices denote the nodes and hyperedges of the molecular hypergraph, respectively.…”
Section: B. Algorithm
confidence: 99%
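The hypergraph-to-bipartite transformation described in this excerpt (often called the star expansion) can be sketched as follows. This is a minimal illustrative sketch, not the cited paper's implementation; the function name and the list-of-sets hypergraph representation are assumptions.

```python
def hypergraph_to_bipartite(hyperedges):
    """Star expansion: map a hypergraph, given as a list of hyperedges
    (each a set of node ids), to the edge list of an equivalent bipartite
    graph whose two vertex sets are the original nodes and the hyperedges.

    Each returned pair (v, e) connects node v to hyperedge-vertex e.
    """
    edges = []
    for e_idx, members in enumerate(hyperedges):
        for v in sorted(members):  # sort for deterministic output
            edges.append((v, e_idx))
    return edges

# Example: three hyperedges over five nodes.
H = [{0, 1, 2}, {1, 3}, {2, 3, 4}]
print(hypergraph_to_bipartite(H))
```

Message passing on this bipartite graph then alternates between the node side and the hyperedge side, which is one way architectures handle hyperedges of arbitrary (and unseen) orders.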
“…In contrast to MP on plain graphs, this architecture performs two levels of nested aggregations per MP step, since computing x_v^new requires knowing m_e. There are numerous IMP architectures that harness HGs, for example Hypergraph Convolution [15], Hypergraph Neural Networks [97], Hypergraph Attention [15], HyperGCN [258], Hypergraph Networks with Hyperedge Neurons [91], Hyper-SAGNN [277], and others [10], [13], [102], [103], [171], [223], [243], [273].…”
Section: Neural MP on Hypergraphs (HGs)
confidence: 99%
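The two nested aggregations per MP step mentioned in this excerpt can be illustrated with a minimal sketch: first aggregate member-node features into a hyperedge message m_e, then aggregate the messages of incident hyperedges into the node update x_v^new. Mean aggregation at both levels is an assumed choice for illustration; the surveyed architectures use learned aggregation and update functions.

```python
def hypergraph_mp_step(x, hyperedges):
    """One message-passing step on a hypergraph with two nested aggregations.

    x          -- dict mapping node id -> scalar feature
    hyperedges -- list of hyperedges, each a set of node ids
    Returns a dict of updated node features x_v^new.
    """
    # Level 1: hyperedge message m_e = mean of its member nodes' features.
    m = {e: sum(x[v] for v in members) / len(members)
         for e, members in enumerate(hyperedges)}

    # Level 2: x_v^new = mean of messages from hyperedges incident to v.
    x_new = {}
    for v in x:
        incident = [m[e] for e, members in enumerate(hyperedges) if v in members]
        x_new[v] = sum(incident) / len(incident) if incident else x[v]
    return x_new

x = {0: 1.0, 1: 2.0, 2: 3.0, 3: 4.0}
H = [{0, 1}, {1, 2, 3}]
print(hypergraph_mp_step(x, H))
```

Note that node 1, which lies in both hyperedges, receives an update mixing both messages, whereas isolated nodes would keep their features unchanged.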