2022 41st Chinese Control Conference (CCC)
DOI: 10.23919/ccc55666.2022.9901938
Parallel Hypergraph Convolutional Neural Networks for Image Annotation

Cited by 1 publication (13 citation statements)
References 41 publications
“…where X_E ∈ R^{m×d} denotes the hyperedge features, and f_{V→E}(⋅) and f_{E→V}(⋅) are two feature aggregation functions. More recent developments include models by Wang et al. (2023a) and Wang et al. (2023b). Wang et al. (2023a) design the model from a hypergraph diffusion perspective, while Wang et al. (2023b) unroll an optimisation algorithm minimising a hypergraph energy function.…”
Section: GNNs and HyperGNNs
confidence: 99%
“…More recent developments include models by Wang et al. (2023a) and Wang et al. (2023b). Wang et al. (2023a) design the model from a hypergraph diffusion perspective, while Wang et al. (2023b) unroll an optimisation algorithm minimising a hypergraph energy function. In the following, we concentrate on five state-of-the-art HyperGNNs: UniGNN, AllDeepSets (Chien et al., 2022), AllSetTransformer (Chien et al., 2022), ED-HNN (Wang et al., 2023a), and PhenomNN (Wang et al., 2023b).…”
Section: GNNs and HyperGNNs
confidence: 99%
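The excerpts describe the generic two-stage scheme shared by these HyperGNNs: a vertex-to-edge aggregation f_{V→E} producing hyperedge features X_E ∈ R^{m×d}, followed by an edge-to-vertex aggregation f_{E→V}. A minimal sketch, assuming plain mean aggregation for both functions (the cited models instead learn these functions, e.g. with set transformers or unrolled optimisation steps):

```python
import numpy as np

def hypergraph_message_passing(X_V, H):
    """One round of two-stage hypergraph message passing (illustrative only).

    X_V: (n, d) vertex features.
    H:   (n, m) incidence matrix, H[v, e] = 1 iff vertex v is in hyperedge e.
    Returns updated (n, d) vertex features.
    """
    # f_V->E: each hyperedge feature (a row of X_E in R^{m x d}) is the
    # mean of the features of its member vertices.
    edge_deg = H.sum(axis=0).reshape(-1, 1)            # (m, 1) hyperedge sizes
    X_E = (H.T @ X_V) / np.maximum(edge_deg, 1)        # (m, d)

    # f_E->V: each vertex is updated with the mean of the features of
    # the hyperedges it belongs to.
    vert_deg = H.sum(axis=1).reshape(-1, 1)            # (n, 1) vertex degrees
    return (H @ X_E) / np.maximum(vert_deg, 1)         # (n, d)

# Toy hypergraph: 4 vertices, 2 hyperedges {0, 1, 2} and {2, 3}.
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], dtype=float)
X_V = np.eye(4)  # one-hot vertex features
X_new = hypergraph_message_passing(X_V, H)
```

After one round, each vertex's feature mixes information from all vertices sharing a hyperedge with it; stacking several such rounds (with learned f_{V→E}, f_{E→V}) yields the HyperGNN architectures compared in the excerpt.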