2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr46437.2021.00022
A Hyperbolic-to-Hyperbolic Graph Convolutional Network

Cited by 39 publications (34 citation statements). References 19 publications.
“…NN methods are also baseline models, and they only exploit node features, neglecting the graph structure. For GCNs and HYP GCNs, the state-of-the-art models include H2H-GCN [12], GCN [5], GRAPHSAGE [4], graph attention networks (GAT) [39], HGCN [8] and simplified graph convolution (SGC) [14].…”
Section: Node Classification (mentioning, confidence: 99%)
“…Another line of methods focuses on global graph representation learning, which can be directly used for graph classification tasks [52,55]. The key notion is developing a multi-scale feature extraction pipeline to leverage different levels of topological information in graphs [7,19,46,48]. GIN is proposed as a graph classification method that is as powerful as the Weisfeiler-Lehman Isomorphism test [44].…”
Section: Related Work 2.1 Graph Neural Network (mentioning, confidence: 99%)
“…The coefficient 𝜆 is selected from [0.005, 0.01, 0.02, 0.03]. For each dataset, the upper limit of BFS 𝛽 is selected from [3,5,7]. For larger datasets, the number of augmented edges is selected from [3,5,7], and from [1,2,3] for other datasets.…”
Section: Implementation Details (mentioning, confidence: 99%)
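The hyperparameter ranges quoted in this excerpt can be read as a small search grid. The Python sketch below simply enumerates the candidate configurations; only the numeric ranges come from the quote, while the variable names and the large-vs-small dataset switch are illustrative assumptions rather than the cited paper's code.

# Minimal sketch of the hyperparameter grid described in the excerpt above.
# Only the numeric ranges come from the quote; the names (lam, bfs_limit,
# num_aug) and the dataset_is_large switch are illustrative assumptions.
from itertools import product

LAMBDAS = [0.005, 0.01, 0.02, 0.03]   # candidates for the coefficient lambda
BFS_LIMITS = [3, 5, 7]                # candidates for the BFS upper limit beta
AUG_EDGES_LARGE = [3, 5, 7]           # augmented-edge counts for larger datasets
AUG_EDGES_SMALL = [1, 2, 3]           # augmented-edge counts for other datasets

def candidate_configs(dataset_is_large):
    # Choose the augmented-edge range based on dataset size, then take the
    # Cartesian product of all remaining ranges.
    aug_edges = AUG_EDGES_LARGE if dataset_is_large else AUG_EDGES_SMALL
    for lam, bfs_limit, num_aug in product(LAMBDAS, BFS_LIMITS, aug_edges):
        yield {"lambda": lam, "bfs_limit": bfs_limit, "augmented_edges": num_aug}

# Example: 4 * 3 * 3 = 36 configurations for a small dataset.
print(sum(1 for _ in candidate_configs(dataset_is_large=False)))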
“…The hyperbolic space has gained traction in deep learning literature for representing tree-like structures and taxonomies [18,20,29,30,36,38,47], text [2,42,55], and graphs [4,8,12,22,26,50]. Hyperbolic alternatives have been proposed for various network layers, from intermediate layers [17,39] to classification layers [3,11,17,39].…”
Section: Hyperbolic Deep Learning (mentioning, confidence: 99%)
“…Foundational work showed that hyperbolic manifolds are able to embed hierarchies and tree-like structures with minimal distortion [29]. Follow-up work has demonstrated the benefits of hyperbolic spaces for various tasks with latent hierarchical structures, from text embedding [42,55] to graph inference [8,12,22]. Notably, Khrulkov et al. [19] showed that hyperbolic embeddings also have profound connections to visual data, due to latent hierarchical structures present in vision datasets.…”
Section: Introduction (mentioning, confidence: 99%)
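The "minimal distortion" claim quoted above rests on the geometry of hyperbolic space, where distances grow exponentially toward the boundary. The short Python sketch below computes the standard Poincaré-ball distance to illustrate this; it is a generic textbook formula used only for illustration, not the implementation of H2H-GCN or of any cited paper.

# Generic Poincare-ball distance, for illustration only (not the authors' code).
import numpy as np

def poincare_distance(x, y):
    # Geodesic distance between two points strictly inside the unit Poincare ball:
    # d(x, y) = arccosh(1 + 2 * ||x - y||^2 / ((1 - ||x||^2) * (1 - ||y||^2)))
    sq_diff = np.sum((x - y) ** 2)
    denom = (1.0 - np.sum(x ** 2)) * (1.0 - np.sum(y ** 2))
    return float(np.arccosh(1.0 + 2.0 * sq_diff / denom))

# Points near the boundary are exponentially far apart, which is what allows
# trees (whose node count grows exponentially with depth) to embed with low
# distortion: siblings on different branches end up much farther from each
# other than from the root.
root = np.array([0.0, 0.0])
leaf_a = np.array([0.9, 0.0])
leaf_b = np.array([0.0, 0.9])
print(poincare_distance(root, leaf_a))    # ~2.94
print(poincare_distance(leaf_a, leaf_b))  # ~5.20, different "branches" are even farther apart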