2017
DOI: 10.1007/978-3-319-66182-7_54

Distance Metric Learning Using Graph Convolutional Networks: Application to Functional Brain Networks

Abstract: Evaluating similarity between graphs is of major importance in several computer vision and pattern recognition problems, where graph representations are often used to model objects or interactions between elements. The choice of a distance or similarity metric is, however, not trivial and can be highly dependent on the application at hand. In this work, we propose a novel metric learning method to evaluate distance between graphs that leverages the power of convolutional neural networks, while exploi…
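The abstract describes metric learning over pairs of graphs with graph convolutional networks. The sketch below is a hedged illustration of that idea, not the authors' implementation: it assumes a PyTorch setting, uses first-order GCN layers in place of the Chebyshev polynomial filters used in the paper, pools node features into a graph-level embedding, and stands in a simple pairwise hinge loss for the paper's loss function; names such as `GCNBranch` and `SiameseGCN` are hypothetical.

```python
# Minimal sketch of siamese graph-convolutional metric learning.
# Assumptions: first-order GCN layers (not the paper's Chebyshev filters),
# a pairwise hinge loss on the inner product of the two graph embeddings,
# and PyTorch as the framework.
import torch
import torch.nn as nn


def normalize_adjacency(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalise A + I, as in Kipf & Welling (2016)."""
    a_hat = adj + torch.eye(adj.size(0))
    deg_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
    return deg_inv_sqrt.unsqueeze(1) * a_hat * deg_inv_sqrt.unsqueeze(0)


class GCNBranch(nn.Module):
    """One branch of the siamese network: two graph convolutions + mean pooling."""

    def __init__(self, in_dim: int, hidden_dim: int, out_dim: int):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hidden_dim, bias=False)
        self.w2 = nn.Linear(hidden_dim, out_dim, bias=False)

    def forward(self, x: torch.Tensor, a_norm: torch.Tensor) -> torch.Tensor:
        h = torch.relu(a_norm @ self.w1(x))   # first graph convolution
        h = torch.relu(a_norm @ self.w2(h))   # second graph convolution
        return h.mean(dim=0)                  # graph-level embedding


class SiameseGCN(nn.Module):
    """Shared-weight branches; similarity is the inner product of embeddings."""

    def __init__(self, in_dim: int, hidden_dim: int = 64, out_dim: int = 32):
        super().__init__()
        self.branch = GCNBranch(in_dim, hidden_dim, out_dim)

    def forward(self, x1, a1, x2, a2) -> torch.Tensor:
        e1 = self.branch(x1, normalize_adjacency(a1))
        e2 = self.branch(x2, normalize_adjacency(a2))
        return (e1 * e2).sum()                # learned similarity score


def hinge_pair_loss(score: torch.Tensor, label: float, margin: float = 1.0):
    """label = +1 for graphs of the same class, -1 otherwise."""
    return torch.clamp(margin - label * score, min=0.0)


if __name__ == "__main__":
    # Toy dimensions; e.g. rows of a connectivity matrix as node features.
    n_nodes, n_feats = 90, 90
    model = SiameseGCN(in_dim=n_feats)
    x1, x2 = torch.rand(n_nodes, n_feats), torch.rand(n_nodes, n_feats)
    a1, a2 = torch.rand(n_nodes, n_nodes), torch.rand(n_nodes, n_nodes)
    loss = hinge_pair_loss(model(x1, a1, x2, a2), label=1.0)
    loss.backward()                            # both branches share parameters
```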

Cited by 147 publications (97 citation statements)
References 12 publications (19 reference statements)
“…The ability to compute node embeddings has given rise to many applications, such as link prediction (Schlichtkrull et al, 2017), (semi-supervised) node classification in citation networks (Kipf and Welling, 2016a), performing logical reasoning tasks (Li et al, 2016b), finding correspondences across meshes (Monti et al, 2017) or point cloud segmentation (Chapter 4). Graph embeddings have also been useful in many tasks, for example measuring similarity of brain networks by metric learning (Ktena et al, 2017), suggesting heuristic moves for approximating NP-hard problems (Dai et al, 2017), predicting satisfaction of formal properties in computer programs (Li et al, 2016b), classifying chemical effects of molecules (Chapter 3) and regressing their physical properties (Gilmer et al, 2017), or point cloud classification (Chapter 3).…”
Section: Embedding Graphs and Their Nodes
confidence: 99%
“…Significant progress has been made using functional magnetic resonance imaging (fMRI) to characterize the brain remodeling in ASD [9]. Recently, emerging research on Graph Neural Networks (GNNs) has combined deep learning with graph representation and applied an integrated approach to fMRI analysis in different neuro-disorders [11]. Most existing approaches (based on Graph Convolutional Network (GCN) [10]) require all nodes in the graph to be present during training and thus lack natural generalization on unseen nodes.…”
Section: Introduction
confidence: 99%
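As background for the limitation quoted above (GCNs requiring all nodes during training), the snippet below is a minimal NumPy sketch of the standard GCN propagation rule from Kipf and Welling (2016); it is illustrative only, the toy matrices are made up, and it is not code from the cited works.

```python
# Sketch of the GCN propagation rule, showing why it is transductive: every
# layer multiplies by the normalised adjacency of the *whole* training graph,
# so adding unseen nodes means recomputing A_hat and re-propagating.
import numpy as np

def gcn_layer(a: np.ndarray, h: np.ndarray, w: np.ndarray) -> np.ndarray:
    """H' = ReLU(D^-1/2 (A + I) D^-1/2 H W), computed over the full graph."""
    a_hat = a + np.eye(a.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    a_norm = d_inv_sqrt[:, None] * a_hat * d_inv_sqrt[None, :]
    return np.maximum(a_norm @ h @ w, 0.0)

# Toy example: 5 nodes, 3 input features, 2 output features.
rng = np.random.default_rng(0)
a = (rng.random((5, 5)) > 0.5).astype(float)
a = np.maximum(a, a.T)             # symmetrise the adjacency
h = rng.random((5, 3))
w = rng.random((3, 2))
print(gcn_layer(a, h, w).shape)    # (5, 2): embeddings for training nodes only
```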
“…In this context, [15] and [20] achieved maximum accuracies of 70.4% and 70.86%, respectively, using as feature vectors either the similarity matrix or the mean time series for each of the 111 regions of the Harvard-Oxford atlas [12]. Importantly, another GCN-based work [19] reported high variability in the accuracy achieved across sites (50% to 90%). Finally, the highest accuracy in classifying participants in the ABIDE database so far was obtained by [18] by using an ensemble learning strategy on GCN.…”
Section: Prior Work and Our Contribution
confidence: 96%