2023
DOI: 10.1007/978-3-031-47637-2_1

Bridging Distinct Spaces in Graph-Based Machine Learning

Linlin Jia, Xiao Ning, Benoit Gaüzère, et al.

Abstract: Graph-based machine learning, encompassing Graph Edit Distances (GEDs), Graph Kernels, and Graph Neural Networks (GNNs), offers extensive capabilities and exciting potential. While each model possesses unique strengths for graph challenges, interrelations between their underlying spaces remain under-explored. In this paper, we introduce a novel framework for bridging these distinct spaces via GED cost learning. A supervised metric learning approach serves as an instance of this framework, enabling space alignm…
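
The abstract sketches the core idea: learn GED edit costs so that distances in the edit-distance space align with distances induced by another graph representation, such as a kernel or GNN embedding. The short Python sketch below is an illustration of that idea under strong simplifying assumptions, not the authors' implementation: graphs are reduced to (num_nodes, num_edges) pairs, the GED is approximated by a linear combination of crude edit-operation counts, and every function name is hypothetical.

# Illustrative sketch only: align a crude GED surrogate with a target
# graph-embedding metric by fitting non-negative edit costs.
# The linear-GED approximation and all names are assumptions, not the paper's method.
import numpy as np

def edit_operation_counts(g1, g2):
    # Graphs are simplified to (num_nodes, num_edges); the returned vector
    # counts node and edge insertions/deletions needed to match the sizes.
    n1, e1 = g1
    n2, e2 = g2
    return np.array([abs(n1 - n2), abs(e1 - e2)], dtype=float)

def fit_edit_costs(graphs, target_dist):
    # Least-squares fit of costs w so that w . counts(g_i, g_j) approximates
    # target_dist[i, j] for every pair; costs are clipped to stay non-negative.
    feats, targets = [], []
    for i in range(len(graphs)):
        for j in range(i + 1, len(graphs)):
            feats.append(edit_operation_counts(graphs[i], graphs[j]))
            targets.append(target_dist[i, j])
    w, *_ = np.linalg.lstsq(np.vstack(feats), np.array(targets), rcond=None)
    return np.clip(w, 0.0, None)

if __name__ == "__main__":
    graphs = [(4, 3), (6, 7), (5, 4), (9, 12)]          # toy (nodes, edges) pairs
    rng = np.random.default_rng(0)
    embed = rng.normal(size=(len(graphs), 3))           # stand-in embedding space
    target = np.linalg.norm(embed[:, None] - embed[None, :], axis=-1)
    print("learned edit costs (node, edge):", fit_edit_costs(graphs, target))

In the paper's framing, the target distances would come from a kernel- or GNN-induced space, and the learned costs would parameterize the GEDs that bridge to it; the supervised metric learning instance mentioned in the abstract presumably optimizes a richer cost model than this toy least-squares fit.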

Cited by 1 publication (1 citation statement); references 32 publications (44 reference statements).
“…In this paper, we provide theoretical insights on the pre-image problem and its resolution. This problem has been known for about 20 years [33], and is still of great interest nowadays, as shown recently in many frameworks: when dealing with nonlinear dictionary learning [40,31] and matrix completion [17]; when dealing with structured input spaces, such as in interpretable time series analytics [36], graph edit distances [23,24], representation learning on graphs [10] and structured prediction [15]; generative machines including generative kernel PCA [29] and multiview generation [30].…”
Section: Introduction
Mentioning confidence: 99%