2020
DOI: 10.1016/j.patrec.2018.03.031
Learning error-correcting graph matching with a multiclass neural network

Cited by 13 publications (4 citation statements) · References 21 publications
“…On the other hand, deletion costs are implicitly set to a specific value (that is to say 0) in the GM problem. Many learning methods aim at learning edit costs (Serratosa, 2020;Martineau et al, 2020) or matching similarities (Zanfir and Sminchisescu, 2018;Caetano et al, 2007). Learned matching similarities may include implicitly deletion and insertion costs.…”
Section: Discussion
confidence: 99%
“…Ref. [16] This study proposes an efficient method for solving the graph matching problem in a classification setting. Ref.…”
Section: RQ2: How To Perform Multiclass Classification Using Unlabele...
confidence: 99%
“…The comparison is restricted to the five datasets considered in this work, with a dash (-) indicating that a given dataset has not been tested in the literature on the corresponding model. Competitors span a variety of approaches for graph classification, including classifiers working on top of pure graph matching similarities [72], [74], kernel methods [77], [80] and several embedding techniques [75], [76], including GrC-based [5], [6], [33] and neural ones [78], [79]. As regards subsampling-based implementations, Table 1 reports the performances obtained at different subsampling rates as a min–max range.…”
Section: Comparison Against Current Approaches
confidence: 99%