2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2019.00010
Edge-Labeling Graph Neural Network for Few-Shot Learning

Abstract: In this paper, we propose a novel edge-labeling graph neural network (EGNN), which adapts a deep neural network on the edge-labeling graph, for few-shot learning. The previous graph neural network (GNN) approaches in few-shot learning have been based on the node-labeling framework, which implicitly models the intra-cluster similarity and the inter-cluster dissimilarity. In contrast, the proposed EGNN learns to predict the edge-labels rather than the node-labels on the graph that enables the evolution of an exp…
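The abstract describes EGNN's core idea: alternate between updating edge labels from node-pair similarity and updating node features by edge-weighted aggregation. The sketch below is a minimal illustration of that alternating scheme, not the paper's implementation; the cosine similarity measure, softmax normalization, and 0.5 mixing weight are illustrative assumptions.

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u)) or 1e-9
    nv = math.sqrt(sum(b * b for b in v)) or 1e-9
    return dot / (nu * nv)

def egnn_step(nodes):
    """One alternating update in the spirit of edge-labeling GNNs:
    first compute edge labels from node-pair similarity, then update
    node features by aggregating neighbors weighted by those labels."""
    n = len(nodes)
    # Edge update: similarity score for every ordered pair, softmax-normalized per row.
    new_edges = [[0.0] * n for _ in range(n)]
    for i in range(n):
        sims = [math.exp(cosine(nodes[i], nodes[j])) if j != i else 0.0
                for j in range(n)]
        z = sum(sims) or 1e-9
        for j in range(n):
            new_edges[i][j] = sims[j] / z
    # Node update: mix each node with its edge-weighted neighborhood aggregate.
    new_nodes = []
    for i in range(n):
        dim = len(nodes[i])
        agg = [sum(new_edges[i][j] * nodes[j][d] for j in range(n))
               for d in range(dim)]
        new_nodes.append([0.5 * a + 0.5 * b for a, b in zip(nodes[i], agg)])
    return new_nodes, new_edges
```

In the full model each update would be parameterized by a learned network and trained with an edge-label loss; here the updates are fixed functions purely to show the data flow.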

Cited by 408 publications (285 citation statements)
References 13 publications
“…Moreover, PRN designs a novel loss function, which takes both inter-class and intra-class distance into account. References [26] and [27] use graph neural networks (GNNs) to solve the few-shot recognition problem. According to the similarity between nodes, the GNN selectively propagates label information from labeled images to the test images most similar to them.…”
Section: B. Metric Learning-based Methods (mentioning)
confidence: 99%
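The statement above describes similarity-driven label propagation: the model passes a labeled image's information to the test image most similar to it. A minimal sketch of that idea, assuming cosine similarity as the node-affinity measure and a hard nearest-neighbor assignment (both illustrative choices, not the cited papers' method):

```python
import math

def propagate_label(support, labels, query):
    """Assign the query the label of its most similar support example.
    support: list of feature vectors; labels: parallel list of labels;
    query: a single feature vector."""
    def cos(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u)) or 1e-9
        nv = math.sqrt(sum(b * b for b in v)) or 1e-9
        return dot / (nu * nv)
    sims = [cos(s, query) for s in support]
    # Hard propagation: label flows only along the strongest edge.
    return labels[sims.index(max(sims))]
```

A real GNN would propagate soft, learned messages over many edges and iterations; this hard one-hop version only shows the similarity-gated direction of the flow.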
“…With the strong abstracting power of deep neural networks [8,13,15,16,22], several deep HDA methods have been developed to push the performance boundary of HDA tasks. The Weakly-Shared Deep Transfer Network (DTN) [28] is one of the early attempts in the deep HDA area; it constructs multiple weakly-shared layers in the neural networks.…”
Section: Related Work (mentioning)
confidence: 99%
“…GAM (Stretcu et al. 2019) uses an agreement model that calculates the probability of two nodes sharing the same label on the graph. EGNN (Kim et al. 2019) adapts a deep neural network to predict the edge-labels rather than the node-labels on the graph. However, most of these methods are not suitable for online learning, because they require predefined topological information or repeated passes over the whole training data.…”
Section: Related Work (mentioning)
confidence: 99%