2021
DOI: 10.1109/tkde.2020.2971490
Network Embedding With Completely-Imbalanced Labels

Abstract: Network embedding, aiming to project a network into a low-dimensional space, is increasingly becoming a focus of network research. Semi-supervised network embedding takes advantage of labeled data, and has shown promising performance. However, existing semi-supervised methods would get unappealing results in the completely-imbalanced label setting where some classes have no labeled nodes at all. To alleviate this, we propose two novel semi-supervised network embedding methods. The first one is a shallow method…


Cited by 43 publications (41 citation statements)
References 53 publications
“…The recommendation and one-click collection functions are also provided in the system. In the future, we will design more applications by incorporating other fields [34][35][36][37] based on the system.…”
Section: Discussion
confidence: 99%
“…Another well-known work is LINE, which preserves both first-order proximity (i.e., the similarity between linked nodes) and second-order proximity (i.e., the similarity between nodes with shared neighbors) of a network. In addition, researchers have also proposed some deep learning-based embedding models, such as SDNE [12]. Moreover, [17] and RECT [18] further consider the problem of zero-shot graph embedding, i.e., the completely-imbalanced label setting.…”
Section: Related Work
confidence: 99%
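The first- and second-order proximities mentioned in the statement above can be illustrated on a toy graph. This is a minimal sketch, not code from the paper or from LINE: the adjacency matrix is an assumed example, and second-order proximity is approximated here as cosine similarity between adjacency rows (one common way to measure neighborhood overlap).

```python
import numpy as np

# Toy undirected graph as an adjacency matrix (assumed example).
# Edges: 0-1, 0-2, 1-2, 2-3.
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

# First-order proximity: similarity between directly linked nodes,
# i.e., the edge weight itself.
first_order = A

# Second-order proximity: similarity between nodes' neighborhoods,
# measured here as cosine similarity between adjacency rows.
norms = np.linalg.norm(A, axis=1, keepdims=True)
second_order = (A @ A.T) / (norms @ norms.T)

# Nodes 0 and 3 are not linked (zero first-order proximity),
# but they share neighbor 2, so their second-order proximity is nonzero.
print(first_order[0, 3])    # 0.0
print(second_order[0, 3])   # 1/sqrt(2) ~ 0.707
```

The contrast between the two entries is exactly what the quoted passage describes: first-order proximity only sees direct edges, while second-order proximity captures similarity through shared neighbors.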
“…GG-NN [25] designs a gate mechanism-based aggregation function, which provides a weighted average of the messages from the neighbors and the center node. Besides, some works [26, 27] consider imbalanced nodes in representation learning. Although these GNNs can characterize node structure and learn node representations, most of them suffer from the over-smoothing problem.…”
Section: Related Work
confidence: 99%
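The gate-based weighted average described in the statement above can be sketched as follows. This is an illustrative simplification, not GG-NN itself (which uses GRU-style updates): the gate matrix `W_gate` and the mean-message aggregation are assumptions made for the sketch.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_aggregate(h_center, h_neighbors, W_gate):
    """Weighted average of the neighbor message and the center-node state,
    with the weight produced by a learned gate (simplified illustration)."""
    # Mean message from the neighbors.
    m = h_neighbors.mean(axis=0)
    # Gate computed from the concatenated center state and message.
    z = sigmoid(W_gate @ np.concatenate([h_center, m]))
    # Elementwise convex combination: z weighs the neighbor message,
    # (1 - z) weighs the center node's own state.
    return z * m + (1.0 - z) * h_center

rng = np.random.default_rng(0)
d = 4
h_center = rng.normal(size=d)
h_neighbors = rng.normal(size=(3, d))   # three neighbor states
W_gate = rng.normal(size=(d, 2 * d))    # hypothetical gate parameters

h_new = gated_aggregate(h_center, h_neighbors, W_gate)
print(h_new.shape)  # (4,)
```

Because the gate output lies in (0, 1), each coordinate of the new state is a convex combination of the neighbor message and the center state, which is the "weighted average" behavior the quote attributes to GG-NN.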