2020
DOI: 10.1609/aaai.v34i04.5911

Learning Signed Network Embedding via Graph Attention

Abstract: Learning the low-dimensional representations of graphs (i.e., network embedding) plays a critical role in network analysis and facilitates many downstream tasks. Recently graph convolutional networks (GCNs) have revolutionized the field of network embedding, and led to state-of-the-art performance in network analysis tasks such as link prediction and node classification. Nevertheless, most of the existing GCN-based network embedding methods are proposed for unsigned networks. However, in the real world, some o…

Cited by 92 publications (48 citation statements). References 21 publications (37 reference statements).
“…[34] proposes an analysis that provides insights into better extracting and fusing information from the protein-protein interaction network for drug repurposing. Furthermore, [7,14,15,20,35] designed representation learning models [29,36] that are able to preserve both positive and negative link information within signed graphs. However, these methodologies lack the capability to fully exploit the structural and attribute information that lies in signed bipartite graphs.…”
Section: GNNs on Bipartite Graph
confidence: 99%
“…Considering interactions between positive and negative edges jointly is another main inspiration for our method, but SSSNET is not driven by such social balance theory principles. Many other GNNs [26,25,34,11,32,55] are also based on social balance theory, usually applied to data with strong positive class imbalance. Numerous other signed network embedding methods [9,10,50,28,53] also do not explore the node clustering problem.…”
Section: Related Work 2.1 Network Embedding and Clustering
confidence: 99%
“…The main novelty of our approach is a new take on the role of social balance theory for signed network embeddings. The standard heuristic for justifying the criteria for the embeddings hinges on the assumption that "an enemy's enemy is a friend" [53,10,17,34,25,26]. This heuristic is based on social balance theory [22,42], or multiplicative distrust propagation as in [20], which asserts that in a social network a triangle is balanced if either all three nodes are friends or two friends have a common enemy; otherwise it is viewed as unbalanced.…”
Section: Introduction
confidence: 99%
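The balance criterion quoted above reduces to a one-line check: a signed triangle is balanced exactly when the product of its three edge signs is positive, i.e. it contains an even number of negative edges. The following minimal Python sketch illustrates this rule; it is an illustration only, not code from the paper above or any of the citing works, and the function name and the ±1 sign encoding are assumptions.

```python
# Illustrative sketch (assumed encoding): edge signs are +1 (friend) or -1 (enemy).
# Social balance theory: a triangle is balanced iff the product of its edge
# signs is positive, i.e. it has an even number of negative edges.

def is_balanced_triangle(sign_ab: int, sign_bc: int, sign_ca: int) -> bool:
    return sign_ab * sign_bc * sign_ca > 0

# Two friends sharing a common enemy: balanced ("an enemy's enemy is a friend").
assert is_balanced_triangle(+1, -1, -1)
# A single negative edge among three friends: unbalanced.
assert not is_balanced_triangle(+1, +1, -1)
```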
“…Recently, with the wide adoption of data mining and machine learning across various domains, techniques for signed network analysis have evolved from observations and measurements to mining tasks [8,23,28]. Among these techniques, signed network embedding [4,9,12,17,25], which learns a low-dimensional vector representation for each node in a signed network, is one of the most promising, for it can automatically extract features and can employ various kinds of data mining and machine learning models to perform signed network analysis tasks such as link sign prediction.…”
Section: Introduction
confidence: 99%
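To make the downstream use of such embeddings concrete, the sketch below shows one common way link sign prediction is set up on top of node embeddings: concatenate the two endpoint vectors of an edge and train a binary classifier on the edge sign. This is a hedged illustration under assumptions, not the pipeline of the paper above or of any cited method; the random matrix `Z` merely stands in for learned embeddings, and the toy edge list and logistic-regression classifier are choices made here for brevity.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
num_nodes, dim = 100, 16
Z = rng.normal(size=(num_nodes, dim))   # stand-in for learned node embeddings

# Toy signed edge list: (u, v, sign) with sign in {+1, -1}.
edges = [(rng.integers(num_nodes), rng.integers(num_nodes), rng.choice([-1, 1]))
         for _ in range(500)]

# Edge features: concatenation of the two endpoint embeddings.
X = np.array([np.concatenate([Z[u], Z[v]]) for u, v, _ in edges])
y = np.array([1 if s > 0 else 0 for _, _, s in edges])

clf = LogisticRegression(max_iter=1000).fit(X, y)
print("training accuracy:", clf.score(X, y))
```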
“…Some earlier approaches derive normalized spectral analysis [29], adopt the log-bilinear model [10], or treat both positive and negative neighbors the same during the aggregation process [16]. Later, graph neural networks [14] were introduced [4,17] to aggregate information from neighbor nodes.…”
Section: Introduction
confidence: 99%
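Where such GNNs differ from unsigned ones is in keeping positive and negative neighborhoods apart during aggregation. The PyTorch sketch below illustrates that idea with a simplified attention scheme: positive and negative neighbors are attended over separately and the two messages are concatenated. It is an assumption-laden illustration, not the attention layer of the AAAI 2020 paper above; every name (`signed_attention_layer`, `W_pos`, `a_pos`, ...) and the per-node scoring used for the attention logits are hypothetical simplifications.

```python
import torch
import torch.nn.functional as F

def signed_attention_layer(h, pos_adj, neg_adj, W_pos, W_neg, a_pos, a_neg):
    """h: [N, d] node features; pos_adj / neg_adj: [N, N] 0/1 neighbor masks."""
    def aggregate(W, a, adj):
        z = h @ W                                           # project features: [N, d']
        scores = z @ a                                      # per-node score: [N]
        logits = scores.unsqueeze(0) + scores.unsqueeze(1)  # pairwise logits: [N, N]
        logits = logits.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(logits, dim=1)                # attention over neighbors
        alpha = torch.nan_to_num(alpha)                     # rows with no neighbors of this sign
        return alpha @ z                                    # weighted message: [N, d']
    m_pos = aggregate(W_pos, a_pos, pos_adj)
    m_neg = aggregate(W_neg, a_neg, neg_adj)
    return F.relu(torch.cat([m_pos, m_neg], dim=1))         # [N, 2*d']

# Example call with random data.
N, d, d_out = 5, 8, 4
h = torch.randn(N, d)
pos = (torch.rand(N, N) > 0.5).float()
neg = (torch.rand(N, N) > 0.5).float()
out = signed_attention_layer(h, pos, neg,
                             torch.randn(d, d_out), torch.randn(d, d_out),
                             torch.randn(d_out), torch.randn(d_out))
print(out.shape)  # torch.Size([5, 8])
```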