The World Wide Web Conference 2019
DOI: 10.1145/3308558.3313446
NetSMF: Large-Scale Network Embedding as Sparse Matrix Factorization

Abstract: We study the problem of large-scale network embedding, which aims to learn latent representations for network mining applications. Previous research shows that 1) popular network embedding benchmarks, such as DeepWalk, are in essence implicitly factorizing a matrix with a closed form, and 2) the explicit factorization of such a matrix generates more powerful embeddings than existing methods. However, directly constructing and factorizing this matrix, which is dense, is prohibitively expensive in terms of both time…
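For context, the sketch below is a minimal illustration (not the authors' code) of why explicitly constructing and factorizing this closed-form matrix is expensive, assuming the NetMF formulation with window size T and negative-sampling parameter b: even when the adjacency matrix is sparse, the summed walk matrix is a dense n × n object, so memory grows as O(n²) and a full SVD as roughly O(n³). The function name and defaults are illustrative.

```python
# Minimal sketch (not the authors' code) of the dense closed-form matrix behind
# DeepWalk, assuming the NetMF formulation:
#   M = log(max(vol(G)/(bT) * (sum_{r=1..T} P^r) D^{-1}, 1)),  P = D^{-1} A.
import numpy as np

def netmf_dense_embedding(A, T=10, b=1.0, dim=128):
    """Dense NetMF-style embedding: O(n^2) memory, roughly O(n^3) time, hence not scalable."""
    n = A.shape[0]
    vol = A.sum()                       # total edge weight (volume of the graph)
    d = A.sum(axis=1)                   # node degrees (assumes no isolated nodes)
    D_inv = np.diag(1.0 / d)
    P = D_inv @ A                       # random-walk transition matrix
    S = np.zeros((n, n))
    P_r = np.eye(n)
    for _ in range(T):                  # sum of the first T transition-matrix powers
        P_r = P_r @ P
        S += P_r
    M = (vol / (b * T)) * S @ D_inv     # closed-form matrix: dense n x n even if A is sparse
    M = np.log(np.maximum(M, 1.0))      # truncated element-wise logarithm
    U, sigma, _ = np.linalg.svd(M)      # explicit factorization of the dense matrix
    return U[:, :dim] * np.sqrt(sigma[:dim])
```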

Cited by 151 publications (134 citation statements)
References 39 publications
“…The practicality is greatly restricted. To retain the performance of the NetMF algorithm while reducing its demand for computing resources, NetSMF proposes to sparsify the NetMF matrix [88], keeping the spectrum of the sparsified matrix close to that of the original, and then factorizing the sparse matrix. Experiments show that this method effectively improves the computational efficiency of the algorithm while preserving the spectral information of the network.…”
Section: Knowledge Graph Completion Methods Based on Network Representation (mentioning)
confidence: 99%
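To make the sparsification idea concrete, here is a rough, hedged sketch (not NetSMF's actual sparsifier, whose path sampling comes with spectral approximation guarantees): sample short random-walk paths to build a sparse surrogate of the NetMF matrix, apply the truncated logarithm only to its non-zeros, and factorize it with a sparse truncated SVD. The sampling weights, function name, and parameters below are assumptions made for illustration only.

```python
# Rough illustration (not NetSMF's actual algorithm): a sparse surrogate of the
# dense NetMF matrix built from sampled random-walk paths, then a truncated SVD.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import svds

def sparse_embedding_by_path_sampling(A, T=10, samples=100_000, b=1.0, dim=128, seed=0):
    """A: scipy.sparse CSR adjacency matrix (no isolated nodes assumed)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    deg = np.asarray(A.sum(axis=1)).ravel()
    vol = deg.sum()
    indptr, indices = A.indptr, A.indices
    rows, cols, data = [], [], []
    for _ in range(samples):
        u = rng.integers(n)                        # start node (uniform, for illustration)
        r = rng.integers(1, T + 1)                 # walk length in {1, ..., T}
        v = u
        for _ in range(r):                         # r-step unweighted random walk
            nbrs = indices[indptr[v]:indptr[v + 1]]
            v = rng.choice(nbrs)
        rows.append(u)
        cols.append(v)
        data.append(vol / (b * samples * deg[v]))  # crude importance weight (assumption)
    S = sp.csr_matrix((data, (rows, cols)), shape=(n, n))  # duplicates are summed
    S.data = np.log(np.maximum(S.data, 1.0))       # truncated log on non-zeros only
    U, sigma, _ = svds(S, k=dim)                   # factorize the sparse matrix
    return U * np.sqrt(sigma)
```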
“…Most previous matrix-based network embedding methods emphasize the importance of high-order proximity but skip the element-wise normalization step for either better scalability or ease of analysis [25,26,38]. Is normalization of the node similarity matrix important?…”
Section: Similarity Matrix Construction (mentioning)
confidence: 99%
“…However, GraRep is not scalable due to the high time complexity of both raising the transition matrix A to higher powers and taking the element-wise logarithm of A^k, which is a dense n × n matrix. A few recent works thus propose to speed up the construction of such a node similarity matrix [25,26,39], inspired by spectral graph theory. The basic idea is that if the top-h eigendecomposition of A is given by A = U_h Λ_h U_h^⊤, then A^k can be approximated with U_h Λ_h^k U_h^⊤ [26,39].…”
Section: Related Work (mentioning)
confidence: 99%
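A small sketch of that approximation, with illustrative names and defaults: given the top-h eigenpairs of a symmetric matrix A, the k-th power is recovered as U_h Λ_h^k U_h^⊤ without ever materializing the dense A^k.

```python
# Sketch of the eigendecomposition-based power approximation referenced above.
# Names and defaults are illustrative, not taken from any cited paper.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

def approx_power(A, k, h=128):
    """Approximate A^k from the top-h (largest-magnitude) eigenpairs of symmetric sparse A; requires h < n."""
    lam, U = eigsh(A, k=h)            # Lanczos eigensolver on the sparse matrix
    return (U * lam ** k) @ U.T       # U_h diag(lam^k) U_h^T

# Usage: compare against the exact power on a small random symmetric matrix.
n = 200
M = sp.random(n, n, density=0.05, random_state=0)
A = ((M + M.T) / 2).tocsr()           # symmetrize so eigsh applies
approx = approx_power(A, k=3, h=50)
exact = np.linalg.matrix_power(A.toarray(), 3)
print(np.linalg.norm(approx - exact) / np.linalg.norm(exact))
```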
“…DeepWalk [27], LINE [35], and node2vec [10] are pioneering works that introduce deep learning techniques into network analysis to learn node embeddings. NetMF [29] gives a theoretical analysis of the equivalence among these network embedding algorithms, and NetSMF [28] later provides a scalable solution via sparsification. Nevertheless, they were designed to handle only homogeneous networks with single-typed nodes and edges.…”
Section: Introduction (mentioning)
confidence: 99%