Algorithms for Large-Scale Network Analysis and the NetworKit Toolkit (2022)
DOI: 10.1007/978-3-031-21534-6_1

Abstract: The abundance of massive network data in a plethora of applications makes scalable analysis algorithms and software tools necessary to generate knowledge from such data in reasonable time. Addressing scalability as well as other requirements such as good usability and a rich feature set, the open-source software NetworKit has established itself as a popular tool for large-scale network analysis. This chapter provides a brief overview of the contributions to NetworKit made by the SPP 1736. Algorithmic contribut…
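As a rough illustration of the kind of usage the abstract describes, here is a minimal, hedged NetworKit sketch; the generated graph and the chosen analysis kernels are arbitrary placeholders, not taken from the chapter.

```python
# Minimal sketch of typical NetworKit usage; the graph size and generator
# parameters are illustrative only (assumes `pip install networkit`).
import networkit as nk

# Generate a random graph as a stand-in for real network data.
G = nk.generators.ErdosRenyiGenerator(100_000, 0.0001).generate()

print(G.numberOfNodes(), G.numberOfEdges())

# Two common analysis kernels: connected components and degree centrality.
cc = nk.components.ConnectedComponents(G)
cc.run()
print("components:", cc.numberOfComponents())

dc = nk.centrality.DegreeCentrality(G)
dc.run()
print("top-5 by degree:", dc.ranking()[:5])
```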

Help me understand this report

Search citation statements

Order By: Relevance

Paper Sections

Select...
1
1
1
1

Citation Types

0
2
0

Year Published

2023
2023
2024
2024

Publication Types

Select...
4
1
1

Relationship

0
6

Authors

Journals

citations
Cited by 6 publications
(4 citation statements)
references
References 74 publications
0
2
0
Order By: Relevance
“…In Table 1, the implemented hyperparameter values and associated Python packages are detailed. The NetworKit [48] package was used for the heuristic-based link prediction methods. The experiments in this study were conducted on a MacBook Pro equipped with an Apple M1 chip and 8 GB of memory.…”
Section: Selection of Link Prediction Methods (mentioning)
confidence: 99%
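The cited study uses NetworKit's heuristic link predictors. Below is a hedged sketch of how such predictors can be invoked, assuming the networkit.linkprediction classes (CommonNeighborsIndex, JaccardIndex, AdamicAdarIndex) and their run(u, v) scoring method; the toy graph and node pair are illustrative, not the cited study's setup.

```python
# Hedged sketch of heuristic link prediction with networkit.linkprediction;
# the graph and the candidate pair are placeholders.
import networkit as nk

G = nk.generators.ErdosRenyiGenerator(1_000, 0.01).generate()

predictors = {
    "common_neighbors": nk.linkprediction.CommonNeighborsIndex(G),
    "jaccard": nk.linkprediction.JaccardIndex(G),
    "adamic_adar": nk.linkprediction.AdamicAdarIndex(G),
}

u, v = 0, 1  # score a single candidate node pair
for name, predictor in predictors.items():
    print(name, predictor.run(u, v))
```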
“…This adopted approach is, to the best of the authors' knowledge, novel. Data preprocessing was performed using a combination of the NetworKit [48] and NetworkX [64] Python packages.…”
Section: Description of Data Pre-processing Techniques (mentioning)
confidence: 99%
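As a rough illustration of mixing the two packages in a preprocessing step, the sketch below converts between NetworkX and NetworKit graphs using NetworKit's nxadapter helpers; the toy graph and the cleanup step are illustrative, not the cited study's pipeline.

```python
# Hedged sketch of combining NetworkX and NetworKit for preprocessing.
import networkx as nx
import networkit as nk

# Build and clean the graph with NetworkX (convenient for small-scale wrangling).
nxG = nx.karate_club_graph()
nxG.remove_nodes_from(list(nx.isolates(nxG)))  # example preprocessing step

# Convert to NetworKit for scalable analysis.
nkG = nk.nxadapter.nx2nk(nxG)
print(nkG.numberOfNodes(), nkG.numberOfEdges())

# Convert back if further NetworkX-based processing is needed.
nxG2 = nk.nxadapter.nk2nx(nkG)
```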
“…For instance, when we applied an exact betweenness centrality computation to a test subgraph containing 712,687 nodes and 1,263,320 edges, it took more than 10 h to compute. Consequently, we turned to the approximate computation method proposed by Riondato and Kornaropoulos [30], implemented through Networkit [31]. This method employs random sampling of shortest paths to achieve an approximate betweenness centrality computation.…”
Section: Dropapi (mentioning)
confidence: 99%
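For context, the sampling-based approximation referenced here is exposed in NetworKit as ApproxBetweenness, which follows the approach of Riondato and Kornaropoulos. The sketch below is a hedged example of invoking it; the generated graph and the epsilon/delta values are illustrative and not taken from the cited experiment.

```python
# Hedged sketch of approximate betweenness centrality in NetworKit.
import networkit as nk

G = nk.generators.BarabasiAlbertGenerator(10, 100_000).generate()

# epsilon bounds the absolute approximation error, delta the failure probability.
approx = nk.centrality.ApproxBetweenness(G, epsilon=0.01, delta=0.1)
approx.run()

print("top-10 by approximate betweenness:", approx.ranking()[:10])
```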
“…This ensures that the application nodes in the testing set do not appear in the training or validation sets, thereby preventing potential information leakage.…”
Tools and versions listed in the excerpt: [26] 3.3.5 (decompiled tool); Pytorch [43] 1.12.1 (Python package); Pytorch Geometric [28] 2.1.0.post1 (Python package); Networkit [31] 10.1 (Python package); Pandas [44] 1.5.2 (Python package); Matplotlib [45] 3.5.2 (Python package); Neo4j [27] 4.4.17 (database).
Section: Experiments Setup (mentioning)
confidence: 99%