2018
DOI: 10.1145/3298989
PowerLyra

Abstract: Natural graphs with skewed distributions raise unique challenges to distributed graph computation and partitioning. Existing graph-parallel systems usually use a “one-size-fits-all” design that uniformly processes all vertices, which either suffer from notable load imbalance and high contention for high-degree vertices (e.g., Pregel and GraphLab) or incur high communication cost and memory consumption even for low-degree vertices (e.g., PowerGraph and GraphX). In this article, we argue that skewed distribution…
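As background on the differentiated design the abstract argues for, the sketch below illustrates the general idea of degree-aware edge placement: in-edges of low-degree vertices are co-located by destination (edge-cut-like, low communication), while in-edges of high-degree vertices are spread by source (vertex-cut-like, balanced load). This is a minimal sketch only; the names `hybrid_cut` and `theta` are chosen here for illustration and this is not PowerLyra's actual implementation.

```python
# Illustrative sketch (not PowerLyra's code): assign edges to partitions with a
# degree threshold, treating low-degree and high-degree destinations differently.
from collections import defaultdict

def hybrid_cut(edges, num_parts, theta=100):
    """Assign each directed edge (src, dst) to a partition.

    Low-degree dst: hash by dst, so all in-edges of a low-degree vertex land
    on one partition. High-degree dst: hash by src, spreading the in-edges of
    the hub across partitions.
    """
    in_degree = defaultdict(int)
    for _, dst in edges:
        in_degree[dst] += 1

    assignment = {}
    for src, dst in edges:
        if in_degree[dst] <= theta:          # low-degree vertex
            assignment[(src, dst)] = hash(dst) % num_parts
        else:                                # high-degree (hub) vertex
            assignment[(src, dst)] = hash(src) % num_parts
    return assignment

# Toy usage: a small star graph (vertex 0 is the hub) plus a short chain.
edges = [(i, 0) for i in range(1, 6)] + [(5, 6), (6, 7)]
print(hybrid_cut(edges, num_parts=2, theta=3))
```

The threshold `theta` stands in for the degree cut-off that separates the two regimes; real systems tune it and add the replication and synchronization machinery omitted here.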

Cited by 88 publications (8 citation statements)
References 81 publications

Citation statements:
“…Cut-off points of .01, .05, and .08 have been suggested, corresponding to excellent, good, and mediocre fits, respectively (MacCallum, Browne and Sugawara, 1996); confidence intervals should be used to understand the size of sampling error (the upper bound should preferably be < .1). For many of our models presented in this paper, the RMSEA index is only in the range of "good"; however, this is mostly due to the fact that, irrespective of the size of the data, scales with less than 10 items give inflated RMSEA values, especially if the scale has relatively high factor-loadings (Chen et al., 2019).…”
Section: SEM Evaluation Criteria (mentioning)
confidence: 92%
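For context on the fit index discussed in this excerpt, a common textbook form of the RMSEA point estimate (not stated in the quoted text; included here only as background) is

\[
\widehat{\mathrm{RMSEA}} \;=\; \sqrt{\max\!\left(\frac{\chi^2 - df}{df\,(N-1)},\; 0\right)},
\]

where \(\chi^2\) is the model chi-square, \(df\) its degrees of freedom, and \(N\) the sample size. The \(df\) in the denominator gives one intuition for the quoted point: short scales yield few degrees of freedom, which can elevate the estimate even for modest misfit.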
“…Static graph-partitioning algorithms require access to the global data of the graph before partitioning, and typically employ spectral methods, heuristic methods, or multi-level partitioning methods to partition the graph data. For instance, the classic algorithm Metis [18] utilizes a multi-level partitioning strategy, while other methods, such as NE [4] and PowerLyra [19], use vertex-cut and hybrid-cut, respectively. By contrast, dynamic graph partitioning, also known as streaming graph partitioning, does not require access to the global data of the graph before partitioning.…”
Section: Balanced Graph Partitioning (mentioning)
confidence: 99%
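To make the static-versus-streaming contrast in the excerpt above concrete, here is a minimal Python sketch of streaming edge assignment that touches each edge once and never needs the whole graph in advance. The function and variable names are mine, and this is a simplified greedy heuristic, not the exact algorithm of any of the cited systems.

```python
# Minimal sketch: place each edge as it arrives, using only state seen so far.
from collections import defaultdict

def streaming_greedy_vertex_cut(edge_stream, num_parts):
    replicas = defaultdict(set)   # vertex -> partitions that already hold a copy
    load = [0] * num_parts        # edges assigned to each partition so far

    for u, v in edge_stream:
        # Prefer partitions that already hold u or v (fewer new replicas);
        # break ties by current load to keep partitions balanced.
        candidates = (replicas[u] & replicas[v]) \
            or (replicas[u] | replicas[v]) \
            or set(range(num_parts))
        p = min(candidates, key=lambda i: load[i])
        replicas[u].add(p)
        replicas[v].add(p)
        load[p] += 1
        yield (u, v), p

# Toy usage: six edges streamed into two partitions.
edges = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (0, 5)]
for edge, part in streaming_greedy_vertex_cut(edges, num_parts=2):
    print(edge, "->", part)
```

A static partitioner such as Metis instead reads the entire graph first and optimizes a global objective, which is why it cannot be used in this one-pass setting.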
“…Distributed GNNs are still in their infancy [2], with only a few prior works on GPU-based systems. Compared to distributed systems for large graph analysis [6,10,11,28,32], the communication overhead in distributed GNNs is even more challenging, since the intensive data movement among workers to fetch neighbor embeddings is expensive. Currently, most existing systems use a centralized architecture.…”
Section: Related Work (mentioning)
confidence: 99%
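As a rough back-of-the-envelope illustration of the communication cost this excerpt refers to (the helper below and its parameters are hypothetical, not taken from any of the cited systems): in a single GNN aggregation layer, every edge whose endpoints are stored on different workers forces one neighbor-embedding transfer.

```python
def count_remote_fetches(edges, owner, embedding_dim=128, bytes_per_float=4):
    """owner maps each vertex to the worker that stores its embedding.

    Returns (number of cross-worker edges, bytes moved in one aggregation
    layer if each such edge ships one dense embedding).
    """
    remote = sum(1 for u, v in edges if owner[u] != owner[v])
    return remote, remote * embedding_dim * bytes_per_float

# Toy usage: four vertices split across two workers; only edge (1, 2) crosses.
owner = {0: 0, 1: 0, 2: 1, 3: 1}
edges = [(0, 1), (1, 2), (2, 3)]
print(count_remote_fetches(edges, owner))   # -> (1, 512)
```

Classic graph analytics typically ships a small scalar per cut edge, whereas a GNN ships a dense feature vector, which is one way the same cut size translates into far more traffic.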