Proceedings of the ACM Web Conference 2022
DOI: 10.1145/3485447.3511982
Fograph: Enabling Real-Time Deep Graph Inference with Fog Computing

Cited by 10 publications (4 citation statements)
References 52 publications
“…Distributed GNN systems. To support efficient GNN processing, a few frameworks have been developed in both research [6], [36]- [41] and industry communities [11], [12], [27], [42]- [44] to optimize performance at different levels. Towards general kernel-level optimizations, several works [12], [44], [45] endeavor to abstract a user-friendly programming model, and then apply a set of techniques under the unified interfaces by exploiting the execution primitives of GNN models.…”
Section: B. Related Work
confidence: 99%
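The kernel-level abstraction described above — a user-friendly programming model built on the execution primitives of GNN models — typically decomposes a message-passing layer into a gather (aggregate neighbor features) and an apply (update node state) step. Below is a minimal, self-contained sketch of that decomposition; all function names and the scalar-weight update are illustrative assumptions, not the interface of any cited framework.

```python
# Hypothetical gather/apply primitives for one GNN message-passing layer.
# Graph is an adjacency dict {node: [neighbors]}; features are lists of floats.

def gather(adj, feats):
    """Sum each node's neighbor feature vectors (the aggregation primitive)."""
    return {
        v: [sum(feats[u][d] for u in nbrs) for d in range(len(feats[v]))]
        for v, nbrs in adj.items()
    }

def apply_update(feats, agg, weight):
    """Combine self and aggregated features (scalar weight stands in for a
    learned transformation), then apply a ReLU nonlinearity."""
    return {
        v: [max(0.0, feats[v][d] + weight * agg[v][d])
            for d in range(len(feats[v]))]
        for v in feats
    }

def gnn_layer(adj, feats, weight=0.5):
    """One layer expressed purely in terms of the two primitives."""
    return apply_update(feats, gather(adj, feats), weight)

# Tiny triangle-shaped graph with 2-dimensional node features.
adj = {0: [1, 2], 1: [0], 2: [0]}
feats = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [1.0, 1.0]}
out = gnn_layer(adj, feats)
```

A framework exposing such primitives can apply optimizations (kernel fusion, sparse-dense scheduling) beneath this unified interface without changing user code.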
“…data collection) cannot be omitted. To unleash the architectural benefits of edge computing, Fograph [41] first investigates GNN processing with vicinal fog servers, and proposes a distributed inference system for real-time serving. Yet it merely pursues performance indicators on latency and throughput, while a comprehensive cost model for DGPE is still lacking, demanding a new formulation for analysis.…”
Section: B. Related Work
confidence: 99%
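The distributed serving setup this excerpt describes — GNN inference spread across vicinal fog servers — hinges on partitioning the graph so that most neighbor aggregation is local, with boundary nodes fetched over the network. The sketch below illustrates that cost structure under stated assumptions (round-robin partitioning, scalar features, and a counter in place of real network transfers); it is not Fograph's actual partitioning or cost model.

```python
# Illustrative sketch: one distributed aggregation step across fog servers.
# Each node is owned by one server; aggregating a neighbor owned elsewhere
# counts as a remote fetch (a stand-in for a cross-server transfer).

def partition_nodes(nodes, num_servers):
    """Assign nodes to servers round-robin (a deliberately naive scheme)."""
    return {v: v % num_servers for v in nodes}

def distributed_aggregate(adj, feats, num_servers=2):
    """Sum-aggregate neighbor features, counting cross-partition fetches."""
    owner = partition_nodes(list(adj), num_servers)
    remote_fetches = 0
    agg = {}
    for v, nbrs in adj.items():
        total = 0.0
        for u in nbrs:
            if owner[u] != owner[v]:
                remote_fetches += 1  # would be a network round-trip
            total += feats[u]
        agg[v] = total
    return agg, remote_fetches

# 4-node path-like graph with scalar features, split over 2 servers.
adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}
feats = {0: 1.0, 1: 2.0, 2: 3.0, 3: 4.0}
agg, fetches = distributed_aggregate(adj, feats)
```

A cost model of the kind the citing paper calls for would weigh exactly these remote fetches (communication) against per-server compute, rather than optimizing latency and throughput alone.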
“…Firstly, available resources may be limited. It is common to perform heavy operations in the fog such as ML inference [7] and real-time video analysis [8]. Hence, such…”
Section: Introduction
confidence: 99%