2012
DOI: 10.48550/arxiv.1204.6078
Preprint
Distributed GraphLab: A Framework for Machine Learning in the Cloud

Abstract: While high-level data parallel frameworks, like MapReduce, simplify the design and implementation of large-scale data processing systems, they do not naturally or efficiently support many important data mining and machine learning algorithms and can lead to inefficient learning systems. To help fill this critical void, we introduced the GraphLab abstraction which naturally expresses asynchronous, dynamic, graph-parallel computation while ensuring data consistency and achieving a high degree of parallel perform…
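The abstraction described in the abstract — asynchronous, dynamic, graph-parallel computation — can be illustrated with a small sketch. The code below is a single-threaded toy, not GraphLab's actual API: a vertex update function (here a PageRank step) runs over each vertex's neighborhood, and a vertex reschedules its neighbors only when its own value changes significantly, which is the "dynamic" scheduling idea. All names (`dynamic_pagerank`, `adj`, etc.) are illustrative.

```python
from collections import deque

def dynamic_pagerank(adj, damping=0.85, tol=1e-6):
    """Toy sketch of dynamic, graph-parallel scheduling in the GraphLab
    spirit: run a vertex update over each vertex's neighborhood and
    reschedule out-neighbors only when the value changes by more than tol.
    `adj[u]` lists the out-neighbors of vertex u (no dangling-node fixup)."""
    n = len(adj)
    # in_nbrs[v] = vertices linking to v; the PageRank update reads these
    in_nbrs = [[] for _ in range(n)]
    for u, outs in enumerate(adj):
        for v in outs:
            in_nbrs[v].append(u)
    rank = [1.0 / n] * n
    sched = deque(range(n))          # dynamic schedule of "dirty" vertices
    in_sched = [True] * n
    while sched:
        v = sched.popleft()
        in_sched[v] = False
        new = (1 - damping) / n + damping * sum(
            rank[u] / len(adj[u]) for u in in_nbrs[v] if adj[u])
        if abs(new - rank[v]) > tol:  # only significant changes propagate
            rank[v] = new
            for w in adj[v]:          # reschedule affected out-neighbors
                if not in_sched[w]:
                    in_sched[w] = True
                    sched.append(w)
    return rank
```

On a symmetric 3-cycle every vertex converges to rank 1/3; on asymmetric graphs the ranks still sum to 1 (absent dangling nodes), but most vertices are updated only a handful of times — the point of dynamic scheduling over fixed-iteration, bulk-synchronous execution.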

Cited by 9 publications (12 citation statements). References 17 publications (22 reference statements).
“…Incorporating graph technology and the abundance of dissimilar graph datasets has assisted in building quite sophisticated graph analytics tools. Despite the effectiveness of conventional graph analysis approaches such as GraphX [30], Gephi [31], and GraphLab [32], to name a few, graph embedding has notably improved the efficiency of graph analytics by converting the graph to a low-dimensional semantic space, so that information can be represented as vectors, leading to computational efficiency. Several efforts have been conducted to incorporate KG embeddings to address numerous NLP challenges.…”
Section: NLP Applications Using KGE
confidence: 99%
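The citation above contrasts graph analytics tools with graph embedding, which maps each node to a low-dimensional vector. A minimal sketch of that idea, assuming a plain adjacency matrix and a simple spectral approach (truncated SVD) rather than a learned KG embedding such as TransE; `embed_nodes` and `cosine` are hypothetical names:

```python
import numpy as np

def embed_nodes(adj_matrix, dim=2):
    """Sketch of graph embedding: project each node into a dim-dimensional
    vector space via truncated SVD of the adjacency matrix. Modern KG
    embeddings learn such vectors by optimization instead, but the payoff
    is the same: graph structure becomes vectors amenable to fast math."""
    u, s, _ = np.linalg.svd(adj_matrix, full_matrices=False)
    return u[:, :dim] * s[:dim]      # row i is the vector for node i

def cosine(a, b):
    """Cosine similarity between two node vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```

Nodes with identical neighborhoods receive (near-)identical vectors, so neighborhood similarity reduces to cheap vector similarity — the "computational efficiency" the quoted passage refers to.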
“…Many distributed graph processing systems such as Pregel [25], GraphLab [24], and PowerGraph [17], have been proposed to…”
Section: Graph Preprocessing and Transformation
confidence: 99%
“…Sparsity is a key enabler of future Artificial Intelligence [3][4][5]. Sparsity enables fast and energy-efficient training and inference in various domains [6][7][8][9].…”
Section: Introduction
confidence: 99%