2022
DOI: 10.48550/arxiv.2201.07402
Preprint

Flexible Parallel Learning in Edge Scenarios: Communication, Computational and Energy Cost

Abstract: Traditionally, distributed machine learning takes the guise of (i) different nodes training the same model (as in federated learning), or (ii) one model being split among multiple nodes (as in distributed stochastic gradient descent). In this work, we highlight how fog- and IoT-based scenarios often require combining both approaches, and we present a framework for flexible parallel learning (FPL), achieving both data and model parallelism. Further, we investigate how different ways of distributing and parallelizing …
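
The abstract describes combining data parallelism (several nodes, each training on its own data shard, with federated-style averaging) and model parallelism (one model split into stages across nodes). Below is a minimal NumPy sketch of that combination; the two-stage linear model, the shard/stage names, and the averaging scheme are illustrative assumptions, not the paper's actual FPL framework.

```python
# Hypothetical sketch: data parallelism (4 workers, each with its own
# data shard, gradients averaged as in federated learning) combined
# with model parallelism (a model split into two stages that could
# live on different nodes). Toy setup, not the FPL implementation.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data, split into shards (one per "edge node").
X = rng.normal(size=(400, 8))
true_w = rng.normal(size=(8, 1))
y = X @ true_w + 0.01 * rng.normal(size=(400, 1))
shards = np.array_split(np.arange(400), 4)  # 4 data-parallel workers

# Model parallelism: two linear stages, conceptually on two nodes.
W1 = 0.1 * rng.normal(size=(8, 4))   # stage 1 (node A)
W2 = 0.1 * rng.normal(size=(4, 1))   # stage 2 (node B)
lr = 0.05

for step in range(200):
    g1_sum, g2_sum = np.zeros_like(W1), np.zeros_like(W2)
    for idx in shards:                       # each worker: its own shard
        xb, yb = X[idx], y[idx]
        h = xb @ W1                          # forward pass on node A
        pred = h @ W2                        # forward pass on node B
        err = pred - yb                      # MSE error term
        g2 = h.T @ err / len(idx)            # backward on node B
        g1 = xb.T @ (err @ W2.T) / len(idx)  # backward on node A
        g1_sum += g1
        g2_sum += g2
    # Federated-style averaging of the per-worker gradients.
    W1 -= lr * g1_sum / len(shards)
    W2 -= lr * g2_sum / len(shards)

loss = float(np.mean((X @ W1 @ W2 - y) ** 2))
print(f"final MSE: {loss:.5f}")
```

In this sketch the inner loop plays the data-parallel role (each shard contributes an averaged gradient) while the split into W1 and W2 plays the model-parallel role (each stage's forward and backward step could run on a different node, exchanging only activations and activation gradients).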

Cited by 0 publications
References 15 publications (24 reference statements)