2021
DOI: 10.1007/978-981-16-3420-8_2

Basics of Distributed Machine Learning

Cited by 3 publications (11 citation statements)
References 39 publications
“…Importance sampling for efficient training. Sampling-based methods [42,43,14,30] aim to identify a compact yet representative subset of the training dataset that satisfies the original objectives for efficient model training. This is usually achieved by exploring the early training stage [43], constructing a proxy model [14], or utilizing the consistency score (C-score) [30].…”
Section: Related Work
confidence: 99%
“…Sampling-based methods [42,43,14,30] aim to identify a compact yet representative subset of the training dataset that satisfies the original objectives for efficient model training. This is usually achieved by exploring the early training stage [43], constructing a proxy model [14], or utilizing the consistency score (C-score) [30]. However, the empirical performance gap between sampling-based methods and the baseline approach of random selection is insignificant, particularly on large-scale datasets like ImageNet-1k (See Fig.…”
Section: Related Work
confidence: 99%
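As a rough illustration of the subset-selection idea described in these citation statements, here is a minimal sketch in Python. It assumes per-example importance scores (e.g., losses from a small proxy model or losses averaged over the early epochs of training, the two strategies mentioned in the quoted passage) have already been computed; the names `select_coreset` and `proxy_losses` are hypothetical and do not come from the cited works [14,30,42,43].

```python
import numpy as np

def select_coreset(scores: np.ndarray, keep_fraction: float = 0.1) -> np.ndarray:
    """Return indices of the highest-scoring training examples.

    `scores` holds one importance score per example, e.g. the loss
    assigned by a proxy model or an early-training loss average.
    Higher score = more important, so we keep the top fraction.
    """
    k = max(1, int(len(scores) * keep_fraction))
    # argsort is ascending, so the last k indices are the k highest scores
    return np.argsort(scores)[-k:]

# Toy usage: score 1,000 examples with random stand-in "losses" and keep 10%.
rng = np.random.default_rng(0)
proxy_losses = rng.random(1000)
subset_idx = select_coreset(proxy_losses, keep_fraction=0.1)
print(len(subset_idx))  # 100
```

The sketch only shows the selection step; in the sampling-based methods discussed above, the quality of the result hinges on how the scores are produced, and, as the second quoted passage notes, such selection can fail to beat random subsets on large-scale datasets like ImageNet-1k.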