2021
DOI: 10.1007/s10994-021-06014-6
MODES: model-based optimization on distributed embedded systems

Abstract: The predictive performance of a machine learning model highly depends on the corresponding hyper-parameter setting. Hence, hyper-parameter tuning is often indispensable. Normally such tuning requires the dedicated machine learning model to be trained and evaluated on centralized data to obtain a performance estimate. However, in a distributed machine learning scenario, it is not always possible to collect all the data from all nodes due to privacy concerns or storage limitations. Moreover, if data has to be tr…
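The abstract's core idea, tuning hyper-parameters by querying a cheap surrogate model instead of exhaustively training the real model, can be sketched as a minimal model-based optimization (MBO) loop. This is not the MODES algorithm itself: the one-dimensional objective, the distance-weighted surrogate, and all names below are illustrative assumptions standing in for the paper's actual surrogate and distributed evaluation.

```python
import random

# Toy objective: a stand-in for validation loss as a function of a single
# hyper-parameter. In a distributed setting this value would come from
# training/evaluating on the nodes; here it is a known analytic function.
def objective(x):
    return (x - 0.3) ** 2

def surrogate_predict(history, x):
    # Distance-weighted average of observed losses: a crude surrogate
    # standing in for the regression / Gaussian-process model that
    # model-based optimization normally fits to past evaluations.
    num = den = 0.0
    for xi, yi in history:
        w = 1.0 / (abs(x - xi) + 1e-6)
        num += w * yi
        den += w
    return num / den

def mbo(n_init=5, n_iter=20, seed=0):
    rng = random.Random(seed)
    # Initial design: a few random evaluations of the true objective.
    history = [(x, objective(x)) for x in (rng.random() for _ in range(n_init))]
    for _ in range(n_iter):
        # Propose candidates and evaluate only the one the surrogate
        # predicts to be best -- the expensive call stays rare.
        candidates = [rng.random() for _ in range(100)]
        x_next = min(candidates, key=lambda x: surrogate_predict(history, x))
        history.append((x_next, objective(x_next)))
    return min(history, key=lambda p: p[1])

best_x, best_y = mbo()
```

The design choice that matters here is the split between the cheap surrogate (queried 100 times per iteration) and the expensive objective (queried once per iteration); in MODES-like settings that asymmetry is what makes tuning feasible when each true evaluation means training a model across distributed nodes.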

Cited by 6 publications (3 citation statements); References 27 publications.
“…These not only discuss straightforward classification tasks but also attempt to solve other problems. Other subjects discussed include learning or sharing various forms of information in a privacy-preserving manner, such as: generating synthetic data that can be used for further analyses without fear of leaking privacy-sensitive data (Fioretto and Van Hentenryck, 2019), sharing labels (Martin and Zhu, 2021), learning hyperparameters (Shi et al, 2021), and fulfilling a segmentation task (Fay et al, 2020). Furthermore, there are publications focused on providing protection against various attacks (Cao et al, 2021).…”
Section: Federated Ensembles (mentioning; confidence: 99%)
“…Some scholars applied fully convolutional network technology to three-dimensional scanning data, combining the three-dimensional point dimension with the two-dimensional grid as a feature-extraction method by forming different 2D end-to-end fully convolutional networks over candidate regions, detecting vehicle targets and bounding boxes with good results. Scholars of vehicle technology proposed designing a feature convolution kernel library composed of multiple forms and color Gabor filters [11], training and screening the optimal feature-extraction convolution kernel group by replacing the low-level convolution kernel group of the original network, so as to improve detection accuracy [12]. In the upgrade from single-image detection to multi-target image detection, an adaptive threshold strategy is added to reduce the missed-alarm and false-alarm rates and realize target detection in complex traffic scenes [13].…”
Section: Introduction: The Introduction of Smart Internet of… (mentioning; confidence: 99%)
“…Evaluation of model-based optimization (MBO) on distributed embedded systems with two different datasets using two different machine-learning algorithms has been achieved. However, they require high-end graphics processing units (GPUs) (Shi et al, 2021). Recently, a neural-network implementation for resource-constrained embedded systems (ARM Cortex-M) has been reported, but it lacks on-board training capabilities.…”
Section: Introduction (mentioning; confidence: 99%)