2021
DOI: 10.3390/s21248435
Hyperparameter Optimization Techniques for Designing Software Sensors Based on Artificial Neural Networks

Abstract: Software sensors are playing an increasingly important role in current vehicle development. Such soft sensors can be based on both physical modeling and data-based modeling. Data-driven modeling builds a model purely from captured data, which means that no system knowledge is required for the application. At the same time, hyperparameters have a particularly large influence on the quality of the model. These parameters influence the architecture and the training process of the machine learning algorithm…

Cited by 12 publications (6 citation statements)
References 20 publications
“…Each model was trained with different optimizers by varying the learning rates, and the best-fit hyperparameters were chosen as the ones with the highest validation accuracy. Based on previous work in image classification, the four most suitable optimizers, namely Adam, RMSProp, SGD, and Nadam, were chosen for tuning each model [70, 71, 72]. The learning rate was varied from 10⁻² to 10⁻⁵ to determine the best fit.…”
Section: Methods
confidence: 99%
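The sweep described in this excerpt is straightforward to reproduce. Below is a minimal sketch in Keras, assuming a placeholder binary classifier and synthetic data (the cited work tunes its own image-classification models); it loops over the four optimizers and the 10⁻² to 10⁻⁵ learning-rate range and keeps the configuration with the highest validation accuracy.

```python
# Minimal sketch, not the cited pipeline: sweep four optimizers over the
# 1e-2 .. 1e-5 learning-rate range and keep the best validation accuracy.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20)).astype("float32")  # placeholder features
y = (X[:, 0] > 0).astype("int32")                  # placeholder labels

def build_model():
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(20,)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

optimizers = {
    "Adam": tf.keras.optimizers.Adam,
    "RMSProp": tf.keras.optimizers.RMSprop,
    "SGD": tf.keras.optimizers.SGD,
    "Nadam": tf.keras.optimizers.Nadam,
}
learning_rates = [1e-2, 1e-3, 1e-4, 1e-5]

best_config, best_acc = None, -1.0
for name, opt_cls in optimizers.items():
    for lr in learning_rates:
        model = build_model()
        model.compile(optimizer=opt_cls(learning_rate=lr),
                      loss="binary_crossentropy", metrics=["accuracy"])
        hist = model.fit(X, y, validation_split=0.2,
                         epochs=5, batch_size=32, verbose=0)
        val_acc = max(hist.history["val_accuracy"])
        if val_acc > best_acc:
            best_config, best_acc = (name, lr), val_acc

print("best (optimizer, lr):", best_config, "val_accuracy:", best_acc)
```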
“…Meanwhile, van de Wiel et al. [30] proposed fast hyperparameter tuning, and Meanti et al. [31] proposed efficient hyperparameter tuning for kernel ridge regression based on cross-validation of the data. Tuning hyperparameters of a neural network model using cross-validation data was carried out by Blume et al. [32] and by Lainder et al. [33], among others. Although there is controversy over the advantages and disadvantages of applying the cross-validation method to set model hyperparameters, the method is systematic and fair.…”
Section: Related Work
confidence: 99%
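As a concrete illustration of cross-validation-based tuning in the spirit of the works cited in this excerpt, here is a minimal sketch using scikit-learn's GridSearchCV with k-fold cross-validation on a kernel ridge regressor; the estimator, grid, and synthetic data are assumptions for illustration, not the setups of the cited papers.

```python
# Minimal sketch: choose hyperparameters by 5-fold cross-validation.
# The kernel ridge estimator and grid are illustrative, not the cited setups.
from sklearn.datasets import make_regression
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import GridSearchCV, KFold

X, y = make_regression(n_samples=200, n_features=10, noise=0.1, random_state=0)

grid = {"alpha": [1e-3, 1e-2, 1e-1, 1.0],   # regularization strength
        "gamma": [1e-2, 1e-1, 1.0]}         # RBF kernel width
search = GridSearchCV(KernelRidge(kernel="rbf"), grid,
                      cv=KFold(n_splits=5, shuffle=True, random_state=0),
                      scoring="neg_mean_squared_error")
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

Each candidate configuration is scored on held-out folds, so the selection is systematic and every configuration is judged on data it was not trained on, which is the fairness argument made in the excerpt.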
“…Those two techniques are effective in prediction and classification. The applications in [37], [68], [74], [80], [81], [83] used GS and RS as hyperparameter selection methods. GS explores all hyperparameter combinations in the search space, but it is expensive and inefficient when the search space is high-dimensional, while RS, as its name suggests, selects hyperparameters randomly and is considered the most efficient way of searching hyperparameter configurations [5], [96], [100].…”
Section: Multi-fidelity Algorithms [98]
confidence: 99%
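The trade-off described here, GS being exhaustive but combinatorially expensive and RS sampling the space at a fixed budget, can be sketched as follows; the SVC estimator, parameter ranges, and synthetic data are illustrative assumptions, not the setups of the cited applications.

```python
# Minimal sketch: grid search (exhaustive) vs. random search (fixed budget)
# over comparable spaces; the SVC estimator and ranges are assumptions.
from scipy.stats import loguniform
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

# GS enumerates every combination: 4 x 4 = 16 fits per CV fold.
gs = GridSearchCV(SVC(), {"C": [0.1, 1.0, 10.0, 100.0],
                          "gamma": [1e-3, 1e-2, 1e-1, 1.0]}, cv=5)
gs.fit(X, y)

# RS draws a fixed number of configurations from continuous distributions,
# which keeps the cost bounded even in high-dimensional search spaces.
rs = RandomizedSearchCV(SVC(), {"C": loguniform(1e-1, 1e2),
                                "gamma": loguniform(1e-3, 1e0)},
                        n_iter=16, cv=5, random_state=0)
rs.fit(X, y)

print("GS:", gs.best_params_, round(gs.best_score_, 3))
print("RS:", rs.best_params_, round(rs.best_score_, 3))
```

Note that GS's cost multiplies with every added hyperparameter, while RS's n_iter budget stays fixed regardless of dimensionality, which is why RS scales better to high-dimensional spaces.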
“…The second problem that occurs is falling into a local minimum; the most influential hyperparameter in this case is the momentum coefficient, as in the problem shown in [73]. In [4], [78], [81], [83], the batch size hyperparameter was tuned, since it determines the number of samples processed before the model is updated; the number of epochs, in turn, determines the complete passes through the whole training dataset, which matters for large datasets. Furthermore, the learning algorithm's dynamics are influenced by another important hyperparameter.…”
confidence: 99%
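A minimal sketch of tuning the two hyperparameters highlighted in this excerpt, batch size and the SGD momentum coefficient, is given below; the model, data, and value grids are placeholder assumptions for illustration.

```python
# Minimal sketch: tune batch size (samples per weight update) and the SGD
# momentum coefficient; model, data, and grids are placeholder assumptions.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 20)).astype("float32")
y = (X[:, 0] + X[:, 1] > 0).astype("int32")

results = {}
for batch_size in [16, 64, 256]:       # samples processed per model update
    for momentum in [0.0, 0.5, 0.9]:   # helps SGD escape shallow local minima
        model = tf.keras.Sequential([
            tf.keras.layers.Input(shape=(20,)),
            tf.keras.layers.Dense(32, activation="relu"),
            tf.keras.layers.Dense(1, activation="sigmoid"),
        ])
        model.compile(
            optimizer=tf.keras.optimizers.SGD(learning_rate=1e-2,
                                              momentum=momentum),
            loss="binary_crossentropy", metrics=["accuracy"])
        hist = model.fit(X, y, validation_split=0.2, epochs=5,
                         batch_size=batch_size, verbose=0)
        results[(batch_size, momentum)] = max(hist.history["val_accuracy"])

best = max(results, key=results.get)
print("best (batch_size, momentum):", best, "val_accuracy:", results[best])
```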