Proceedings of the 51st Hawaii International Conference on System Sciences 2018
DOI: 10.24251/hicss.2018.279
Multivariate Stochastic Approximation to Tune Neural Network Hyperparameters for Critical Infrastructure Communication Device Identification

Abstract: The

Cited by 4 publications (22 citation statements). References 38 publications.
“…a coder experientially finding settings that "just work," or 3) random searches which use random seeds (notably a competitive method). Grid searches involve creating an experimental design where design points are explored and then one uses either a spreadsheet search or a response surface method to find suitable operating points [21].…”
Section: AI Hyperparameter Determination
confidence: 99%
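The snippet above contrasts seeded random search with grid search over a designed set of points. Below is a minimal sketch of both approaches, assuming a toy stand-in objective in place of an actual train-and-validate run; the hyperparameter names and ranges are illustrative only.

```python
# Minimal sketch: grid search vs. seeded random search for two
# hyperparameters. validation_score is a hypothetical stand-in for
# training a network and returning its validation accuracy.
import itertools
import random

def validation_score(learning_rate, hidden_units):
    # Toy objective peaking near lr=0.01 and 64 hidden units.
    return -(learning_rate - 0.01) ** 2 - ((hidden_units - 64) / 100) ** 2

# Grid search: exhaustively evaluate an experimental design of points.
lr_grid = [0.001, 0.005, 0.01, 0.05, 0.1]
units_grid = [16, 32, 64, 128]
best_grid = max(itertools.product(lr_grid, units_grid),
                key=lambda p: validation_score(*p))

# Random search: spend the same evaluation budget on randomly sampled
# points, with a fixed seed so the run is repeatable.
random.seed(42)
candidates = [(10 ** random.uniform(-3, -1), random.randint(16, 128))
              for _ in range(len(lr_grid) * len(units_grid))]
best_random = max(candidates, key=lambda p: validation_score(*p))

print("grid search best (lr, units):", best_grid)
print("random search best (lr, units):", best_random)
```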
“…
• Stochastic Approximation [21]: hill climbing where hyperparameters are individually and sequentially changed.
• Evolutionary algorithms [20]: start from random settings, select the best initial results (parents), generate multiple candidate outcomes (children), and repeat the process.
• Bayesian optimization (BO) [5]: treats the objective function as a random function and uses randomly determined hyperparameters to construct a distribution around the results.
• Other approaches that do not fit cleanly into these three groups, e.g. Radial Basis Functions [22], Hyperband [23], Nelder-Mead [24], and spectral approaches [25].…”
Section: AI Hyperparameter Determination
confidence: 99%
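As a concrete illustration of the stochastic-approximation family named first in the list above, here is a minimal SPSA-style sketch: it perturbs all hyperparameters simultaneously (a multivariate variant, rather than the one-at-a-time hill climbing the quote describes) and estimates a gradient from two noisy evaluations per iteration. The objective, gain schedules, and starting point are illustrative assumptions, not settings from the cited paper.

```python
# Minimal SPSA-style sketch for tuning two hyperparameters.
# loss() is a hypothetical noisy stand-in for a validation loss
# obtained by training a network at hyperparameters theta.
import random

def loss(theta):
    lr, units = theta
    return (lr - 0.01) ** 2 + ((units - 64) / 100) ** 2 + random.gauss(0, 1e-4)

theta = [0.05, 100.0]                # initial (learning rate, hidden units)
for k in range(1, 51):
    a_k = 0.1 / k                    # decaying step-size gain
    c_k = 0.05 / k ** 0.25           # decaying perturbation gain
    # Rademacher (+/-1) perturbation applied to every coordinate at once.
    delta = [random.choice((-1.0, 1.0)) for _ in theta]
    plus = [t + c_k * d for t, d in zip(theta, delta)]
    minus = [t - c_k * d for t, d in zip(theta, delta)]
    # Two loss evaluations estimate the whole gradient, no matter how
    # many hyperparameters are being tuned.
    y_plus, y_minus = loss(plus), loss(minus)
    g_hat = [(y_plus - y_minus) / (2 * c_k * d) for d in delta]
    theta = [t - a_k * g for t, g in zip(theta, g_hat)]

print("tuned hyperparameters (lr, units):", theta)
```

In practice, a discrete hyperparameter such as layer width is usually tuned on a continuous relaxation like this and rounded before each training run.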
“…Notably, GRLVQI and machine learning algorithms in general are highly sensitive to hyperparameter settings, such as learning rates and architecture size [28]. Although work has considered finding optimal settings for such algorithms, e.g., GRLVQI-SD, a GRLVQI variant whose hyperparameters were optimized via stochastic optimization with sequential design of experiments [28], such approaches are computationally costly, with dozens of iterations needed to obtain improved algorithm settings. Additionally, such highly tuned hyperparameter values are often specific to the scope of the data and thus not usable on other datasets.…”
Section: Classification Algorithms for RF Fingerprinting
confidence: 99%
“…Classifier models were developed for the Z-wave dataset using four classifiers: (1) the proposed Cosine GRLVQI (Section 3.2), (2) MDA, (3) the baseline GRLVQI of Harmer et al [6], and (4) the GRLVQI-SD of Bihl and Steeneck [28]. Consistent with [24,62], for these classifiers, the following process of Algorithm 1 was employed for each SNR.…”
Section: Classifier Algorithm Performance
confidence: 99%
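The quoted passage repeats the same fit-and-score procedure at every SNR. The referenced Algorithm 1 is not reproduced here, so the following is only a hedged per-SNR evaluation scaffold; the load_fingerprints loader is hypothetical, and scikit-learn's LDA stands in for the MDA and GRLVQI classifiers, which have no standard library implementations.

```python
# Hedged scaffold: train and score each classifier at every SNR.
# Data and classifier choices are illustrative placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

def load_fingerprints(snr_db):
    # Hypothetical loader: RF-fingerprint features and device labels
    # for signals collected (or noise-degraded) at a given SNR.
    rng = np.random.default_rng(snr_db)
    return rng.normal(size=(200, 10)), rng.integers(0, 4, size=200)

classifiers = {"LDA (stand-in)": LinearDiscriminantAnalysis()}

for snr_db in range(0, 21, 3):       # sweep an assumed SNR range in dB
    X, y = load_fingerprints(snr_db)
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, random_state=0)
    for name, clf in classifiers.items():
        clf.fit(X_tr, y_tr)
        acc = accuracy_score(y_te, clf.predict(X_te))
        print(f"SNR {snr_db:2d} dB  {name}: accuracy {acc:.3f}")
```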