Proceedings of the 23rd International Systems and Software Product Line Conference - Volume A 2019
DOI: 10.1145/3336294.3336306
Automated Search for Configurations of Convolutional Neural Network Architectures

Abstract: Deep Neural Networks (DNNs) are intensively used to solve a wide variety of complex problems. Although powerful, such systems require manual configuration and tuning. To this end, we view DNNs as configurable systems and propose an end-to-end framework that allows the configuration, evaluation, and automated search of DNN architectures. Our contribution is threefold. First, we model the variability of DNN architectures with a Feature Model (FM) that generalizes over existing architectures. …
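The abstract's core idea is to treat a DNN architecture as a product of a feature model: each feature offers alternative values, cross-tree constraints prune invalid combinations, and every remaining configuration corresponds to an admissible architecture. A minimal sketch of that idea, with entirely hypothetical feature names and constraints (not the paper's actual FM or the FeatureNet tool):

```python
from itertools import product

# Hypothetical, simplified feature model for CNN architectures:
# each feature lists its alternative values.
FEATURES = {
    "conv_blocks": [1, 2, 3],
    "kernel_size": [3, 5],
    "pooling": ["max", "avg", "none"],
    "dense_units": [64, 128],
}

def is_valid(cfg):
    # Example cross-tree constraint (illustrative only): deep stacks
    # must use pooling so spatial resolution shrinks before the dense layer.
    if cfg["conv_blocks"] >= 3 and cfg["pooling"] == "none":
        return False
    return True

def enumerate_configs(features):
    # Cartesian product of all feature values, filtered by constraints;
    # each surviving dict is one admissible architecture configuration.
    names = list(features)
    for values in product(*(features[n] for n in names)):
        cfg = dict(zip(names, values))
        if is_valid(cfg):
            yield cfg

configs = list(enumerate_configs(FEATURES))
print(len(configs))  # count of admissible configurations
```

In the paper's setting the search is over far larger, constraint-rich spaces (handled with dedicated FM reasoning tools rather than brute-force enumeration), but the mapping "valid FM configuration → admissible DNN structure" is the same.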

Cited by 9 publications (8 citation statements) · References 34 publications
“…Both achieve +80% accuracy with Adam Optimizer, data-augmentation and 50 training epochs. The remaining 100 models were produced by FeatureNet [38], a neural architecture search tool that can generate a predefined number of models while maximizing diversity.…”
Section: B. Experiments Subjects (mentioning confidence: 99%)
“…Applications include specifying software product lines [1,20] and predicting systems' performance [18,19]. Conversely, variability management techniques can support the design of machine learning models [8].…”
Section: Introduction (mentioning confidence: 99%)
“…In Stein et al. [62], an automated Efficient Global Optimization (EGO) method for exploring CNN settings is introduced, with an emphasis, within image classification, on minimizing prediction error. To estimate the optimal DNN settings, Ghamizi et al. [63] suggested using a Feature Model (FM), where each admissible configuration of the FM leads to an admissible DNN structure.…”
Section: Introduction (mentioning confidence: 99%)