“…In essence, the design steps of a neural network architecture that might otherwise be carried out by hand by an engineer or graduate student are instead automated and optimized within a well-defined search space of reasonable layers, connections, outputs, and hyperparameters. In fact, architecture search can itself be framed in terms of hyperparameters [12] or as a graph search problem [27,19,2,24]. Furthermore, once a search space is defined, various tools can be brought to bear on the problem, including Bayesian optimization [16], other neural networks [1], reinforcement learning, evolution [21,20], or a wide variety of optimization frameworks.…”
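The idea of casting architecture design as search over a defined space of hyperparameters can be sketched concretely. The following is a minimal, hypothetical illustration (not any cited method): a toy search space over layer counts, widths, activations, and learning rates, explored by simple random search in place of the Bayesian, reinforcement-learning, or evolutionary optimizers mentioned above. The space, the `evaluate` callback, and all names here are assumptions for illustration; a real system would train and validate each candidate network.

```python
import random

# Hypothetical search space: each hyperparameter maps to its candidate values.
SEARCH_SPACE = {
    "num_layers": [2, 4, 8],
    "hidden_units": [64, 128, 256],
    "activation": ["relu", "tanh"],
    "learning_rate": [1e-2, 1e-3, 1e-4],
}

def sample_architecture(space, rng):
    """Draw one candidate by picking a value for every hyperparameter."""
    return {name: rng.choice(values) for name, values in space.items()}

def random_search(space, evaluate, budget=20, seed=0):
    """Evaluate `budget` random candidates and return the best one found.

    `evaluate` stands in for training the candidate network and measuring
    validation performance, which dominates the cost in practice.
    """
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(budget):
        cfg = sample_architecture(space, rng)
        score = evaluate(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```

Any of the optimizers named in the passage can replace the uniform sampler here: they differ only in how the next candidate is proposed given the scores observed so far, while the search-space definition stays fixed.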