Deep learning has enabled remarkable progress in recent years on a variety of tasks, such as image recognition, speech recognition, and machine translation. One crucial aspect of this progress is the design of novel neural architectures. The architectures currently employed have mostly been developed manually by human experts, a time-consuming and error-prone process. Because of this, there is growing interest in automated neural architecture search methods. We provide an overview of existing work in this field of research and categorize it along three dimensions: search space, search strategy, and performance estimation strategy.
Neural Architecture Search aims at automatically finding neural network architectures that are competitive with architectures designed by human experts. While recent approaches have achieved state-of-the-art predictive performance for, e.g., image recognition, they are problematic under resource constraints for two reasons: (1) the neural architectures found are solely optimized for high predictive performance, without penalizing excessive resource consumption; (2) most architecture search methods require vast computational resources. We address the first shortcoming by proposing LEMONADE, an evolutionary algorithm for multi-objective architecture search that allows approximating the entire Pareto front of architectures under multiple objectives, such as predictive performance and number of parameters, in a single run of the method. We address the second shortcoming by proposing a Lamarckian inheritance mechanism for LEMONADE which generates child networks that are warm-started with the predictive performance of their trained parents. This is accomplished by using (approximate) network morphism operators for generating children. The combination of these two contributions allows finding models that are on par with or even outperform both hand-crafted and automatically designed networks.
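The core bookkeeping described above — maintaining a Pareto front over several objectives while mutated children enter the population — can be sketched as follows. This is a toy illustration, not LEMONADE's actual implementation: architectures are stand-in tuples of layer widths, the objectives are a crude proxy error and a parameter count, and `mutate` only mimics the growth behavior of a network morphism (it does not transfer trained weights). All names are illustrative assumptions.

```python
import random

def objectives(arch):
    """Two objectives, both minimized: a proxy error and a parameter count."""
    error = 1.0 / (1.0 + sum(arch))                        # wider -> "better"
    n_params = sum(a * b for a, b in zip(arch, arch[1:]))  # dense connections
    return (error, n_params)

def dominates(a, b):
    """a Pareto-dominates b: no worse in every objective, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(pop):
    """Keep only the non-dominated architectures in pop."""
    objs = [objectives(p) for p in pop]
    return [p for p, o in zip(pop, objs)
            if not any(dominates(q, o) for q in objs)]

def mutate(arch, rng):
    """Grow the parent: widen one layer or append one (morphism-like)."""
    arch = list(arch)
    if rng.random() < 0.5:
        arch[rng.randrange(len(arch))] += rng.randint(1, 8)
    else:
        arch.append(rng.randint(1, 16))
    return tuple(arch)

def search(generations=50, seed=0):
    """Evolve a Pareto front: sample a parent, mutate, re-filter the front."""
    rng = random.Random(seed)
    front = [(8, 8)]                      # initial parent architecture
    for _ in range(generations):
        parent = rng.choice(front)
        child = mutate(parent, rng)
        front = pareto_front(front + [child])
    return front
```

Because the two objectives trade off against each other (larger networks lower the proxy error but raise the parameter count), the front retains a spectrum of model sizes in a single run rather than a single "best" network.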
Let $\Omega_\varepsilon \subset \mathbb{R}^{M+1}$, $0 < \varepsilon \le 1$, be a net-shaped Lipschitz domain which collapses to a one-dimensional net as $\varepsilon \downarrow 0$. On $\Omega_\varepsilon$ we consider the equation $u_t = \Delta u$ with Neumann boundary conditions. We show under quite general conditions that the semiflows generated by this equation have a limit in a strong sense, the limit semiflow being generated by an abstract linear operator. Also, under an additional assumption, the eigenvalues and eigenfunctions of the corresponding operators converge. This allows us to apply the techniques in [14] to prove the convergence of the nonlinear semiflows generated by a reaction-diffusion equation on $\Omega_\varepsilon$ and the upper semicontinuity of their attractors at $\varepsilon = 0$. Our technique also allows us to treat the case in which $\Omega_\varepsilon$ is smooth and has holes which vanish to order at least $\varepsilon$ in all directions.
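For concreteness, the linear problem on the thin domain can be written as the following Neumann boundary-value problem (a restatement of the setting above, with $\nu$ denoting the outward unit normal on $\partial\Omega_\varepsilon$):

$$
u_t = \Delta u \quad \text{in } \Omega_\varepsilon, \qquad
\frac{\partial u}{\partial \nu} = 0 \quad \text{on } \partial\Omega_\varepsilon,
$$

whose semiflows are studied in the limit $\varepsilon \downarrow 0$.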
In this work we consider a dissipative reaction-diffusion equation in a $d$-dimensional thin domain shrinking to a one-dimensional segment and obtain good rates for the convergence of the attractors. To accomplish this, we use estimates on the convergence of inertial manifolds, as developed previously in [7], together with shadowing theory.