In this work, we present a novel approach for solving stochastic shape optimization problems. Our method extends the classical stochastic gradient method to infinite-dimensional shape manifolds. We prove convergence of the method on Riemannian manifolds and then make the connection to shape spaces. The method is demonstrated on a model shape optimization problem from interface identification. Uncertainty arises in the form of a random partial differential equation, where the underlying probability distributions of the random coefficients and inputs are assumed to be known. We verify the convergence conditions for the model problem and demonstrate the method numerically.
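To illustrate the core idea, a minimal sketch of a Riemannian stochastic gradient step on a simple finite-dimensional manifold (the unit sphere, as a toy stand-in for a shape manifold) might look as follows. All names, the toy objective, and the noise model are illustrative assumptions, not taken from the paper: each iteration samples a stochastic gradient, projects it onto the tangent space, takes a diminishing step, and retracts back onto the manifold.

```python
import numpy as np

def riemannian_sgd(a, n_steps=5000, seed=0):
    """Minimize E[f(x; xi)] with f(x; xi) = -<a + xi, x> over the unit sphere.

    The minimizer of the expected objective is a / ||a||. This is a toy
    illustration of the manifold SGD pattern, not the paper's algorithm.
    """
    rng = np.random.default_rng(seed)
    x = np.array([1.0, 0.0, 0.0])            # initial point on the sphere
    for k in range(1, n_steps + 1):
        xi = 0.1 * rng.standard_normal(3)     # stochastic perturbation
        euclid_grad = -(a + xi)               # gradient of the sampled objective
        # Riemannian gradient: project onto the tangent space at x
        riem_grad = euclid_grad - np.dot(euclid_grad, x) * x
        step = 1.0 / k                        # diminishing steps (Robbins-Monro)
        x = x - step * riem_grad
        x /= np.linalg.norm(x)                # retraction back onto the sphere
    return x

a = np.array([0.0, 3.0, 4.0])
x_star = riemannian_sgd(a)                    # approaches a / ||a|| = (0, 0.6, 0.8)
```

The diminishing step sizes play the same role as in the classical Euclidean stochastic gradient method; the projection and retraction are what change when the iterates are constrained to a manifold.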
We propose a second-order total generalized variation (TGV) regularization for the reconstruction of the initial condition in variational data assimilation problems. After showing the equivalence between TGV regularization and a Bayesian MAP estimator, we focus on a detailed study of the inviscid Burgers' data assimilation problem. Due to the difficult structure of the governing hyperbolic conservation law, we consider a discretize-then-optimize approach and rigorously derive a first-order optimality condition for the problem. For the numerical solution, we propose a globalized reduced Newton-type method together with a polynomial line-search strategy, and prove convergence of the algorithm to stationary points. The paper concludes with numerical experiments in which, among other things, the performance of TGV regularization is compared with that of TV regularization.
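The globalization idea behind a Newton-type method with a line search can be sketched on a small smooth test problem. The sketch below uses Armijo backtracking for simplicity (a polynomial line search would instead fit a low-order polynomial model to pick the step length, but the overall structure is the same); the Rosenbrock objective and all function names are illustrative assumptions, not the paper's data assimilation problem.

```python
import numpy as np

def newton_linesearch(f, grad, hess, x0, tol=1e-10, max_iter=100):
    """Globalized Newton method: Newton direction + backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)      # Newton direction
        if np.dot(g, d) >= 0.0:               # not a descent direction:
            d = -g                            # fall back to steepest descent
        t, fx, slope = 1.0, f(x), np.dot(g, d)
        while f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5                          # backtrack until Armijo holds
        x = x + t * d
    return x

# Rosenbrock test problem; the minimizer is (1, 1).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
hess = lambda x: np.array([[2 - 400 * (x[1] - x[0]**2) + 800 * x[0]**2,
                            -400 * x[0]],
                           [-400 * x[0], 200.0]])
x_opt = newton_linesearch(f, grad, hess, [-1.2, 1.0])
```

The line search is what turns the locally convergent Newton iteration into a globally convergent scheme: far from a stationary point the step length is damped, while near the solution the full Newton step is accepted and fast local convergence is recovered.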