Optimization problems constrained by partial differential equations arise widely in science and engineering, in particular in problems of design. The solution of such PDE-constrained optimization problems is usually a major computational task. Here we consider simple problems of this type: distributed control problems in which the two- and three-dimensional Poisson problem is the PDE. The large linear systems that result from discretization and need to be solved are of saddle-point type. We introduce two optimal preconditioners for these systems that lead to convergence of symmetric Krylov subspace iterative methods in a number of iterations that does not increase with the dimension of the discrete problem. These preconditioners are block structured and involve standard multigrid cycles. The optimality of the preconditioned iterative solver is proved theoretically and verified computationally in several test cases. The theoretical proof indicates that these approaches may have much broader applicability to other partial differential equations.
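The clustering that makes such block preconditioners "optimal" can be seen on a small generic saddle-point system. The sketch below is not the paper's multigrid preconditioner; it uses the classical ideal block-diagonal preconditioner diag(A, S) with the exact Schur complement S = B A⁻¹ Bᵀ, for which the preconditioned matrix has only the three eigenvalues 1 and (1 ± √5)/2, so a method like MINRES converges in at most three iterations. The matrices here are random illustrative data, not from a PDE discretization.

```python
import numpy as np
from scipy.linalg import eigh, solve

rng = np.random.default_rng(0)
n, m = 8, 4
G = rng.standard_normal((n, n))
A = G @ G.T + n * np.eye(n)        # SPD (1,1) block
B = rng.standard_normal((m, n))    # full-row-rank constraint block

# Saddle-point matrix K = [[A, B^T], [B, 0]]
K = np.block([[A, B.T], [B, np.zeros((m, m))]])

# Ideal block-diagonal preconditioner P = diag(A, S), S = B A^{-1} B^T
S = B @ solve(A, B.T)
P = np.block([[A, np.zeros((n, m))], [np.zeros((m, n)), S]])

# Generalized eigenvalues of K x = lambda P x take only three values:
# 1 (multiplicity n - m) and (1 +/- sqrt(5))/2 (multiplicity m each).
lam = eigh(K, P, eigvals_only=True)
targets = np.array([(1 - np.sqrt(5)) / 2, 1.0, (1 + np.sqrt(5)) / 2])
dist = np.min(np.abs(lam[:, None] - targets[None, :]), axis=1)
```

In practice the exact Schur complement is replaced by a spectrally equivalent approximation (e.g. multigrid cycles, as in the abstract above), which spreads the eigenvalues into small intervals rather than three points but keeps the iteration count bounded.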
In this paper we investigate the possibility of using a block-triangular preconditioner for saddle point problems arising in PDE-constrained optimization. In particular, we focus on a conjugate gradient-type method introduced by Bramble and Pasciak that uses the self-adjointness of the preconditioned system in a nonstandard inner product. We show that when the Chebyshev semi-iteration is used as a preconditioner for the relevant matrix blocks involving the finite element mass matrix, the main drawback of the Bramble-Pasciak method, the appropriate scaling of the preconditioners, is easily overcome. We present an eigenvalue analysis for the block-triangular preconditioners that gives convergence bounds in the nonstandard inner product, and we illustrate the competitiveness of this approach on a number of computed examples.
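The Chebyshev semi-iteration mentioned above works well for mass matrices because the eigenvalues of the diagonally scaled mass matrix lie in a known, mesh-independent interval; for 1D linear elements, eig(D⁻¹M) ⊂ [1/2, 3/2]. A minimal sketch, assuming the classical Golub-Varga formulation applied to a 1D P1 mass matrix (the test problem is illustrative, not taken from the paper):

```python
import numpy as np

n = 50
h = 1.0 / (n + 1)
# 1D P1 finite element mass matrix (interior nodes): (h/6) * tridiag(1, 4, 1)
M = (h / 6) * (4 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1))
D = np.diag(np.diag(M))

# Jacobi iteration x_{k+1} = H x_k + g; since eig(D^{-1}M) is in [1/2, 3/2],
# the iteration matrix H = I - D^{-1}M has spectrum in [-1/2, 1/2].
rho = 0.5
b = np.ones(n)
g = np.linalg.solve(D, b)
H = np.eye(n) - np.linalg.solve(D, M)

x_exact = np.linalg.solve(M, b)

# Chebyshev semi-iteration: accelerate the Jacobi sweeps with weights
# omega_{k+1} = 2 T_k(1/rho) / (rho T_{k+1}(1/rho)).
y_prev = np.zeros(n)
y = H @ y_prev + g                  # one plain Jacobi step
t_prev, t_curr = 1.0, 1.0 / rho    # T_0 and T_1 evaluated at 1/rho
for _ in range(9):
    t_next = (2.0 / rho) * t_curr - t_prev
    omega = 2.0 * t_curr / (rho * t_next)
    y, y_prev = omega * (H @ y + g - y_prev) + y_prev, y
    t_prev, t_curr = t_curr, t_next

err = np.linalg.norm(y - x_exact) / np.linalg.norm(x_exact)
```

A fixed number of such steps is a fixed linear operator (the error polynomial is T_k(H/ρ)/T_k(1/ρ)), which is what makes it admissible as a preconditioner block with an a priori known quality, and hence removes the scaling guesswork in the Bramble-Pasciak method.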
Solving problems concerning the optimal control of partial differential equations (PDEs), also known as PDE-constrained optimization, is a frontier area of numerical analysis. Of particular interest is the problem of flow control, where one would like to effect some desired flow by exerting, for example, an external force. The bottleneck in many current algorithms is the solution of the optimality system, a system of equations in saddle point form that is usually very large and ill conditioned. In this paper we describe two preconditioners: a block diagonal preconditioner for the minimal residual method, and a block lower-triangular preconditioner for a nonstandard conjugate gradient method. These can be effective when applied to such problems where the PDEs are the Stokes equations. We consider only distributed control here, although we believe other problems could be treated in the same way. We give numerical results and compare these with those obtained by solving the equivalent forward problem using similar techniques.
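The appeal of the block lower-triangular variant can be illustrated on a small generic saddle-point system: with the exact Schur complement in the (2,2) position, the preconditioned matrix is unit upper triangular, so every eigenvalue equals 1 and (T - I)² = 0, giving convergence in at most two Krylov iterations. This is a hedged, generic sketch with random data, not the Stokes-control preconditioner of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 4
G = rng.standard_normal((n, n))
A = G @ G.T + n * np.eye(n)        # SPD (1,1) block
B = rng.standard_normal((m, n))    # full row rank

K = np.block([[A, B.T], [B, np.zeros((m, m))]])
S = B @ np.linalg.solve(A, B.T)    # exact Schur complement

# Block lower-triangular preconditioner P = [[A, 0], [B, -S]]
P = np.block([[A, np.zeros((n, m))], [B, -S]])

# Preconditioned matrix T = P^{-1} K = [[I, A^{-1}B^T], [0, I]]:
# all eigenvalues are 1 and (T - I)^2 = 0.
T = np.linalg.solve(P, K)
I = np.eye(n + m)
ev = np.linalg.eigvals(T)
nilpotency = np.linalg.norm((T - I) @ (T - I))
```

With an inexact Schur complement (as one must use for Stokes), T is no longer exactly of this form, but the eigenvalues remain clustered, which is what the nonstandard conjugate gradient method exploits.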
Projected Krylov methods are full-space formulations of Krylov methods that take place in a nullspace. Provided projections onto the nullspace can be computed accurately, these methods require only products between an operator and vectors lying in the nullspace. We provide systematic principles for obtaining the projected form of any well-defined Krylov method. Projected Krylov methods are mathematically equivalent to constraint-preconditioned Krylov methods provided the initial guess is well chosen, but they require less memory. As a consequence, there are situations where known methods such as MINRES and SYMMLQ remain well defined in the presence of an indefinite preconditioner.
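The basic mechanism can be sketched with projected CG for an equality-constrained quadratic program: project the operator and right-hand side onto null(B) and run CG there, so every iterate automatically satisfies the constraints. This is a minimal illustration with a dense orthogonal projector and random data; it is not the memory-efficient formulation developed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 10, 3
G = rng.standard_normal((n, n))
A = G @ G.T + n * np.eye(n)        # SPD Hessian block
B = rng.standard_normal((m, n))    # full-rank constraints
b = rng.standard_normal(n)

# Orthogonal projector onto null(B)
P = np.eye(n) - B.T @ np.linalg.solve(B @ B.T, B)

# Projected CG: solve P A P x = P b with every iterate in null(B).
x = np.zeros(n)
r = P @ b
p = r.copy()
for _ in range(n):
    Ap = P @ (A @ p)               # operator applied to a nullspace vector
    alpha = (r @ r) / (p @ Ap)
    x += alpha * p
    r_new = r - alpha * Ap
    if np.linalg.norm(r_new) < 1e-12:
        break
    p = r_new + ((r_new @ r_new) / (r @ r)) * p
    r = r_new

# On exit, B x = 0 and A x - b is orthogonal to null(B),
# i.e. x satisfies first-order optimality within the constraint set.
```

Note that only products A·p with p already in the nullspace are ever needed, which is the property the abstract highlights.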
We propose a variant of GMRES in which multiple (two or more) preconditioners are applied simultaneously while maintaining minimal-residual optimality properties. To accomplish this, a block version of Flexible GMRES is used; but instead of considering blocks associated with multiple right-hand sides, we consider a single right-hand side and grow the space by applying each of the preconditioners to all current search directions, minimizing the residual norm over the resulting larger subspace. To alleviate the rapidly increasing storage requirements, we present a heuristic limited-memory selective algorithm and demonstrate its effectiveness. Numerical results for problems in PDE-constrained optimization and fluid flow are presented, illustrating the viability and potential of the proposed method.
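The space-growing idea can be sketched directly: starting from the residual, apply every preconditioner to every current direction at each level, then minimize the residual over the span of everything collected. The sketch below realizes that minimization by a dense least-squares solve rather than the Arnoldi-based algorithm of the paper, and the two "preconditioners" (Jacobi and a shifted copy of the matrix) are hypothetical stand-ins chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 30
A = n * np.eye(n) + rng.standard_normal((n, n))   # well-conditioned test matrix
b = rng.standard_normal(n)

# Two illustrative preconditioners (stand-ins, not from the paper):
M1 = np.diag(np.diag(A))       # Jacobi
M2 = A + 0.5 * np.eye(n)       # plays the role of an approximate solve

def grow(dirs):
    """Apply every preconditioner to every current search direction."""
    return [np.linalg.solve(M, d) for d in dirs for M in (M1, M2)]

# Grow the multi-preconditioned space from r0 = b (taking x0 = 0):
# each level doubles the number of directions, as in block Flexible GMRES.
level = grow([b])
Z = list(level)
for _ in range(2):
    level = grow([A @ z for z in level])
    Z += level

# Minimize the residual norm over span(Z) by least squares.
Zm = np.column_stack(Z)
y, *_ = np.linalg.lstsq(A @ Zm, b, rcond=None)
x = Zm @ y
rel_res = np.linalg.norm(b - A @ x) / np.linalg.norm(b)
```

The exponential growth of the direction set visible here (2, 4, 8, ... columns) is exactly the storage problem that motivates the limited-memory selective variant in the abstract.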