We consider the minimization or maximization of the Jth largest eigenvalue of an analytic Hermitian matrix-valued function, building on Mengi et al. (2014, SIAM J. Matrix Anal. Appl., 35, 699-724). This work addresses the setting in which the matrix-valued function involved is very large. We describe subspace procedures that convert the original problem into a small-scale one by means of orthogonal projections and restrictions to certain subspaces, and that gradually expand these subspaces based on the optimal solutions of the small-scale problems. Global convergence and superlinear rate-of-convergence results with respect to the dimensions of the subspaces are presented in the infinite-dimensional setting, where the matrix-valued function is replaced by a compact operator depending on parameters. In practice, it suffices to solve eigenvalue optimization problems involving matrices with sizes on the scale of tens, instead of the original problem involving matrices with sizes on the scale of thousands.
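The project-solve-expand idea behind such subspace procedures can be sketched in a simple form. The following is an illustrative brute-force variant (largest eigenvalue, i.e. J = 1, minimized over a fixed grid of parameter values), with function and variable names of our own choosing; it mimics only the projection-and-expansion structure, not the paper's actual algorithm or its convergence machinery:

```python
import numpy as np

def lam_max(H):
    """Largest eigenvalue of a Hermitian matrix."""
    return np.linalg.eigvalsh(H)[-1]

def subspace_min_lam_max(A, omegas, maxiter=100, tol=1e-10):
    """Minimize omega -> lambda_max(A(omega)) over a grid by projection.

    `A` is a callable returning a (large) Hermitian matrix.  The small-scale
    problem min_omega lambda_max(V* A(omega) V) is solved by brute force on
    the grid, and V is expanded with the top eigenvector of the full matrix
    at the small-problem minimizer.  Illustrative sketch only.
    """
    _, X = np.linalg.eigh(A(omegas[0]))
    V = X[:, -1:]                      # seed subspace: one eigenvector
    for _ in range(maxiter):
        # small-scale (projected) problem on the current subspace
        small = [lam_max(V.conj().T @ A(om) @ V) for om in omegas]
        j = int(np.argmin(small))
        # full eigenpair at the small-problem minimizer
        w, X = np.linalg.eigh(A(omegas[j]))
        full_val, v = w[-1], X[:, -1:]
        # a compression never overestimates lambda_max, so this gap
        # bounds the distance to the true grid minimum
        if full_val - small[j] < tol:
            return full_val
        # expand the subspace with the new eigenvector (orthogonalized)
        v = v - V @ (V.conj().T @ v)
        nrm = np.linalg.norm(v)
        if nrm < 1e-12:
            return full_val
        V = np.hstack([V, v / nrm])
    return full_val
```

The point of the scheme is that each iteration touches the full matrix only to extract one eigenvector, while the optimization itself runs on matrices whose size equals the (small) subspace dimension.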
Nonsmoothness at optimal points is a common phenomenon in many eigenvalue optimization problems. We consider two recent algorithms to minimize the largest eigenvalue of a Hermitian matrix depending on one parameter, both proven to be globally convergent even in the presence of nonsmoothness. One of these algorithms models the eigenvalue function with a piecewise quadratic function and is effective for nonconvex problems. The other algorithm projects the Hermitian matrix onto subspaces spanned by eigenvectors and is effective for large-scale problems. We generalize the latter slightly to cope with nonsmoothness. For both algorithms we analyze the rate of convergence in the nonsmooth setting, when the largest eigenvalue is multiple at the minimizer and zero lies strictly in the interior of the generalized Clarke derivative, and prove that both algorithms converge rapidly. The algorithms are applied to, and the deduced results are illustrated on, the computation of the inner numerical radius, i.e., the modulus of the point on the boundary of the field of values closest to the origin. This quantity is significant, for instance, for the numerical solution of a symmetric definite generalized eigenvalue problem and for the iterative solution of a saddle-point linear system.
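To make the inner numerical radius concrete: when the origin lies in the interior of the field of values W(A), the distance from the origin to the boundary of W(A) equals the minimum over theta of lambda_max(H(theta)), where H(theta) = (e^{i theta} A + e^{-i theta} A*)/2, since lambda_max(H(theta)) is the support function of the convex set W(A). A naive dense-grid sketch of this characterization (our own illustration, not one of the algorithms analyzed in the paper):

```python
import numpy as np

def inner_numerical_radius(A, n_grid=2000):
    """Distance from the origin to the boundary of the field of values W(A),
    assuming the origin lies in the interior of W(A).

    Evaluates the support function h(theta) = lambda_max(H(theta)), with
    H(theta) = (e^{i theta} A + e^{-i theta} A*) / 2, on a dense grid of
    angles and returns its minimum.  Brute force, for illustration only.
    """
    thetas = np.linspace(0.0, 2.0 * np.pi, n_grid, endpoint=False)
    best = np.inf
    for t in thetas:
        H = (np.exp(1j * t) * A + np.exp(-1j * t) * A.conj().T) / 2.0
        best = min(best, np.linalg.eigvalsh(H)[-1])
    return best
```

For example, for the normal matrix diag(2, -2, 2i, -2i), whose field of values is the square with vertices at 2, -2, 2i and -2i, this returns sqrt(2), the distance from the origin to the edges of that square.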