We are concerned with the computation of the L∞-norm of an L∞-function of the form H(s) = C(s)D(s)^{-1}B(s), where the middle factor is the inverse of a meromorphic matrix-valued function, and C(s), B(s) are meromorphic functions mapping to short-and-fat and tall-and-skinny matrices, respectively. For instance, transfer functions of descriptor systems and delay systems fall into this family. We focus on the case where the middle factor is large-scale. We propose a subspace projection method that yields approximations of H whose middle factor is of much smaller dimension. The L∞-norm is computed for the resulting reduced function; the subspaces are then refined using the optimal points on the imaginary axis where the L∞-norm of the reduced function is attained. The subspace method is designed so that certain Hermite interpolation properties hold between the largest singular values of the original and reduced functions. This leads to an algorithm that converges locally superlinearly with respect to the subspace dimension, which we prove and illustrate on several numerical examples.
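To make the refinement loop concrete, here is a minimal sketch for the special case H(s) = C(sI - A)^{-1}B with a constant, large-scale middle factor and a one-sided projection, maximizing over a fixed frequency grid. The function name, the grid-based maximization, and the one-sided projection are simplifications for illustration, not the authors' method.

```python
import numpy as np

def linf_norm_subspace(A, B, C, freqs, tol=1e-6, maxit=20):
    # Sketch: greedy subspace refinement for H(s) = C (sI - A)^{-1} B.
    # The reduced middle factor has dimension equal to the number of
    # columns of V, which grows by a block each iteration.
    n = A.shape[0]
    # Initialize the subspace with a solve at the first sample frequency.
    V = np.linalg.solve(1j * freqs[0] * np.eye(n) - A, B)
    V, _ = np.linalg.qr(V)
    omega_prev = None
    for _ in range(maxit):
        # Project to obtain the reduced function H_r(s) = Cr (s I - Ar)^{-1} Br.
        Ar = V.conj().T @ A @ V
        Br = V.conj().T @ B
        Cr = C @ V
        # Maximize the largest singular value of H_r(i w) over the grid.
        vals = [np.linalg.norm(
                    Cr @ np.linalg.solve(1j * w * np.eye(V.shape[1]) - Ar, Br), 2)
                for w in freqs]
        k = int(np.argmax(vals))
        w_star = freqs[k]
        if omega_prev is not None and abs(w_star - omega_prev) < tol:
            return vals[k], w_star
        omega_prev = w_star
        # Refine: expand the subspace with a full-order solve at the optimizer,
        # which induces interpolation of H at i*w_star.
        V = np.hstack([V, np.linalg.solve(1j * w_star * np.eye(n) - A, B)])
        V, _ = np.linalg.qr(V)
    return vals[k], w_star
```

The full-order solves occur only at the few refinement frequencies; all norm evaluations in the inner loop involve only the small reduced matrices.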
This work concerns the global minimization of a prescribed eigenvalue or a
weighted sum of prescribed eigenvalues of a Hermitian matrix-valued function
depending on its parameters analytically in a box. We describe how the
analytical properties of eigenvalue functions can be put to use to derive
piecewise quadratic functions that underestimate the eigenvalue functions.
These piecewise quadratic underestimators lead us to a global minimization
algorithm, originally due to Breiman and Cutler. We prove the global
convergence of the algorithm, and show that it can be effectively used for the
minimization of extreme eigenvalues, e.g., the largest eigenvalue or the sum of
the largest specified number of eigenvalues. This is particularly facilitated
by the analytical formulas for the first derivatives of eigenvalues, as well as
analytical lower bounds on the second derivatives that can be deduced for
extreme eigenvalue functions. The applications that we have in mind also
include the ${\rm H}_\infty$-norm of a linear dynamical system, numerical
radius, distance to uncontrollability and various other non-convex eigenvalue
optimization problems, for which, generically, the eigenvalue function involved
is simple at all points.

Comment: 25 pages, 3 figures
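The core of the underestimator scheme can be sketched in one dimension. Given a global lower bound gamma on the second derivative, each sample point contributes a quadratic support function lying below f; the algorithm repeatedly minimizes the pointwise maximum of these quadratics and samples f there. The function name and the coarse grid search of the lower envelope are illustrative simplifications, not the paper's implementation.

```python
import numpy as np

def breiman_cutler_1d(f, df, gamma, a, b, tol=1e-6, maxit=200):
    """1-D sketch of global minimization via piecewise quadratic underestimators.

    gamma must satisfy gamma <= f''(x) on [a, b], so that each
    q_j(x) = f(x_j) + f'(x_j)(x - x_j) + (gamma/2)(x - x_j)^2
    underestimates f globally.
    """
    xs, fs, gs = [a, b], [f(a), f(b)], [df(a), df(b)]
    grid = np.linspace(a, b, 2001)      # coarse search grid (sketch only)
    for _ in range(maxit):
        # Lower envelope: pointwise maximum of the quadratic underestimators.
        env = np.max([fj + gj * (grid - xj) + 0.5 * gamma * (grid - xj) ** 2
                      for xj, fj, gj in zip(xs, fs, gs)], axis=0)
        k = int(np.argmin(env))
        # Certified gap between the best sampled value and the global lower bound.
        if min(fs) - env[k] < tol:
            break
        x_new = grid[k]
        xs.append(x_new)
        fs.append(f(x_new))
        gs.append(df(x_new))
    i = int(np.argmin(fs))
    return xs[i], fs[i]
```

Because the envelope is a certified lower bound on f over the box, the gap min(fs) - env[k] certifies global (not merely local) near-optimality at termination; for eigenvalue objectives, f and df come from the analytical eigenvalue and derivative formulas, and gamma from the analytical curvature bounds discussed above.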
Two useful measures of the robust stability of the discrete-time dynamical system x_{k+1} = Ax_k are the ε-pseudospectral radius and the numerical radius of A. The ε-pseudospectral radius of A is the largest of the moduli of the points in the ε-pseudospectrum of A, while the numerical radius is the largest of the moduli of the points in the field of values. We present globally convergent algorithms for computing the ε-pseudospectral radius and the numerical radius. For the former algorithm, we discuss conditions under which it is quadratically convergent and provide a detailed accuracy analysis giving conditions under which the algorithm is backward stable. The algorithms are inspired by methods of Byers, Boyd-Balakrishnan, He-Watson and Burke-Lewis-Overton for related problems and depend on computing eigenvalues of symplectic pencils and Hamiltonian matrices.
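The numerical radius admits a simple characterization that underlies such algorithms: r(A) = max over θ of the largest eigenvalue of the Hermitian matrix (e^{iθ}A + e^{-iθ}A*)/2. The following is a brute-force grid sketch of this characterization for small matrices, not the globally convergent level-set method of the paper.

```python
import numpy as np

def numerical_radius_grid(A, ntheta=2000):
    # Sketch: r(A) = max_theta lambda_max((e^{i theta} A + e^{-i theta} A^H) / 2),
    # evaluated on a uniform grid of angles (illustration only).
    thetas = np.linspace(0.0, 2 * np.pi, ntheta, endpoint=False)
    r = -np.inf
    for t in thetas:
        # Hermitian part of the rotated matrix e^{i t} A.
        H = (np.exp(1j * t) * A + np.exp(-1j * t) * A.conj().T) / 2
        r = max(r, np.linalg.eigvalsh(H)[-1])
    return r
```

The methods in the paper replace the grid by a level-set test that decides, via the eigenvalues of a structured matrix, whether a candidate value of r is exceeded for some angle, yielding global convergence.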