We introduce a new family of strong linearizations of matrix polynomials, which we call "block Kronecker pencils", and perform a backward stability analysis of complete polynomial eigenproblems. These problems are solved by applying any backward stable algorithm to a block Kronecker pencil, such as the staircase algorithm for singular pencils or the QZ algorithm for regular pencils. This stability analysis allows us to identify those block Kronecker pencils that yield a computed complete eigenstructure which is exactly that of a slightly perturbed matrix polynomial. The global backward error analysis in this work presents, for the first time, the following key properties: it is a rigorous analysis valid for finite perturbations (i.e., it is not a first-order analysis), it provides precise bounds, it is valid simultaneously for a large class of linearizations, and it establishes a framework that may be generalized to other classes of linearizations. These features stem from the fact that block Kronecker pencils are a particular case of the new family of "strong block minimal bases pencils", which are robust under certain perturbations and therefore include certain perturbations of block Kronecker pencils. We hope that this robustness property will allow us to extend the results in this paper to other contexts.
The concept of linearization is fundamental for theory, applications, and spectral computations related to matrix polynomials. However, recent research on several important classes of structured matrix polynomials arising in applications has revealed that the strategy of using linearizations to develop structure-preserving numerical algorithms that compute the eigenvalues of structured matrix polynomials can be too restrictive, because some structured polynomials do not have any linearization with the same structure. This phenomenon strongly suggests that linearizations should sometimes be replaced by other low-degree matrix polynomials in applied numerical computations. Motivated by this fact, we introduce equivalence relations that allow the possibility of matrix polynomials (with coefficients in an arbitrary field) to be equivalent, with the same spectral structure, but have different sizes and degrees. These equivalence relations are directly modeled on the notion of linearization, and consequently inherit the simplicity, applicability, and most relevant properties of linearizations; simultaneously, though, they are much more flexible in the possible degrees of equivalent polynomials. This flexibility allows us to define in a unified way the notions of quadratification and ℓ-ification, to introduce the concept of companion form of arbitrary degree, and to provide concrete and simple examples of these notions that generalize in a natural and smooth way the classical first and second Frobenius companion forms. The properties of ℓ-ifications are studied in depth; in this process a fundamental result on matrix polynomials, the "Index Sum Theorem", is recovered and extended to arbitrary fields. Although this result is known in the systems theory literature for real matrix polynomials, it has remained unnoticed by many researchers.
It establishes that, for any matrix polynomial, the sum of the (finite and infinite) partial multiplicities and of the (left and right) minimal indices equals the rank of the polynomial times its degree. The "Index Sum Theorem" turns out to be a key tool for obtaining a number of significant results: on the possible sizes and degrees of ℓ-ifications and companion forms, on the minimal index preservation properties of companion forms of arbitrary degree, and on obstructions to the existence of structured companion forms for structured matrix polynomials of even degree. This paper presents many new results, blended together with results already known in the literature but extended here to the most general setting of matrix polynomials of arbitrary sizes and degrees over arbitrary fields. Therefore we have written the paper in an expository and self-contained style that makes it accessible to a wide variety of readers.
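To make the count in the Index Sum Theorem concrete, here is a minimal numerical sketch (assuming NumPy and SciPy; the matrices and all variable names are illustrative choices of ours, not from the paper). For a regular n-by-n polynomial of degree d, the minimal indices are absent, so the theorem reduces to: the finite and infinite eigenvalue multiplicities sum to nd. We linearize with the classical first Frobenius companion pencil and count generalized eigenvalues.

```python
import numpy as np
from scipy.linalg import eig

# P(lambda) = lambda^2*A2 + lambda*A1 + A0: a regular 2x2 quadratic whose
# leading coefficient is singular, so P has an eigenvalue at infinity.
A2 = np.array([[1.0, 0.0], [0.0, 0.0]])
A1 = np.array([[2.0, 1.0], [0.0, 3.0]])
A0 = np.array([[1.0, 0.0], [1.0, 2.0]])
n, d = 2, 2

# First Frobenius companion pencil L(lambda) = lambda*X + Y, of size nd x nd.
I, Z = np.eye(n), np.zeros((n, n))
X = np.block([[A2, Z], [Z, I]])
Y = np.block([[A1, A0], [-I, Z]])

# All nd generalized eigenvalues of the pencil, infinite ones included.
w = eig(Y, -X, right=False)

# Classify: the infinite eigenvalue shows up as inf/nan or as a huge value.
finite = w[np.isfinite(w) & (np.abs(w) < 1e8)]
n_infinite = len(w) - len(finite)

# det P(lambda) = 3*lambda^3 + 8*lambda^2 + 6*lambda + 2 (expanded by hand):
# 3 finite eigenvalues plus 1 infinite one, i.e. 4 = n*d in total.
detP_coeffs = [3.0, 8.0, 6.0, 2.0]
total_multiplicity = len(finite) + n_infinite  # == n * d
```

For this regular example there are 3 finite eigenvalues (the roots of det P) and 1 infinite eigenvalue, so the multiplicities indeed add up to rank times degree, 2 · 2 = 4.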
A standard way of dealing with a matrix polynomial P(λ) is to convert it into an equivalent matrix pencil, a process known as linearization. For any regular matrix polynomial, a new family of linearizations generalizing the classical first and second Frobenius companion forms has recently been introduced by Antoniou and Vologiannidis, extending some linearizations previously defined by Fiedler for scalar polynomials. We prove that these pencils are linearizations even when P(λ) is a singular square matrix polynomial, and show explicitly how to recover the left and right minimal indices and minimal bases of the polynomial P(λ) from the minimal indices and bases of these linearizations. In addition, we provide a simple way to recover the eigenvectors of a regular polynomial from those of any of these linearizations, without any computational cost. The existence of an eigenvector recovery procedure is essential for a linearization to be relevant for applications.
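This kind of cost-free eigenvector recovery can be illustrated for the classical first Frobenius companion form, which the Fiedler family above generalizes (for other Fiedler pencils the recovery is analogous but depends on the pencil). The sketch below, assuming NumPy and SciPy with illustrative variable names of ours, uses the fact that every eigenvector of the companion pencil of a quadratic has the structure z = [λx; x], so the eigenvector x of P(λ) is simply the bottom block of z.

```python
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(0)
n = 3
# A generic regular quadratic P(lambda) = lambda^2*A2 + lambda*A1 + A0.
A2, A1, A0 = (rng.standard_normal((n, n)) for _ in range(3))

# First Frobenius companion pencil L(lambda) = lambda*X + Y.
I, Z = np.eye(n), np.zeros((n, n))
X = np.block([[A2, Z], [Z, I]])
Y = np.block([[A1, A0], [-I, Z]])

# Generalized eigenproblem: det(lambda*X + Y) = 0  <=>  Y v = lambda (-X) v.
w, V = eig(Y, -X)

# Recover x as the bottom block of each pencil eigenvector (no extra work)
# and check the relative residual ||P(lambda) x|| for the original polynomial.
rel_residuals = []
for lam, z in zip(w, V.T):
    if not np.isfinite(lam):
        continue
    x = z[n:]
    res = np.linalg.norm((lam**2 * A2 + lam * A1 + A0) @ x)
    rel_residuals.append(res / ((1 + abs(lam)) ** 2 * np.linalg.norm(x)))
```

All residuals come out at roundoff level, confirming that the bottom blocks are eigenvectors of P(λ) itself; no linear solve or projection is needed, which is what "without any computational cost" means in practice.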