Last modified: Mon May 12 14:02:51 EDT 2014
Möbius Transformations of Matrix Polynomials (with D. Steven Mackey, Christian Mehl and Volker Mehrmann).  To appear in Linear Algebra and its Applications. Also available as Manchester Institute for Mathematical Sciences EPrint 2014.2.
We discuss Möbius transformations for general matrix polynomials over arbitrary fields, analyzing their influence on regularity, rank, determinant, constructs such as compound matrices, and on structural features including sparsity and symmetry. Results on the preservation of spectral information contained in elementary divisors, partial multiplicity sequences, invariant pairs, and minimal indices are presented. The effects on canonical forms such as Smith forms and local Smith forms, on relationships of strict equivalence and spectral equivalence, and on the property of being a linearization or quadratification are investigated. We show that many important transformations are special instances of Möbius transformations, and analyze a Möbius connection between alternating and palindromic matrix polynomials. Finally, the use of Möbius transformations in solving polynomial inverse eigenproblems is illustrated.
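For orientation, the central definition can be sketched as follows (notation ours, consistent with the paper's setting): a nonsingular 2 x 2 matrix A over the field induces a Möbius transformation on matrix polynomials of degree k via

```latex
P(\lambda) = \sum_{i=0}^{k} B_i \lambda^i, \qquad
A = \begin{bmatrix} a & b \\ c & d \end{bmatrix} \in GL(2,\mathbb{F}), \qquad
M_A(P)(\mu) \;=\; (c\mu + d)^{k}\, P\!\left(\frac{a\mu + b}{c\mu + d}\right)
            \;=\; \sum_{i=0}^{k} B_i\, (a\mu + b)^{i} (c\mu + d)^{k-i}.
```

The classical Cayley transformations, which connect palindromic and alternating structures, arise from particular choices of A.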

Skew-symmetric Matrix Polynomials and Their Smith Forms (with D. Steven  Mackey, Christian Mehl and Volker Mehrmann).   Linear Algebra and its Applications. v.438:12, pp. 4625--4653, 2013.
Two canonical forms for skew-symmetric matrix polynomials over arbitrary fields are characterized --- the Smith form, and its skew-symmetric variant obtained via unimodular congruences. Applications include the analysis of the eigenvalue and elementary divisor structure of products of two skew-symmetric matrices, the derivation of a Smith-McMillan-like canonical form for skew-symmetric rational matrices, and the construction of minimal symmetric factorizations of skew-symmetric rational matrices. A sufficient condition for the existence of solutions to matrix polynomial Sylvester equations, and results on the existence and construction of structured linearizations for regular and singular skew-symmetric matrix polynomials are also presented.
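The flavor of the two canonical forms can be sketched as follows (a paraphrase rather than a precise statement; hypotheses as in the paper): the invariant polynomials of a skew-symmetric matrix polynomial occur in pairs, and the skew-symmetric variant replaces each pair by a 2 x 2 skew-symmetric block,

```latex
P(\lambda) \;\sim\; \mathrm{diag}\bigl(d_1, d_1,\; d_2, d_2,\; \ldots,\; d_m, d_m,\; 0, \ldots, 0\bigr)
\quad\text{(Smith form)}, \qquad
\begin{bmatrix} 0 & d_j(\lambda) \\ -d_j(\lambda) & 0 \end{bmatrix}
\quad\text{(blocks of the congruence form)}.
```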

Smith Forms of Palindromic Matrix Polynomials (with D. Steven  Mackey, Christian Mehl and Volker Mehrmann).   Electronic J. of Linear Algebra. v. 22, pp. 53--91, 2011.
Many applications give rise to matrix polynomials whose coefficients have a kind of reversal symmetry, a structure we call palindromic. Several properties of scalar palindromic polynomials are derived, and together with properties of compound matrices, used to establish the Smith form of regular and singular T-palindromic matrix polynomials, over arbitrary fields. The invariant polynomials are shown to inherit palindromicity, and their structure is described in detail. Jordan structures of palindromic matrix polynomials are characterized, and necessary conditions for the existence of structured linearizations established. In the odd degree case, a constructive procedure for building palindromic linearizations shows that the necessary conditions are sufficient as well. The Smith form for *-palindromic polynomials is also analyzed. Finally, results for palindromic matrix polynomials over fields of characteristic two are presented.
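For reference, the structures in question can be stated as follows (standard definitions, recorded here rather than quoted from the paper):

```latex
\operatorname{rev} P(\lambda) := \lambda^{k} P(1/\lambda), \quad k = \deg P; \qquad
P \ \text{is T-palindromic if}\ \operatorname{rev} P(\lambda)^{T} = P(\lambda), \qquad
\text{*-palindromic if}\ \operatorname{rev} P(\lambda)^{*} = P(\lambda).
```

Equivalently, writing P(λ) = Σ B_i λ^i, T-palindromicity means B_{k-i}^T = B_i, the coefficient reversal symmetry referred to above; the eigenvalues of such polynomials pair up as (λ, 1/λ).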

Jordan Structures of Alternating Matrix Polynomials (with D. Steven  Mackey, Christian Mehl and Volker Mehrmann).   Linear Algebra and its Applications, v. 432:4, pp. 867--891, 2010.
Alternating matrix polynomials, that is, polynomials whose coefficients alternate between symmetric and skew-symmetric matrices, generalize the notions of even and odd scalar polynomials. We investigate the Smith forms of alternating matrix polynomials, showing that each invariant factor is an even or odd scalar polynomial. Necessary and sufficient conditions are derived for a given Smith form to be that of an alternating matrix polynomial. These conditions allow a characterization of the possible Jordan structures of alternating matrix polynomials, and also lead to necessary and sufficient conditions for the existence of structure-preserving strong linearizations. Most of the results are applicable to singular as well as regular matrix polynomials.
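For reference (standard definitions, not quoted from the paper):

```latex
P(-\lambda)^{T} = P(\lambda) \ \ \text{(T-even)}, \qquad
P(-\lambda)^{T} = -P(\lambda) \ \ \text{(T-odd)};
```

writing P(λ) = Σ B_i λ^i, these conditions say B_i^T = (-1)^i B_i, respectively B_i^T = (-1)^{i+1} B_i, which is the alternation between symmetric and skew-symmetric coefficients described above. The spectra of such polynomials are symmetric under λ ↦ -λ.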

On The Definition of Two Natural Classes of Scalar Product (with D. Steven  Mackey and Françoise Tisseur). MIMS EPrint 2007.64, Manchester Institute for Mathematical Sciences, Manchester, England. April 2007. Revised March 2009.
We identify two natural classes of scalar product, termed unitary and orthosymmetric, which serve to unify assumptions for the existence of structured factorizations, iterations and mappings. A variety of different characterizations of these scalar product classes is given. All the classical examples of scalar products, each giving rise to important classes of structured matrices, are shown to be both orthosymmetric and unitary.
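A sketch of the two classes (the paper gives several equivalent characterizations; only the simplest are recorded here, as a reader's note rather than a quotation): a scalar product ⟨x,y⟩_M = x^T M y (bilinear) or x^* M y (sesquilinear), with M nonsingular, is

```latex
\text{orthosymmetric:}\quad M^{T} = \pm M \ \text{(bilinear)} \quad\text{or}\quad
M^{*} = \alpha M,\ |\alpha| = 1 \ \text{(sesquilinear)};
\qquad
\text{unitary:}\quad M = \beta U \ \text{for some unitary } U \text{ and } \beta > 0.
```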

Structured Polynomial Eigenproblems Related to Time-Delay Systems (with Heike Faßbender, D. Steven Mackey, and Christian Schröder).  Electronic Transactions on Numerical Analysis, v.31, pp. 306--330, 2009.
A new class of structured polynomial eigenproblem arising in the stability analysis of time-delay systems is identified and analyzed together with new types of closely related structured polynomials. Relationships between these polynomials are established via the Cayley transformation. Their spectral symmetries are revealed, and structure-preserving linearizations constructed. A structured Schur decomposition for the class of structured pencils associated with time-delay systems is derived, and an algorithm for its computation that compares favorably with the QZ algorithm is presented along with numerical experiments.

Numerical Methods for Palindromic Eigenvalue Problems: Computing the Anti-Triangular Schur form (with D. Steven  Mackey, Christian Mehl and Volker Mehrmann).   Numerical Linear Algebra with Applications. v.16, no. 1, pp. 63--86, 2009.
We present structure-preserving numerical methods for the eigenvalue problem of complex palindromic pencils. Such problems arise in control theory, as well as from palindromic linearizations of higher degree palindromic matrix polynomials. A key ingredient of these methods is the development of an appropriate condensed form --- the anti-triangular Schur form. Ill-conditioned problems with eigenvalues near the unit circle, in particular near $\pm 1$, are discussed. We show how a combination of unstructured methods followed by a structured refinement can be used to solve such problems accurately.

Structured Mapping Problems for Matrices Associated with Scalar Products Part I: Lie and Jordan Algebras (with D. Steven  Mackey and Françoise Tisseur).   SIAM J. of Matrix Analysis and Appl., v.29, No.4, pp. 1389 - 1410, 2008.
Given a class of structured matrices S, we identify pairs of vectors x,b for which there exists a matrix A in S such that Ax=b, and also characterize the set of all matrices A in S mapping x to b. The structured classes we consider are the Lie and Jordan algebras associated with orthosymmetric scalar products. These include (skew-)symmetric, (skew-)Hamiltonian, pseudo (skew-)Hermitian, persymmetric and perskew-symmetric matrices. Structured mappings with extremal properties are also investigated. In particular, structured mappings of minimal rank are identified and shown to be unique when rank one is achieved. The structured mapping of minimal Frobenius norm is always unique and explicit formulas for it and its norm are obtained. Finally the set of all structured mappings of minimal 2-norm is characterized. Our results generalize and unify existing work, answer a number of open questions, and provide useful tools for structured backward error investigations.
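To illustrate the kind of explicit formula involved, here is a short numerical check of the minimal Frobenius-norm mapping in one special case, real symmetric matrices (the Jordan algebra of the standard bilinear form x^T y). The formula below is the classical one for this case and is meant only as a sketch; the general structured classes require the machinery of the paper, and the function name is ours.

```python
import numpy as np

# Minimal Frobenius-norm symmetric A with A @ x == b, for nonzero real x.
# Illustrative special case only (Jordan algebra of the standard bilinear form).
def min_frobenius_symmetric_map(x, b):
    xtx = x @ x
    return (np.outer(b, x) + np.outer(x, b)) / xtx \
           - (x @ b) * np.outer(x, x) / xtx**2

rng = np.random.default_rng(0)
x, b = rng.standard_normal(5), rng.standard_normal(5)
A = min_frobenius_symmetric_map(x, b)
print(np.allclose(A, A.T), np.allclose(A @ x, b))  # True True
```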

Symmetric Linearizations for Matrix Polynomials (with Nicholas J. Higham, D. Steven  Mackey and Françoise Tisseur). SIAM J. of Matrix Analysis and Appl., v.29, No.1, pp. 143 - 159, 2006.
A standard way of treating the polynomial eigenvalue problem P(λ)x = 0 is to convert it into an equivalent matrix pencil---a process known as linearization. Two vector spaces of pencils Ell_1(P) and Ell_2(P), and their intersection DL(P), have recently been defined and studied by Mackey, Mackey, Mehl, and Mehrmann. The aim of our work is to gain new insight into these spaces and the extent to which their constituent pencils inherit structure from P. For arbitrary polynomials we show that every pencil in DL(P) is block symmetric and we obtain a convenient basis for DL(P) built from block Hankel matrices. This basis is then exploited to prove that the first deg(P) pencils in a sequence constructed by Lancaster in the 1960s generate DL(P). When P is symmetric, we show that the symmetric pencils in Ell_1(P) comprise DL(P), while for Hermitian P the Hermitian pencils in Ell_1(P) form a proper subset of DL(P) that we explicitly characterize. Almost all pencils in each of these subsets are shown to be linearizations. In addition to obtaining new results, this work provides a self-contained treatment of some of the key properties of DL(P) together with some new, more concise proofs.
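For a quadratic P(λ) = λ²A + λB + C, the two pencils below form a basis of DL(P) (notation consistent with the paper; they correspond to the ansatz vectors e_1 and e_2), and both are visibly block symmetric whenever A, B, C are symmetric:

```latex
L_1(\lambda) = \lambda \begin{bmatrix} A & 0 \\ 0 & -C \end{bmatrix}
             + \begin{bmatrix} B & C \\ C & 0 \end{bmatrix},
\qquad
L_2(\lambda) = \lambda \begin{bmatrix} 0 & A \\ A & B \end{bmatrix}
             + \begin{bmatrix} -A & 0 \\ 0 & C \end{bmatrix}.
```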

Structured Polynomial Eigenvalue Problems: Good Vibrations from Good Linearizations (with D. Steven Mackey, Christian Mehl and Volker Mehrmann).  SIAM J. of Matrix Analysis and Appl., v.28, No.4, pp. 1029 - 1051, 2006.
Many applications give rise to nonlinear eigenvalue problems with an underlying structured matrix polynomial. In this paper several useful classes of structured polynomials (e.g., palindromic, even, odd) are identified and the relationships between them explored. A special class of linearizations that reflect the structure of these polynomials, and therefore preserve symmetries in their spectra, is introduced and investigated. We analyze the existence and uniqueness of such linearizations, and show how they may be systematically constructed.

Palindromic Polynomial Eigenvalue Problems: Good Vibrations from Good Linearizations (with D. Steven Mackey, Christian Mehl and Volker Mehrmann).  Technical Report 239, DFG Research Center, MATHEON, Berlin, Germany. April 2005.
An older version of the previous paper that has a more extensive section on applications.

Vector Spaces of Linearizations for Matrix Polynomials (with D. Steven Mackey, Christian Mehl and Volker Mehrmann).  SIAM J. of Matrix Analysis and Appl. v.28, No.4, pp. 971 - 1004, 2006.
The classical approach to investigating polynomial eigenvalue problems is linearization, where the polynomial is converted into a larger matrix pencil with the same eigenvalues. For any polynomial there are infinitely many linearizations with widely varying properties, but in practice the companion forms are typically used. However, these companion forms are not always entirely satisfactory, and linearizations with special properties may sometimes be required. In this paper we develop a systematic approach to generating large classes of linearizations for matrix polynomials. Given a polynomial P, we show how to simply construct two vector spaces of pencils that generalize the companion forms of P, and prove that almost all of these pencils are linearizations for P. Eigenvectors of these pencils are shown to be closely related to those of P. A distinguished subspace is then isolated, and the special properties of these pencils are investigated. These spaces of pencils provide a convenient arena in which to look for structured linearizations of structured polynomials, as well as to try to optimize the conditioning of linearizations, issues to be addressed in further work.
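The defining ansatz for these pencil spaces is recorded here for reference (in the notation of the paper): for P of degree k,

```latex
\mathbb{L}_1(P) = \bigl\{\, L(\lambda) = \lambda X + Y \;:\;
   L(\lambda)\,(\Lambda \otimes I_n) = v \otimes P(\lambda) \ \text{for some } v \in \mathbb{F}^{k} \,\bigr\},
\qquad
\Lambda := \bigl[\lambda^{k-1}, \ldots, \lambda, 1\bigr]^{T},
```

with L_2(P) defined analogously via (Λ^T ⊗ I_n) L(λ) = w^T ⊗ P(λ), and DL(P) := L_1(P) ∩ L_2(P). The first companion form of P lies in L_1(P), with ansatz vector v = e_1.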

Structured Factorizations in Scalar Product Spaces (with D. Steven Mackey and Françoise Tisseur). SIAM J. of Matrix Analysis and Appl., v.27, No.3, pp. 821 - 850, 2006. Also available as Numerical Analysis Report No. 421, Manchester Centre for Computational Mathematics, Manchester, England. November 2004. Revised April 2005.
Let A belong to an automorphism group, Lie algebra or Jordan algebra of a scalar product. When A is factored, to what extent do the factors inherit structure from A? We answer this question for the principal matrix square root, the matrix sign decomposition, and the polar decomposition. For general A, we give a simple derivation and characterization of a particular generalized polar decomposition, and we relate it to other such decompositions in the literature. Finally, we study eigendecompositions and structured singular value decompositions, considering in particular the structure in eigenvalues, eigenvectors and singular values that persists across a wide range of scalar products.

A key feature of our analysis is the identification of two particular classes of scalar products, termed unitary and orthosymmetric, which serve to unify assumptions for the existence of structured factorizations. A variety of different characterizations of these scalar product classes is given.
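For reference, the classes in question are defined through the adjoint with respect to the scalar product ⟨x,y⟩_M (standard notation in this series of papers):

```latex
A^{\star} := M^{-1} A^{T} M \ \ \text{(bilinear)}, \qquad
A^{\star} := M^{-1} A^{*} M \ \ \text{(sesquilinear)};
\qquad
\mathbb{G} = \{A : A^{\star} = A^{-1}\}, \quad
\mathbb{L} = \{A : A^{\star} = -A\}, \quad
\mathbb{J} = \{A : A^{\star} = A\},
```

the automorphism group, Lie algebra, and Jordan algebra of the scalar product, respectively.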

Functions Preserving Matrix Groups and Iterations for the Matrix Square Root (with Nicholas J. Higham, D. Steven Mackey and Françoise Tisseur). SIAM J. of Matrix Analysis and Appl., v.26, No. 3, April 2005, pp. 849 - 877. Numerical Analysis Report No. 446, Manchester Centre for Computational Mathematics, Manchester, England. March 2004.
For which functions f does A \in G imply f(A) \in G, when G is the matrix automorphism group associated with a bilinear or sesquilinear form? For example, if A is symplectic when is f(A) symplectic? We show that group structure is preserved precisely when f(A^{-1}) = f(A)^{-1} for bilinear forms and when f(A^{-*}) = f(A)^{-*} for sesquilinear forms. Meromorphic functions that satisfy each of these conditions are characterized. Related to structure preservation is the condition f(\overline{A}) = \overline{f(A)}, and analytic functions and rational functions satisfying this condition are also characterized. These results enable us to characterize all meromorphic functions that map every G into itself as the ratio of a polynomial and its "reversal", up to a monomial factor and conjugation.

The principal square root is an important example of a function that preserves every automorphism group G. By exploiting the matrix sign function, a new family of coupled iterations for the matrix square root is derived. Some of these iterations preserve every G; all of them are shown, via a novel Fréchet derivative-based analysis, to be numerically stable.
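For concreteness, here is a minimal sketch of the classical Denman-Beavers iteration, the prototype of a sign-based coupled square-root iteration (the family studied in the paper, and its stability analysis, go well beyond this; the snippet assumes A has no eigenvalues on the closed negative real axis):

```python
import numpy as np

def denman_beavers_sqrt(A, tol=1e-12, maxit=50):
    """Coupled iteration: Y_k -> A^(1/2), Z_k -> A^(-1/2)."""
    Y, Z = A.astype(float), np.eye(A.shape[0])
    for _ in range(maxit):
        Y_new = 0.5 * (Y + np.linalg.inv(Z))
        Z_new = 0.5 * (Z + np.linalg.inv(Y))
        if np.linalg.norm(Y_new - Y, 'fro') <= tol * np.linalg.norm(Y_new, 'fro'):
            return Y_new, Z_new
        Y, Z = Y_new, Z_new
    return Y, Z

A = np.array([[4.0, 1.0], [0.0, 9.0]])
X, Xinv = denman_beavers_sqrt(A)
print(np.allclose(X @ X, A), np.allclose(X @ Xinv, np.eye(2)))  # True True
```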

A rewritten form of Newton's method for the square root of A \in G is also derived. Unlike the original method, this new form has good numerical stability properties, and we argue that it is the iterative method of choice for computing A^{1/2} when A \in G. Our tools include a formula for the sign of a certain block 2x2 matrix, the generalized polar decomposition along with a wide class of iterations for computing it, and a connection between the generalized polar decomposition of I+A and the square root of A \in G.

Structure Preserving Algorithms for Perplectic Eigenproblems (with D. Steven Mackey and D. Dunlavy). The Electronic Journal of Linear Algebra (13), 2005, pp. 10 - 39. Numerical Analysis Report No. 427, Manchester Centre for Computational Mathematics, Manchester, England. May 2003.
Structured real canonical forms for n x n real matrices that are symmetric or skew-symmetric about the anti-diagonal as well as the main diagonal are presented, and Jacobi algorithms for solving the complete eigenproblem for three of these four classes of matrices are developed. Based on the direct solution of 4 x 4 subproblems constructed via quaternions, the algorithms calculate structured orthogonal bases for the invariant subspaces of the associated matrix. In addition to preserving structure, these methods are inherently parallelizable, numerically stable, and show asymptotic quadratic convergence.

Computing the Polar Decomposition and the Matrix Sign Decomposition in Matrix Groups (with Nicholas J. Higham, D. Steven  Mackey and Françoise Tisseur). SIAM J. of Matrix Analysis and Appl., v.25, No. 4, 2004, pp. 1178 - 1192. Numerical Analysis Report No. 426, Manchester Centre for Computational Mathematics, Manchester, England. April 2003.
For any matrix automorphism group G associated with a bilinear or sesquilinear form, Mackey, Mackey, and Tisseur have recently shown that the matrix sign decomposition factors of A in G also lie in G; moreover, the polar factors of A lie in G if the matrix of the underlying form is unitary. Groups satisfying the latter condition include the complex orthogonal, real and complex symplectic, and pseudo-orthogonal groups.

This work is concerned with exploiting the structure of G when computing the polar and matrix sign decompositions of matrices in G.  We give sufficient conditions for a matrix iteration to preserve the group structure and show that a family of globally convergent rational Padé-based iterations of Kenney and Laub satisfy these conditions.  The well-known scaled Newton iteration for computing the unitary polar factor does not preserve group structure, but we show that the approach of the iterates to the group is precisely tethered to the approach to unitarity, and that this forces a different and exploitable structure in the iterates.  A similar relation holds for the Newton iteration for the matrix sign function.  We also prove that the number of iterations needed for convergence of the structure-preserving methods can be precisely predicted by running an associated scalar iteration.  Numerical experiments are given to compare the cubically and quintically converging iterations with Newton's method and to test stopping criteria.  The overall conclusion is that the structure-preserving iterations and the scaled Newton iteration are all of practical interest, and which iteration is to be preferred is problem-dependent.
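For reference, the (unscaled) Newton iteration for the unitary polar factor mentioned above is sketched below; the scaling parameters and the group-structure analysis of the paper are omitted.

```python
import numpy as np

def newton_polar_factor(A, tol=1e-12, maxit=100):
    """Newton iteration X <- (X + X^{-*}) / 2 for the unitary polar factor of nonsingular A."""
    X = A.astype(complex)
    for _ in range(maxit):
        X_new = 0.5 * (X + np.linalg.inv(X).conj().T)
        if np.linalg.norm(X_new - X, 'fro') <= tol * np.linalg.norm(X_new, 'fro'):
            return X_new
        X = X_new
    return X

A = np.array([[3.0, 1.0], [1.0, 2.0]])
U = newton_polar_factor(A)
H = U.conj().T @ A   # A = U H, with H Hermitian positive definite
print(np.allclose(U.conj().T @ U, np.eye(2)), np.allclose(H, H.conj().T))  # True True
```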

On the Determinant of Symplectic Matrices, (with D. Steven Mackey). Numerical Analysis Report No. 422, Manchester Centre for Computational Mathematics, Manchester, England. February 2003.
A collection of new and old proofs showing that the determinant of any symplectic matrix is +1 is presented. Structured factorizations of symplectic matrices play a key role in several arguments. A constructive derivation of the symplectic analogue of the Cartan-Dieudonné theorem is one of the new proofs in this essay.
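For reference: a real 2n x 2n matrix S is symplectic when

```latex
S^{T} J S = J, \qquad J = \begin{bmatrix} 0 & I_n \\ -I_n & 0 \end{bmatrix}.
```

Taking determinants in this relation only yields (det S)^2 = 1, which is precisely why the sharper statement det S = +1 requires proof.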

G-Reflectors: Analogues of Householder Transformations in Scalar Product Spaces, (with D. Steven Mackey and Françoise Tisseur). Linear Algebra and its Applications, v. 385, 2004, pp. 187 - 213. Numerical Analysis Report No. 420, Manchester Centre for Computational Mathematics, Manchester, England. February 2003. Revised June 2003.
We characterize the analogues of Householder reflectors in matrix groups associated with scalar products, and precisely delimit their mapping capabilities: given a matrix group G and vectors x, y, necessary and sufficient conditions are derived for the existence of a Householder-like analogue H in G such that Hx = y. When H exists, we show how it can be constructed from x and y. Examples of matrix groups to which these results apply include the symplectic and pseudo-unitary groups.

Structured Tools for Structured Matrices, (with D. Steven Mackey and Françoise Tisseur). The Electronic Journal of Linear Algebra (10), 2003, pp. 106 - 145. Numerical Analysis Report No. 419, Manchester Centre for Computational Mathematics, Manchester, England. February 2003.
We present an extensive and unified collection of structure-preserving transformations, organized for easy reference. The structures we work with arise in the context of a non-degenerate bilinear or sesquilinear form on R^n or C^n. We construct a variety of transformations belonging to the automorphism groups of these forms that imitate the action of Givens rotations, Householder reflectors, and Gauss transformations. We also describe transformations for performing structured scaling actions. The matrix groups considered in this paper are the complex orthogonal, real, complex and conjugate symplectic, real perplectic, real and complex pseudo-orthogonal, and pseudo-unitary groups. In addition to deriving new transformations, this paper collects and unifies existing structure-preserving tools.

Hamilton and Jacobi Come Full Circle: Jacobi Algorithms for Structured Hamiltonian Eigenproblems, (with Heike Fassbender and D. Steven Mackey). Linear Algebra and its Applications, 332 - 334, August 2001, pp. 37 - 80.
We develop Jacobi algorithms for solving the complete eigenproblem for Hamiltonian and skew-Hamiltonian matrices that are also symmetric or skew-symmetric.  Based on the direct solution of 4 x 4, and in one case, 8 x 8 subproblems, these structure preserving algorithms produce symplectic orthogonal bases for the invariant subspaces associated with a matrix in any one of the four classes under consideration. The key step in the construction of the algorithms is a quaternion characterization of the 4 x 4 symplectic orthogonal group, and the subspaces of 4 x 4 Hamiltonian, skew-Hamiltonian, symmetric and skew-symmetric matrices.  In addition to preserving structure, these algorithms are inherently parallelizable, numerically stable, and show asymptotic quadratic convergence.
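The structures involved are, for reference (with J the skew-symmetric matrix from the symplectic setting):

```latex
(JH)^{T} = JH \ \ \text{(Hamiltonian)}, \qquad
(JW)^{T} = -JW \ \ \text{(skew-Hamiltonian)}, \qquad
J = \begin{bmatrix} 0 & I_n \\ -I_n & 0 \end{bmatrix};
```

equivalently, a Hamiltonian matrix has the block form [A G; Q -A^T] with G and Q symmetric.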

Is Every Matrix Similar to a Toeplitz Matrix?, (with D. Steven Mackey and Srdjan Petrovic). Linear Algebra and its Applications, v. 297, 1999. pp. 87-105.
We show that every n x n complex nonderogatory matrix is similar to a unique unit upper Hessenberg Toeplitz matrix.  The proof is constructive, and can be adapted to nonderogatory matrices with entries in any field of characteristic zero or characteristic greater than n.  We also prove that every n x n complex matrix with n <= 4 is similar to a Toeplitz matrix.

Real and complex Hamiltonian square roots of skew-Hamiltonian matrices, (with Heike Fassbender, D. Steven Mackey and Hongguo Xu). Western Michigan University, Technical Report #92, January 1999.
We present a constructive existence proof that every real skew-Hamiltonian matrix W has a real Hamiltonian square root.   The key step in this construction shows how one may bring any such W into a real quasi-Jordan canonical form  via symplectic similarity.  We show further that every W has infinitely many real Hamiltonian square roots, and give a lower bound on the dimension of the set of all such square roots.  Some extensions to complex matrices are also presented.

This report is an updated version of Hamiltonian square roots of skew-Hamiltonian matrices, (with Heike Fassbender, D. Steven Mackey and Hongguo Xu).  Linear Algebra and its Applications, v. 287, 1999, pp. 125-159.
 
Hamilton and Jacobi meet again: quaternions and the eigenvalue problem, SIAM Journal on Matrix Analysis and Applications, Vol. 16, pp. 421--435, April 1995.
 
The algebra isomorphism between the space of 4 x 4 real matrices and the tensor product of the quaternions with themselves has unexpected computational payoff: it helps construct an orthogonal similarity that 2 x 2 block-diagonalizes a 4 x 4 symmetric matrix.  Replacing plane rotations with these more powerful 4 x 4 rotations leads to a quaternion-Jacobi method in which the 'weight' of four elements (in a 2 x 2 block) is transferred all at once onto the diagonal.  Quadratic convergence sets in sooner, and the new method requires at least one fewer sweep than plane-Jacobi methods.  An analogue of the sorting angle for plane rotations is developed for these 4 x 4 rotations.
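A sketch of the underlying representation (standard facts, recorded here for orientation rather than taken from the paper): identifying a quaternion x = x_0 + x_1 i + x_2 j + x_3 k with the vector (x_0, x_1, x_2, x_3)^T in R^4, left multiplication by p becomes the 4 x 4 matrix

```latex
L(p) = \begin{bmatrix}
 p_0 & -p_1 & -p_2 & -p_3 \\
 p_1 &  p_0 & -p_3 &  p_2 \\
 p_2 &  p_3 &  p_0 & -p_1 \\
 p_3 & -p_2 &  p_1 &  p_0
\end{bmatrix},
```

and right multiplication by a quaternion gives a matrix commuting with every L(p). Maps of the form x ↦ p x q̄ with unit quaternions p, q realize exactly the rotations in SO(4), and linear combinations of such maps fill out all of M_4(R); this is the algebra isomorphism referred to above.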

Niloufer Mackey