**Speaker:**
Heike Fassbender, University of Braunschweig

**Location:**
Warren Weaver Hall 1302

**Date:**
Sept. 8, 2017, 10 a.m.

**Synopsis:**

Polynomial eigenvalue problems (PEPs) \(P(\lambda)x = 0\), where \(P(\lambda) = \sum_{i=0}^{k}\lambda^i A_i\) with real or complex \(n \times n\) coefficient matrices \(A_i\), appear in a large number of applications. The classical approach to investigating PEPs is linearization, in which the polynomial is converted into a larger matrix pencil with the same eigenvalues. About a decade ago, the vector space \(\mathbb{L}_1(P)\) of matrix pencils corresponding to a matrix polynomial \(P(\lambda)\) was introduced. Its elements satisfy a certain ansatz equation and may be regarded as generalizations of the Frobenius companion pencils. This vector space contains a great many (structured) strong linearizations of \(P(\lambda)\).
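To make the linearization idea concrete, here is a minimal NumPy sketch (not part of the talk) of the first Frobenius companion pencil \(L(\lambda) = \lambda X + Y\) for \(P(\lambda) = \sum_{i=0}^{k}\lambda^i A_i\); the generalized eigenvalues of the pencil coincide with those of \(P\). The function name `companion_pencil` is illustrative, and the example assumes a nonsingular leading coefficient so the pencil can be reduced to a standard eigenvalue problem.

```python
import numpy as np

def companion_pencil(coeffs):
    """First Frobenius companion pencil L(lam) = lam*X + Y for
    P(lam) = sum_i lam^i A_i, with coeffs = [A_0, ..., A_k]."""
    k = len(coeffs) - 1
    n = coeffs[0].shape[0]
    X = np.eye(k * n)
    X[:n, :n] = coeffs[k]                    # leading block A_k
    Y = np.zeros((k * n, k * n))
    for j in range(k):                       # top block row: A_{k-1}, ..., A_0
        Y[:n, j * n:(j + 1) * n] = coeffs[k - 1 - j]
    Y[n:, :-n] -= np.eye((k - 1) * n)        # subdiagonal -I blocks
    return X, Y

# Scalar (n = 1) example: P(lam) = lam^2 + 3*lam + 2, with roots -1 and -2.
A0, A1, A2 = (np.array([[c]], float) for c in (2.0, 3.0, 1.0))
X, Y = companion_pencil([A0, A1, A2])
# (lam*X + Y) z = 0  <=>  lam is an eigenvalue of -X^{-1} Y  (A_k nonsingular here)
lam = np.sort(np.linalg.eigvals(-np.linalg.solve(X, Y)).real)
print(lam)  # -> [-2. -1.]
```

The same pencil works block-wise for genuinely matrix-valued \(A_i\); the companion form is just one distinguished element of the much larger space \(\mathbb{L}_1(P)\) discussed in the talk.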

After a brief review of the vector space \(\mathbb{L}_1(P)\) of possible linearizations of \(P\), we will address two problems:

- Matrix polynomials expressed in bases other than the standard basis arise directly from applications, or as approximations when solving more general nonlinear eigenvalue problems. It is tempting to convert such matrix polynomials to the standard basis and to use what is known about the linearization for matrix polynomials in the standard basis. But a change of basis can be unstable.
- The concept of \(\mathbb{L}_1(P)\) holds only for square matrix polynomials, but there are applications in which matrix polynomials with nonsquare matrix coefficients arise.

We will first present a generalization of \(\mathbb{L}_1(P)\) to matrix polynomials expressed in orthogonal bases. Next we will derive a new family of ansatz spaces that allows nonsquare matrix polynomials to be treated. In both cases the proposed novel vector spaces serve as an abundant source of (structured) strong linearizations.