Polynomial Eigenvalue Problem
Recently Published Documents


TOTAL DOCUMENTS: 30 (FIVE YEARS: 10)
H-INDEX: 4 (FIVE YEARS: 1)

2020, pp. 195-210
Author(s): Federico Milano, Ioannis Dassios, Muyang Liu, Georgios Tzounas

2020, Vol 36 (36), pp. 799-833
Author(s): Maria Isabel Bueno Cachadina, Javier Perez, Anthony Akshar, Daria Mileeva, Remy Kassem

One strategy for solving a nonlinear eigenvalue problem $T(\lambda)x=0$ is to solve a polynomial eigenvalue problem (PEP) $P(\lambda)x=0$ that approximates the original problem through interpolation, and this PEP is then usually solved by linearization. Because of the polynomial approximation techniques involved, $P(\lambda)$ is, in this context, expressed in a non-monomial basis; the bases used most frequently are the Chebyshev, Newton, and Lagrange bases. Although a number of linearizations for matrix polynomials expressed in these bases are already available in the literature, new families of linearizations are introduced here because they offer the following advantages: 1) they are easy to construct from the matrix coefficients of $P(\lambda)$ when the polynomial is expressed in any of the three bases; 2) their block structure is given explicitly; 3) equivalent formulations can be given for all three bases, which provides a natural framework for comparison. Recovery formulas for eigenvectors (when $P(\lambda)$ is regular) and for minimal bases and minimal indices (when $P(\lambda)$ is singular) are also provided. The ultimate goal is to use these families to compare the numerical behavior of the linearizations associated with the same basis (to select the best one) and with the linearizations associated with the other two bases, so as to provide recommendations on which basis to use in each context. This comparison will appear in a subsequent paper.
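To illustrate the kind of construction the abstract discusses, the sketch below builds the standard block colleague pencil for a matrix polynomial expressed in the Chebyshev basis. This is a classical linearization, not one of the new families introduced in the paper; the function name and layout are illustrative, and square coefficients with degree at least 2 are assumed.

```python
import numpy as np
from scipy.linalg import eig

def chebyshev_colleague_pencil(coeffs):
    """Block colleague pencil (C, B) for P(lam) = sum_k coeffs[k] * T_k(lam),
    where coeffs is a list of d+1 square (n x n) matrices, d >= 2, and T_k
    is the k-th Chebyshev polynomial of the first kind.  The generalized
    eigenvalues of lam*B - C coincide with the eigenvalues of P."""
    d = len(coeffs) - 1
    n = coeffs[0].shape[0]
    I = np.eye(n)
    B = np.zeros((d * n, d * n))
    C = np.zeros((d * n, d * n))

    def blk(M, i, j, val):
        M[i*n:(i+1)*n, j*n:(j+1)*n] = val

    # Block row 0 encodes the recurrence lam*T_0 = T_1.
    blk(B, 0, 0, I)
    blk(C, 0, 1, I)
    # Block rows 1..d-2 encode 2*lam*T_k = T_{k-1} + T_{k+1}.
    for k in range(1, d - 1):
        blk(B, k, k, 2 * I)
        blk(C, k, k - 1, I)
        blk(C, k, k + 1, I)
    # Last block row eliminates T_d using P(lam)x = 0.
    blk(B, d - 1, d - 1, 2 * coeffs[d])
    for j in range(d):
        blk(C, d - 1, j, -coeffs[j])
    C[(d-1)*n:d*n, (d-2)*n:(d-1)*n] += coeffs[d]
    return C, B

# Example: P(lam) = T_3(lam) (scalar case); its roots are 0 and ±sqrt(3)/2.
C, B = chebyshev_colleague_pencil(
    [np.array([[0.0]])] * 3 + [np.array([[1.0]])])
roots = np.sort(np.real(eig(C, B, right=False)))
```

The pencil is built directly from the Chebyshev coefficients, with no change of basis to monomials — precisely the ease-of-construction property the abstract emphasizes.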


The objective of this study is to efficiently solve a perturbed symmetric eigenvalue problem without solving a completely new eigenvalue problem. When the initial eigenvalue problem is large, solving it repeatedly for each set of perturbations can be computationally expensive and undesirable. This type of problem is frequently encountered in the dynamic analysis of mechanical structures. This study deals with a perturbed symmetric eigenvalue problem. It proposes a technique that transforms the perturbed symmetric eigenvalue problem, of large size, into a symmetric polynomial eigenvalue problem of much reduced size. To accomplish this, only the introduced perturbations, the symmetric positive-definite matrices representing the unperturbed system, and its first eigensolutions are needed. The originality lies in the structure of the obtained formulation, in which the contribution of the unknown eigensolutions of the unperturbed system is included. The effectiveness of the proposed method is illustrated with numerical tests. High-quality results, compared to other existing methods that use exact reanalysis, can be obtained in reduced computation time, even when the introduced perturbations are very significant.
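The abstract does not reproduce the paper's reduced formulation, so as a point of reference, here is the basic Rayleigh-Ritz reanalysis baseline that uses the same ingredients (the perturbations, the unperturbed matrices, and the first eigensolutions). It is a simple projection sketch, not the paper's enriched method, which also accounts for the truncated eigensolutions; the function name is illustrative.

```python
import numpy as np
from scipy.linalg import eigh

def reduced_reanalysis(K, M, dK, dM, m):
    """Approximate the lowest eigenvalues of the perturbed symmetric pencil
    (K + dK, M + dM) by Rayleigh-Ritz projection onto the first m
    eigenvectors of the unperturbed pencil (K, M).  A baseline sketch only:
    the paper's formulation additionally includes the contribution of the
    unknown (truncated) eigensolutions."""
    lam, Phi = eigh(K, M)             # unperturbed eigensolutions
    Phi = Phi[:, :m]                  # retain the first m modes
    Kr = Phi.T @ (K + dK) @ Phi       # reduced (m x m) symmetric matrices
    Mr = Phi.T @ (M + dM) @ Phi
    mu, Y = eigh(Kr, Mr)              # reduced symmetric eigenproblem
    return mu, Phi @ Y                # approximate eigenpairs

# Small sanity check: with zero perturbations the reduced problem must
# reproduce the first m unperturbed eigenvalues exactly.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)); K = A @ A.T + 4 * np.eye(4)
Bm = rng.standard_normal((4, 4)); M = Bm @ Bm.T + 4 * np.eye(4)
mu, _ = reduced_reanalysis(K, M, np.zeros((4, 4)), np.zeros((4, 4)), 2)
```

The m x m reduced pencil stays symmetric, so the cheap small solve replaces the expensive large one for each new perturbation set.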


2020, Vol 54 (3), pp. 114-118
Author(s): Apostolos Chalkis, Vissarion Fisikopoulos, Panagiotis Repouskos, Elias Tsigaridas

We present algorithmic, complexity, and implementation results on the problem of sampling points in the interior and on the boundary of a spectrahedron, that is, the feasible region of a semidefinite program. Our main tool is random walks. We define and analyze a set of primitive geometric operations that exploit the algebraic properties of spectrahedra and the polynomial eigenvalue problem, and that lead to the realization of a broad collection of efficient random walks. We demonstrate random walks that experimentally show faster mixing times than those previously used for sampling from spectrahedra in theory or applications, for example Hit-and-Run. Moreover, the variety of random walks allows us to sample from general probability distributions, for example the family of log-concave distributions, which arises frequently in numerous applications. We apply our tools to compute (i) the volume of a spectrahedron and (ii) the expectation of functions arising in robust optimal control. We provide an open-source C++ implementation of our methods that scales efficiently up to dimension 200, and we illustrate its efficiency on various data sets.
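The core geometric primitive the abstract alludes to is finding where a line exits the spectrahedron, which reduces to an eigenvalue problem along the line (linear for a linear matrix inequality, polynomial in general). The following minimal Hit-and-Run sketch illustrates this; it is a toy in numpy/scipy, not the authors' C++ implementation, and the helper name is illustrative.

```python
import numpy as np
from scipy.linalg import eig

def hit_and_run_step(x, A0, As, rng):
    """One Hit-and-Run step inside the spectrahedron
    S = { x : A0 + sum_i x[i] * As[i] is positive semidefinite }.
    The chord through x along a random direction u is bounded by the
    points where A(x) + t*B becomes singular, i.e. by the generalized
    eigenvalues t of det(A(x) + t*B) = 0."""
    d = len(As)
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    Ax = A0 + sum(xi * Ai for xi, Ai in zip(x, As))   # A(x), PSD inside S
    B = sum(ui * Ai for ui, Ai in zip(u, As))
    # det(A(x) + t*B) = 0  <=>  A(x) v = t * (-B) v
    t = eig(Ax, -B, right=False)
    t = np.real(t[np.isfinite(t) & (np.abs(np.imag(t)) < 1e-10)])
    t_lo = t[t < 0].max()    # boundary hit in the -u direction
    t_hi = t[t > 0].min()    # boundary hit in the +u direction
    return x + rng.uniform(t_lo, t_hi) * u            # uniform on the chord

# Toy spectrahedron: A(x) = diag(1 + x, 1 - x) is PSD iff x is in [-1, 1].
rng = np.random.default_rng(1)
A0 = np.eye(2)
As = [np.diag([1.0, -1.0])]
x = np.zeros(1)
samples = []
for _ in range(50):
    x = hit_and_run_step(x, A0, As, rng)
    samples.append(x[0])
```

Every iterate stays inside the feasible interval because the chord endpoints come directly from the boundary eigenvalue computation.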


2019, Vol 35, pp. 116-155
Author(s): Biswajit Das, Shreemayee Bora

The complete eigenvalue problem associated with a rectangular matrix polynomial is typically solved via the technique of linearization. This work introduces the concept of generalized linearizations of rectangular matrix polynomials. For a given rectangular matrix polynomial, it also proposes vector spaces of rectangular matrix pencils with the property that almost every pencil is a generalized linearization of the matrix polynomial which can then be used to solve the complete eigenvalue problem associated with the polynomial. The properties of these vector spaces are similar to those introduced in the literature for square matrix polynomials and in fact coincide with them when the matrix polynomial is square. Further, almost every pencil in these spaces can be `trimmed' to form many smaller pencils that are strong linearizations of the matrix polynomial which readily yield solutions of the complete eigenvalue problem for the polynomial. These linearizations are easier to construct and are often smaller than the Fiedler linearizations introduced in the literature for rectangular matrix polynomials. Additionally, a global backward error analysis applied to these linearizations shows that they provide a wide choice of linearizations with respect to which the complete polynomial eigenvalue problem can be solved in a globally backward stable manner.
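For concreteness, here is the classical first companion (Frobenius) linearization in the square case — the familiar construction that the generalized pencil spaces above extend to rectangular polynomials, and with which those spaces coincide when the polynomial is square. This is the standard textbook pencil, not the paper's new construction.

```python
import numpy as np
from scipy.linalg import eig

def first_companion(coeffs):
    """First companion linearization C1(lam) = lam*X + Y of the square
    matrix polynomial P(lam) = sum_i coeffs[i] * lam**i, where coeffs
    lists the n x n coefficients from A_0 up to A_d.  The generalized
    eigenvalues of the pencil (-Y, X) are the eigenvalues of P."""
    d = len(coeffs) - 1
    n = coeffs[0].shape[0]
    X = np.eye(d * n)
    X[:n, :n] = coeffs[d]                        # leading block A_d
    Y = np.zeros((d * n, d * n))
    for j in range(d):                           # top block row: A_{d-1}..A_0
        Y[:n, j*n:(j+1)*n] = coeffs[d - 1 - j]
    for k in range(1, d):                        # subdiagonal -I blocks
        Y[k*n:(k+1)*n, (k-1)*n:k*n] = -np.eye(n)
    return X, Y

# Example: scalar p(lam) = lam^2 - 3*lam + 2, whose roots are 1 and 2.
X, Y = first_companion(
    [np.array([[2.0]]), np.array([[-3.0]]), np.array([[1.0]])])
w = np.sort(np.real(eig(-Y, X, right=False)))
```

The pencil has size dn, which is the benchmark the `trimmed' pencils of the paper improve on in the rectangular case.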

