Corrigendum to "Determinants of Normalized Bohemian Upper Hessenberg Matrices"

2021, Vol. 37, pp. 160-162
Author(s): Massimiliano Fasi, Jishe Feng, Gian Maria Negri Porzio

An amended version of Proposition 3.6 of [Fasi and Negri Porzio, Electron. J. Linear Algebra 36:352--366, 2020] is presented. The result shows that the set of possible determinants of upper Hessenberg matrices with ones on the subdiagonal and elements in the upper triangular part drawn from the set $\{-1,1\}$ is $\{ 2k \mid k \in \langle -2^{n-2} , 2^{n-2} \rangle \}$, instead of $\{ 2k \mid k \in \langle -n+1, n-1 \rangle \}$ as previously stated. This does not affect the main results of the article being corrected and shows that Conjecture 20 in the Characteristic Polynomial Database is true.
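The corrected statement is easy to probe for small n. The Python sketch below is not part of the corrigendum; the brute-force approach and the function names are mine, and it assumes "upper triangular part" includes the diagonal. It enumerates all upper Hessenberg matrices with ones on the subdiagonal and upper-triangular entries drawn from $\{-1,1\}$, and compares the resulting determinant sets with $\{ 2k \mid k \in \langle -2^{n-2}, 2^{n-2} \rangle \}$.

```python
# Brute-force check of the corrected Proposition 3.6 for small n, assuming the
# matrices have ones on the subdiagonal, zeros below it, and upper-triangular
# entries (diagonal included) drawn from {-1, 1}.
from itertools import product

def det(M):
    # Laplace expansion along the first row; fine for the tiny sizes used here.
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

def determinant_set(n):
    free = [(i, j) for i in range(n) for j in range(i, n)]  # upper triangle
    dets = set()
    for choice in product((-1, 1), repeat=len(free)):
        M = [[0] * n for _ in range(n)]
        for (i, j), v in zip(free, choice):
            M[i][j] = v
        for i in range(1, n):
            M[i][i - 1] = 1          # ones on the subdiagonal
        dets.add(det(M))
    return dets

for n in range(2, 6):                # n = 5 already checks 2**15 matrices
    expected = {2 * k for k in range(-2 ** (n - 2), 2 ** (n - 2) + 1)}
    print(n, determinant_set(n) == expected)
```

For n = 2, for instance, the only attainable determinants are -2, 0, and 2, matching the stated set with 2^{n-2} = 1.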

2021, Vol. 37, pp. 193-210
Author(s): Alberto Borobia, Roberto Canogar

In recent years, there has been growing interest in companion matrices. Sparse companion matrices are well understood: every sparse companion matrix is equivalent to a Hessenberg matrix of a particularly simple type. Recently, Deaett et al. [Electron. J. Linear Algebra, 35:223--247, 2019] started the systematic study of nonsparse companion matrices. They proved that every nonsparse companion matrix is nonderogatory, although not necessarily equivalent to a Hessenberg matrix. In this paper, the nonsparse companion matrices which are unit Hessenberg are described. In a companion matrix, the variable entries are the coefficients of the characteristic polynomial, that is, its coordinates with respect to the monomial basis. A PB-companion matrix generalizes this: the variable entries are the coordinates of the characteristic polynomial with respect to a general polynomial basis. The literature provides examples with the Newton basis, the Chebyshev basis, and other general orthogonal bases. Here, the PB-companion matrices which are unit Hessenberg are also described.
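As a point of reference for the sparse case mentioned above, the following Python sketch (illustrative only; the helper name frobenius_companion is mine) builds the classical Frobenius companion matrix, which is already unit Hessenberg, and checks that its characteristic polynomial reproduces the prescribed coefficients.

```python
# Classical (sparse) companion construction: the Frobenius companion matrix of
# a monic polynomial, written as a unit Hessenberg matrix.
import numpy as np

def frobenius_companion(coeffs):
    """Companion matrix of x^n + c[n-1] x^(n-1) + ... + c[0]."""
    n = len(coeffs)
    C = np.zeros((n, n))
    C[1:, :-1] = np.eye(n - 1)       # ones on the subdiagonal
    C[:, -1] = -np.asarray(coeffs)   # last column carries the coefficients
    return C

# Characteristic polynomial of the companion of x^3 - 2x^2 + 3x - 5
C = frobenius_companion([-5, 3, -2])
print(np.poly(C))                    # approximately [ 1. -2.  3. -5.]
```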


Author(s): Jishe Feng, Hongtao Fan

In this paper, we derive explicit formulas for evaluating the determinants of nonsymmetric Toeplitz Bohemian matrices as linear combinations of the determinants of two specific Hessenberg Toeplitz matrices. We obtain several new results that differ markedly from those of [M. Fasi and G. M. Negri Porzio, Determinants of normalized upper Hessenberg matrices, Electron. J. Linear Algebra, 36:352--366, 2020].
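The underlying device in formulas of this kind is the classical expansion of an upper Hessenberg determinant along its last column, which turns the determinant into a recurrence over leading principal minors. The sketch below implements that standard recurrence and checks it against a direct determinant on a random upper Hessenberg Toeplitz matrix; it illustrates the general mechanism only and is not a reproduction of the paper's specific formulas.

```python
# Standard recurrence for upper Hessenberg determinants (expansion along the
# last column), checked on a random upper Hessenberg Toeplitz matrix.
import numpy as np

rng = np.random.default_rng(0)

def hessenberg_toeplitz(first_row, sub, n):
    """Upper Hessenberg Toeplitz matrix: constant diagonals taken from
    first_row (first_row[0] is the main diagonal) and constant subdiagonal sub."""
    H = np.zeros((n, n))
    for k, v in enumerate(first_row[:n]):
        H += v * np.diag(np.ones(n - k), k)
    H += sub * np.diag(np.ones(n - 1), -1)
    return H

def hessenberg_det(H):
    """d_k = sum_i (-1)^(k+i) h_{i,k} * (product of subdiagonal entries
    h_{i+1,i}, ..., h_{k,k-1}) * d_{i-1}, with d_0 = 1 (1-based indices)."""
    n = H.shape[0]
    d = [1.0]                          # d[0]: determinant of the empty matrix
    for k in range(1, n + 1):
        total = 0.0
        for i in range(1, k + 1):
            prod_sub = np.prod([H[j, j - 1] for j in range(i, k)])  # empty product = 1
            total += (-1) ** (k + i) * H[i - 1, k - 1] * prod_sub * d[i - 1]
        d.append(total)
    return d[n]

H = hessenberg_toeplitz(first_row=rng.integers(-1, 2, size=6), sub=1.0, n=6)
print(hessenberg_det(H), np.linalg.det(H))   # the two values should agree
```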


2021, Vol. 2068 (1), pp. 012007
Author(s): Jishe Feng, Hongtao Fan

The pentadiagonal Toeplitz matrix is a special kind of sparse matrix that is widely used in linear algebra, combinatorics, and computational mathematics, and has attracted much attention. We use the determinants of two specific Hessenberg matrices to represent the underlying recurrence relations and thereby prove two explicit formulae for evaluating the determinants of the specific pentadiagonal Toeplitz matrices proposed in a recent paper [3]. Furthermore, four new results are established.
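As a small sandbox for the objects involved, the sketch below builds pentadiagonal Toeplitz matrices and prints their determinant sequence for increasing order. The five-band structure with constant diagonals and the symbol names (e, d, a, b, c) are assumptions of this illustration, not the paper's notation or formulae.

```python
# Pentadiagonal Toeplitz determinants, assuming constant diagonals
# (e, d, a, b, c) from the second subdiagonal up to the second superdiagonal.
import numpy as np

def pentadiagonal_toeplitz(e, d, a, b, c, n):
    return (a * np.eye(n)
            + b * np.eye(n, k=1) + c * np.eye(n, k=2)
            + d * np.eye(n, k=-1) + e * np.eye(n, k=-2))

for n in range(1, 9):
    D = np.linalg.det(pentadiagonal_toeplitz(e=1, d=2, a=3, b=2, c=1, n=n))
    print(n, round(D))        # integer entries, so rounding recovers the exact value
```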


2015, Vol. 30, pp. 934-943
Author(s): Piet Van Mieghem

A Lagrange series around adjustable expansion points is presented for computing the eigenvalues of graphs whose characteristic polynomial is known analytically. The computations are illustrated for the kite graph P_nK_m, whose largest eigenvalue was studied by Stevanovic and Hansen [D. Stevanovic and P. Hansen, The minimum spectral radius of graphs with a given clique number, Electron. J. Linear Algebra, 17:110--117, 2008]. It is found that already the first term of the Lagrange series gives a better approximation than previously published bounds.
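For a concrete reference value of the quantity being approximated, the sketch below computes the largest adjacency eigenvalue of a kite graph directly; reading P_nK_m as the complete graph K_m with a pendant path on n extra vertices is my assumption about the notation.

```python
# Largest adjacency eigenvalue of a kite graph, read here (an assumption) as
# K_m with a path on n extra vertices attached to one clique vertex.
import numpy as np

def kite_adjacency(n, m):
    N = m + n
    A = np.zeros((N, N))
    A[:m, :m] = 1 - np.eye(m)           # clique K_m on vertices 0..m-1
    prev = 0                             # attach the path to clique vertex 0
    for v in range(m, N):                # path vertices m..N-1
        A[prev, v] = A[v, prev] = 1
        prev = v
    return A

A = kite_adjacency(n=5, m=4)
print(np.linalg.eigvalsh(A).max())       # compare with m - 1 = 3, the spectral
                                         # radius of K_4 on its own
```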


2021, Vol. 1 (1)
Author(s): Rob Corless

This Maple Workbook explores a new topic in linear algebra, which is called "Bohemian Matrices". The topic is accessible to people who have had even just one linear algebra course, or have arrived at the point in their course where they have touched "eigenvalues". We use only the concepts of characteristic polynomial and eigenvalue. Even so, we will see some open questions, things that no-one knows for sure; even better, this is quite an exciting new area and we haven't even finished asking the easy questions yet! So it is possible that the reader will have found something new by the time they have finished going through this workbook. Reading this workbook is not like reading a paper: you will want to execute the code, and change things, and try alternatives. You will want to read the code, as well. I have tried to make it self-explanatory. We will begin with some pictures, and then proceed to show how to make such pictures using Maple (or, indeed, many other computational tools). Then we start asking questions about the pictures, and about other things.
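For readers working outside Maple, the same kind of eigenvalue picture can be produced in a few lines of Python; the family below (5x5 matrices with entries sampled from {-1, 0, 1}) is one illustrative choice, not necessarily a family explored in the workbook.

```python
# A quick Bohemian eigenvalue picture in Python rather than Maple: sample
# random 5x5 matrices with entries from {-1, 0, 1}, collect their eigenvalues,
# and scatter-plot them in the complex plane. Parameters are illustrative.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
eigs = []
for _ in range(20000):
    A = rng.integers(-1, 2, size=(5, 5))   # entries drawn from {-1, 0, 1}
    eigs.append(np.linalg.eigvals(A))
eigs = np.concatenate(eigs)

plt.figure(figsize=(6, 6))
plt.scatter(eigs.real, eigs.imag, s=0.2, alpha=0.2)
plt.xlabel("Re")
plt.ylabel("Im")
plt.title("Eigenvalues of random 5x5 matrices with entries in {-1, 0, 1}")
plt.show()
```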

