Hessenberg Matrix
Recently Published Documents


TOTAL DOCUMENTS

47
(FIVE YEARS 11)

H-INDEX

8
(FIVE YEARS 0)

2022 ◽  
Vol 48 (1) ◽  
pp. 1-36
Author(s):  
Mirko Myllykoski

The QR algorithm is one of the three phases in the process of computing the eigenvalues and the eigenvectors of a dense nonsymmetric matrix. This paper describes a task-based QR algorithm for reducing an upper Hessenberg matrix to real Schur form. The task-based algorithm also supports generalized eigenvalue problems (QZ algorithm), but this paper concentrates on the standard case. The task-based algorithm adopts previous algorithmic improvements, such as tightly-coupled multi-shifts and Aggressive Early Deflation (AED), and also incorporates several new ideas that significantly improve the performance. This includes, but is not limited to, the elimination of several synchronization points, the dynamic merging of previously separate computational steps, the shortening and the prioritization of the critical path, and experimental GPU support. The task-based implementation is demonstrated to be multiple times faster than multi-threaded LAPACK and ScaLAPACK in both single-node and multi-node configurations on two different machines based on Intel and AMD CPUs. The implementation is built on top of the StarPU runtime system and is part of the open-source StarNEig library.
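As a minimal sketch of the two phases the abstract refers to (Hessenberg reduction followed by the QR iteration to real Schur form), the snippet below uses SciPy's LAPACK-backed routines. It is an illustration only, not the task-based StarNEig implementation.

```python
import numpy as np
from scipy.linalg import hessenberg, schur

# Minimal sketch of the reduction chain described in the abstract, using
# SciPy's LAPACK-backed routines; NOT the task-based StarNEig implementation.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))        # dense nonsymmetric matrix

H, Q = hessenberg(A, calc_q=True)      # phase 1: A = Q H Q^T, H upper Hessenberg
T, Z = schur(H, output="real")         # phase 2: H = Z T Z^T, T real Schur form

# Both steps are orthogonal similarity transforms, so the eigenvalues of A
# appear on the (block) diagonal of the quasi-triangular T.
assert np.allclose(Q @ H @ Q.T, A)
assert np.allclose(Z @ T @ Z.T, H)
```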


2021 ◽  
Vol 27 (4) ◽  
pp. 116-121
Author(s):  
Milica Anđelic ◽  
Carlos M. da Fonseca

In this short note we propose two determinantal representations for the number of subsequences without isolated odd terms. One is based on a tridiagonal matrix and the other on a Hessenberg matrix. We also establish a new explicit formula for the terms of this sequence based on Chebyshev polynomials of the second kind.
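The note's specific tridiagonal and Hessenberg matrices are not reproduced in the abstract; as a generic, hedged illustration of the connection it invokes between determinants and Chebyshev polynomials of the second kind, the sketch below uses the standard fact that U_n(x) equals the determinant of the n×n tridiagonal matrix with 2x on the diagonal and ones on both off-diagonals.

```python
import numpy as np

def u_via_det(n: int, x: float) -> float:
    """U_n(x) as the determinant of the n x n tridiagonal matrix with
    2x on the diagonal and ones on both off-diagonals (U_0 = 1)."""
    if n == 0:
        return 1.0
    T = (np.diag(np.full(n, 2.0 * x))
         + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1))
    return float(np.linalg.det(T))

def u_via_recurrence(n: int, x: float) -> float:
    """Three-term recurrence U_0 = 1, U_1 = 2x, U_{k+1} = 2x U_k - U_{k-1}."""
    prev, cur = 1.0, 2.0 * x
    if n == 0:
        return prev
    for _ in range(n - 1):
        prev, cur = cur, 2.0 * x * cur - prev
    return cur

# Cross-check the determinantal and recurrence definitions for a few degrees.
for n in range(6):
    assert np.isclose(u_via_det(n, 0.7), u_via_recurrence(n, 0.7))
```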


Author(s):  
R. M. Peleshchak ◽  
V. V. Lytvyn ◽  
O. I. Cherniak ◽  
I. R. Peleshchak ◽  
M. V. Doroshenko

Context. To reduce the computational time in problems of diagnosing and recognizing distorted images with a fully connected stochastic pseudospin neural network, it becomes necessary to thin out the synaptic connections between neurons; this is achieved by diagonalizing the matrix of synaptic connections without losing the interaction between all neurons in the network. Objective. To create an architecture of a stochastic pseudospin neural network with diagonal synaptic connections, without losing the interaction between all neurons in the layer, in order to reduce its learning time. Method. The paper uses the Householder method, a method of compressing input images based on the diagonalization of the matrix of synaptic connections, and the computer mathematics system MATLAB for converting a fully connected neural network into a tridiagonal form with hidden synaptic connections between all neurons. Results. We developed a model of a stochastic neural network architecture with sparse renormalized synaptic connections that takes the deleted synaptic connections into account. Based on the transformation of the synaptic connection matrix of a fully connected neural network into a Hessenberg matrix with tridiagonal synaptic connections, we proposed a renormalized local Hebb rule. Using the computer mathematics system Wolfram Mathematica 11.3, we calculated, as a function of the number of neurons N, the tuning time of synaptic connections (per iteration) in a stochastic pseudospin neural network with a tridiagonal connection matrix relative to the tuning time of synaptic connections (per iteration) in a fully connected neural network. Conclusions. We found that, as the number of neurons increases, the tuning time of synaptic connections (per iteration) in a stochastic pseudospin neural network with a tridiagonal connection matrix, relative to the tuning time in a fully connected neural network, decreases according to a hyperbolic law. Depending on the orientation of the pseudospin neurons, we proposed a classification of the renormalized neural network into ferromagnetic, antiferromagnetic, and dipole-glass structures.
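The paper's renormalized Hebb rule and pseudospin dynamics are not reproduced here; the following is only a sketch of the matrix step the abstract describes (Householder reduction of a symmetric coupling matrix, for which the Hessenberg form is tridiagonal), with a hypothetical random symmetric matrix standing in for the synaptic connection matrix.

```python
import numpy as np
from scipy.linalg import hessenberg

# Sketch of the matrix step only: Householder (Hessenberg) reduction of a
# symmetric "synaptic" coupling matrix, which for symmetric input yields a
# tridiagonal matrix. The paper's renormalized Hebb rule is not reproduced.
rng = np.random.default_rng(1)
N = 8
W = rng.standard_normal((N, N))
W = (W + W.T) / 2                       # hypothetical symmetric connection matrix

T, Q = hessenberg(W, calc_q=True)       # W = Q T Q^T with T tridiagonal

# Everything outside the tridiagonal band is zero up to rounding error,
# so only nearest-neighbour couplings remain in the transformed basis.
off_band = T - np.triu(np.tril(T, 1), -1)
print(np.max(np.abs(off_band)))         # ~1e-15
```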


2021 ◽  
Vol 37 ◽  
pp. 193-210
Author(s):  
Alberto Borobia ◽  
Roberto Canogar

In recent years, there has been a growing interest in companion matrices. Sparse companion matrices are well known: every sparse companion matrix is equivalent to a Hessenberg matrix of a particular simple type. Recently, Deaett et al. [Electron. J. Linear Algebra, 35:223--247, 2019] started the systematic study of nonsparse companion matrices. They proved that every nonsparse companion matrix is nonderogatory, although not necessarily equivalent to a Hessenberg matrix. In this paper, the nonsparse companion matrices which are unit Hessenberg are described. In a companion matrix, the variables are the coordinates of the characteristic polynomial with respect to the monomial basis. A PB-companion matrix is a generalization, in the sense that the variables are the coordinates of the characteristic polynomial with respect to a general polynomial basis. The literature provides examples with Newton basis, Chebyshev basis, and other general orthogonal bases. Here, the PB-companion matrices which are unit Hessenberg are also described.
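As a hedged illustration of the classical sparse case that the paper generalizes, the sketch below builds the standard (monomial-basis) companion matrix with scipy.linalg.companion and checks that it is unit upper Hessenberg with the expected eigenvalues; the polynomial is a made-up example.

```python
import numpy as np
from scipy.linalg import companion

# The standard companion matrix of a monic polynomial is unit upper Hessenberg,
# and its eigenvalues are the polynomial's roots.
coeffs = [1.0, -6.0, 11.0, -6.0]        # x^3 - 6x^2 + 11x - 6 = (x-1)(x-2)(x-3)
C = companion(coeffs)

# Unit upper Hessenberg: zeros below the subdiagonal, ones on the subdiagonal.
assert np.allclose(np.tril(C, -2), 0)
assert np.allclose(np.diag(C, -1), 1)

# The characteristic polynomial of C is the original polynomial.
assert np.allclose(np.sort(np.linalg.eigvals(C)), [1.0, 2.0, 3.0])
```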


2021 ◽  
Vol 19 (1) ◽  
pp. 505-514
Author(s):  
Zhibin Du ◽  
Carlos M. da Fonseca ◽  
Yingqiu Xu ◽  
Jiahao Ye

Abstract. In this paper, we disprove a remaining conjecture about Bohemian matrices, which concerns the number of distinct determinants of normalized Bohemian upper-Hessenberg matrices.


CAUCHY ◽  
2020 ◽  
Vol 6 (3) ◽  
pp. 140-148
Author(s):  
Nur Khasanah ◽  
Agustin Absari Wahyu Kuntarini

Centrosymmetric matrices find application in engineering, particularly where determinant rules are concerned, and computing their determinants calls for an appropriate algorithm. Algorithms for the determinant of a Hessenberg matrix can therefore be used to compute the determinant of a centrosymmetric matrix more efficiently. This paper uses the determinant algorithms for lower Hessenberg and sparse Hessenberg matrices to construct an efficient algorithm for the determinant of a centrosymmetric matrix. By exploiting the special structure of centrosymmetric matrices, the resulting determinant algorithm takes advantage of their particular characteristics.
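The paper's centrosymmetric-specific algorithm is not reproduced in the abstract; as a sketch of the kind of Hessenberg determinant recurrence it builds on, the following implements the classical O(n²) expansion for a lower Hessenberg matrix and checks it against a dense determinant on a hypothetical random example.

```python
import numpy as np

def det_lower_hessenberg(H: np.ndarray) -> float:
    """Determinant of a lower Hessenberg matrix (H[i, j] == 0 for j > i + 1)
    via the classical recurrence:

        d_0 = 1
        d_k = H[k,k] d_{k-1}
              + sum_{j=1..k-1} (-1)^(k-j) H[k,j] (prod_{m=j..k-1} H[m,m+1]) d_{j-1}

    (1-indexed), which needs O(n^2) work instead of a full factorization."""
    n = H.shape[0]
    d = [1.0]                                   # d_0 = det of the empty matrix
    for k in range(1, n + 1):
        val = H[k - 1, k - 1] * d[k - 1]
        prod = 1.0
        for j in range(k - 1, 0, -1):
            prod *= H[j - 1, j]                 # running superdiagonal product
            val += (-1) ** (k - j) * H[k - 1, j - 1] * prod * d[j - 1]
        d.append(val)
    return d[n]

# Hypothetical test matrix: random lower Hessenberg (zero above the superdiagonal).
rng = np.random.default_rng(2)
A = np.tril(rng.standard_normal((5, 5)), 1)
assert np.isclose(det_lower_hessenberg(A), np.linalg.det(A))
```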


2020 ◽  
Vol 36 (36) ◽  
pp. 352-366
Author(s):  
Massimiliano Fasi ◽  
Gian Maria Negri Porzio

A matrix is Bohemian if its elements are taken from a finite set of integers. An upper Hessenberg matrix is normalized if all its subdiagonal elements are ones, and hollow if it has only zeros along the main diagonal. All possible determinants of families of normalized and hollow normalized Bohemian upper Hessenberg matrices are enumerated. It is shown that in the case of hollow matrices the maximal determinants are related to a generalization of Fibonacci numbers. Several conjectures recently stated by Corless and Thornton follow from these results.
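As a hedged, brute-force illustration of the enumeration being described (not the paper's method or its exact families), the sketch below lists the distinct determinants of small normalized and hollow normalized Bohemian upper Hessenberg matrices, assuming the population {-1, 0, 1}.

```python
import itertools
import numpy as np

def distinct_determinants(n: int, population=(-1, 0, 1), hollow=False) -> set:
    """Brute-force the distinct determinants of n x n normalized Bohemian
    upper Hessenberg matrices: unit subdiagonal, entries on and above the
    diagonal drawn from `population` (diagonal forced to zero if hollow)."""
    free = [(i, j) for i in range(n)
            for j in range(i + 1 if hollow else i, n)]   # free positions
    base = np.diag(np.ones(n - 1), -1)                   # unit subdiagonal
    dets = set()
    for values in itertools.product(population, repeat=len(free)):
        M = base.copy()
        for (i, j), v in zip(free, values):
            M[i, j] = v
        dets.add(int(round(np.linalg.det(M))))           # integer determinants
    return dets

# Tiny 3 x 3 example; the paper enumerates these families in general.
print(sorted(distinct_determinants(3)))
print(sorted(distinct_determinants(3, hollow=True)))
```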


Symmetry ◽  
2020 ◽  
Vol 12 (3) ◽  
pp. 333
Author(s):  
Pranab Kumar Dhar ◽  
Azizul Hakim Chowdhury ◽  
Takeshi Koshiba

Digital watermarking has been widely utilized for ownership protection of multimedia content. This paper introduces a blind symmetric audio watermarking algorithm based on the parametric Slant-Hadamard transform (PSHT) and Hessenberg decomposition (HD). In the proposed algorithm, the watermark image is first preprocessed to enhance security. Then, the host signal is divided into non-overlapping frames and the samples of each frame are reshaped into a square matrix. Next, PSHT is performed on each square matrix individually, a part of the transformed matrix of size m×m is selected, and HD is applied to it. The Euclidean norm is calculated from the first column of the Hessenberg matrix and is used for embedding and extracting the watermark. Simulation results confirm the imperceptibility of the proposed method for watermarked audio. Moreover, the proposed algorithm is demonstrated to be highly robust against numerous attacks. Furthermore, comparative analysis substantiates its superiority over other state-of-the-art methods.
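The parametric Slant-Hadamard transform is not available in standard libraries, so the sketch below illustrates only the Hessenberg-decomposition step and the first-column Euclidean norm the abstract mentions; the block size and data are hypothetical stand-ins for a PSHT-transformed audio frame.

```python
import numpy as np
from scipy.linalg import hessenberg

# Sketch of the decomposition step only: a random m x m block stands in for
# the PSHT-transformed audio frame (the PSHT itself is not reproduced here).
rng = np.random.default_rng(3)
m = 8                                   # hypothetical block size
block = rng.standard_normal((m, m))

H, P = hessenberg(block, calc_q=True)   # Hessenberg decomposition: block = P H P^T
feature = np.linalg.norm(H[:, 0])       # Euclidean norm of the first column of H

# The abstract uses this scalar as the quantity that is perturbed to embed
# (and later inspected to extract) the watermark.
print(feature)
```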


2020 ◽  
Vol 15 ◽  
pp. 5
Author(s):  
Björn Gustafsson ◽  
Mihai Putinar

The exponential orthogonal polynomials encode, via the theory of hyponormal operators, a shade function g supported by a bounded planar shape. We prove, under natural regularity assumptions, that these complex polynomials satisfy a three-term relation if and only if the underlying shape is an ellipse carrying uniform black on white. More generally, we show that a finite-term relation among these orthogonal polynomials holds if and only if the first row in the associated Hessenberg matrix has finite support. This rigidity phenomenon is in sharp contrast with the theory of classical complex orthogonal polynomials. On the function theory side, we offer an effective way, based on the Cauchy transforms of g, z̄g, …, z̄^d g, to decide whether a (d + 2)-term relation among the exponential orthogonal polynomials exists; in that case we indicate how the shade function g can be reconstructed from a resulting polynomial of degree d and the Cauchy transform of g. A discussion of the relevance of the main concepts in Hele-Shaw dynamics completes the article.


2019 ◽  
Vol 30 (08) ◽  
pp. 1279-1300
Author(s):  
Sraban Kumar Mohanty ◽  
G. Sajith

Reduction of an [Formula: see text] nonsymmetric matrix to Hessenberg form, which takes [Formula: see text] flops and [Formula: see text] I/Os, is a major performance bottleneck in computing its eigenvalues. Usually, to improve performance, this Hessenberg reduction is computed in two steps: the first reduces the matrix to a banded Hessenberg form, and the second further reduces it to Hessenberg form by incorporating more matrix-matrix operations in the computation. We analyse the two steps of the Hessenberg reduction problem on the external memory model (of Aggarwal and Vitter) for their I/O complexities. We propose and analyse a tile-based algorithm for the first step of the reduction and show that it takes [Formula: see text] I/Os. For the reduction of a banded Hessenberg matrix of bandwidth [Formula: see text] to Hessenberg form, we propose an algorithm that uses tight packing of bulges and requires only [Formula: see text] I/Os. Combining the results of the two steps, we show that the Hessenberg reduction can be performed in [Formula: see text] I/Os when [Formula: see text] is sufficiently large. To the best of our knowledge, the proposed algorithm is the first I/O-optimal algorithm for Hessenberg reduction; its worst-case I/O complexity matches the known lower bound of the problem.

