Block Matrices
Recently Published Documents


TOTAL DOCUMENTS: 282 (FIVE YEARS: 50)

H-INDEX: 20 (FIVE YEARS: 2)

Author(s): Jean-Christophe Bourin, Eun-Young Lee

We prove the operator norm inequality, for a positive matrix partitioned into four blocks in [Formula: see text], [Formula: see text] where [Formula: see text] is the diameter of the largest possible disc in the numerical range of [Formula: see text]. This shows that the inradius [Formula: see text] satisfies [Formula: see text] Several eigenvalue inequalities are derived. In particular, if [Formula: see text] is a normal matrix whose spectrum lies in a disc of radius [Formula: see text], the third eigenvalue of the full matrix is bounded by the second eigenvalue of the sum of the diagonal blocks, [Formula: see text] We think that [Formula: see text] is optimal, and we propose a conjecture related to a norm inequality of Hayashi.
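
For orientation, a notational sketch of the objects named in this abstract, assuming the standard 2x2-block convention; the paper's own formulas are suppressed above as "[Formula: see text]" and are not reconstructed here.

```latex
% Standard notation (assumed, not taken from the paper): a positive matrix
% partitioned into four blocks, the numerical range of the off-diagonal
% block X, and the inradius of that numerical range.
\[
  H = \begin{bmatrix} A & X \\ X^{*} & B \end{bmatrix} \ge 0,
  \qquad
  W(X) = \{\, x^{*} X x \;:\; \|x\| = 1 \,\},
\]
\[
  r(X) = \sup \{\, \rho \ge 0 \;:\; \text{some disc of radius } \rho
  \text{ is contained in } W(X) \,\},
\]
% so the "diameter of the largest possible disc" in W(X) equals 2 r(X).
```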


2021, Vol 20, pp. 625-629
Author(s): Ahmad Abu Rahma, Aliaa Burqan, Özen Özer

Matrix theory is widely used in many fields, such as engineering, architecture, physics, chemistry, computer science, and IT, as well as in mathematics, and many theoretical results deal with the structure of matrices even though the topic may appear easy to work with. That is why many scientists still study open problems in matrix theory. In this paper, generalizations of the arithmetic-geometric mean inequality are presented for singular values related to block matrices. Singular value inequalities are also given for sums, products, and direct sums of matrices.
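
For context, the classical arithmetic-geometric mean inequality for singular values that results of this type generalize is the Bhatia-Kittaneh form; the paper's own block-matrix generalizations are not reproduced here.

```latex
% Bhatia-Kittaneh arithmetic-geometric mean inequality for singular values:
% for all n x n complex matrices A and B and every index j = 1, ..., n,
\[
  s_{j}\bigl( A B^{*} \bigr) \;\le\; \tfrac{1}{2}\, s_{j}\bigl( A^{*}A + B^{*}B \bigr),
\]
% where s_1 >= s_2 >= ... denote the singular values in decreasing order.
```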


2021, Vol 13 (16), pp. 3196
Author(s): Wei Liu, Chengxun He, Le Sun

During the imaging process, hyperspectral images (HSIs) are inevitably affected by various types of noise, such as Gaussian noise, impulse noise, stripes, or deadlines. As one of the pre-processing steps, the removal of mixed noise from HSI has a vital impact on subsequent applications, and it is also one of the most challenging tasks. In this paper, a novel spectral-smoothness and non-local self-similarity regularized subspace low-rank learning (termed SNSSLrL) method is proposed for the mixed-noise removal of HSI. First, under the subspace decomposition framework, the original HSI is decomposed into the linear representation of two low-dimensional matrices, namely the subspace basis matrix and the coefficient matrix. To further exploit the essential characteristics of HSI, on the one hand, the basis matrix is modeled with a spectral-smoothness prior, which constrains each column vector of the basis matrix to be a locally continuous spectrum, so that the subspace spanned by its column vectors has continuous properties. On the other hand, the coefficient matrix is divided into several non-local block matrices according to the pixel coordinates of the original HSI data, and block-matching and 4D filtering (BM4D) is employed to reconstruct these self-similar non-local block matrices. Finally, the formulated model, whose terms are all convex, is solved efficiently by the alternating direction method of multipliers (ADMM). Extensive experiments on two simulated datasets and one real dataset verify that the proposed SNSSLrL method outperforms the latest state-of-the-art methods.
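
As a rough illustration of the subspace decomposition step only, here is a minimal NumPy sketch of factoring an HSI cube into a basis matrix and a coefficient matrix; the SVD-based initialization, variable names, and subspace dimension are assumptions, and the spectral-smoothness prior, BM4D filtering, and ADMM solver of the full SNSSLrL model are omitted.

```python
import numpy as np

def subspace_decompose(hsi, k):
    """Factor an HSI cube (H x W x bands) into a subspace basis matrix E
    (bands x k) and a coefficient matrix Z (k x H*W) so that the unfolded
    data Y (bands x H*W) is approximated by E @ Z.

    This is only a plain SVD-based low-rank factorization; the regularizers
    described in the abstract are not implemented here.
    """
    h, w, bands = hsi.shape
    Y = hsi.reshape(-1, bands).T          # bands x (H*W), one pixel spectrum per column
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    E = U[:, :k]                          # subspace basis matrix (orthonormal columns)
    Z = E.T @ Y                           # coefficient (representation) matrix
    return E, Z

# Example with random data standing in for a noisy HSI cube.
hsi = np.random.rand(64, 64, 100)
E, Z = subspace_decompose(hsi, k=8)
Y_hat = E @ Z                             # low-rank approximation of the unfolded cube
print(E.shape, Z.shape, Y_hat.shape)      # (100, 8) (8, 4096) (100, 4096)
```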

