Low Rank Perturbation of Kronecker Structures without Full Rank

2007 ◽  
Vol 29 (2) ◽  
pp. 496-529 ◽  
Author(s):  
Fernando De Terán ◽  
Froilán M. Dopico

2003 ◽  
Vol 25 (2) ◽  
pp. 495-506 ◽  
Author(s):  
Julio Moro ◽  
Froilán M. Dopico

Author(s):  
Б.М. Глинский ◽  
В.И. Костин ◽  
Н.В. Кучин ◽  
С.А. Соловьев ◽  
В.А. Чеверда

An algorithm for solving systems of linear algebraic equations (SLAEs), based on Gaussian elimination and intended for boundary value problems for the Helmholtz equation in 3D heterogeneous media, is proposed. To solve the SLAEs arising in geophysical applications, a parallel version of the algorithm has been developed that targets heterogeneous high-performance computing systems whose nodes have MPP and SMP architectures. Low-rank approximation, the HSS format, and dynamic distribution of intermediate results among the cluster nodes make it possible to solve problems several times larger than with traditional direct methods, which store the blocks of the L-factor in full rank (Full-Rank, FR). The proposed algorithm also reduces computation time, which is essential for three-dimensional geophysical problems. Numerical experiments confirm these advantages of the proposed low-rank (LR) direct method over FR direct methods, and the viability of the implemented algorithm is demonstrated on model geophysical problems.
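As a rough illustration of the compression idea behind such LR/HSS direct solvers (a minimal sketch, not the authors' implementation: the interaction kernel, the tolerance, and the factor layout are illustrative assumptions), a dense block describing a smooth, well-separated interaction can be replaced by two thin factors obtained from a truncated SVD instead of being stored in full rank:

```python
# Minimal sketch: compress an off-diagonal block to low rank with a truncated SVD,
# the basic building block of HSS / block low-rank direct solvers.
import numpy as np

def compress_block(B, tol=1e-6):
    """Return thin factors U, V with B ~= U @ V, rank chosen by the tolerance."""
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    r = int(np.sum(s > tol * s[0]))          # numerical rank of the block
    return U[:, :r] * s[:r], Vt[:r, :]       # ~2*n*r numbers instead of n*n

# A smooth, well-separated interaction block is numerically low rank
n = 400
x = np.linspace(0.0, 1.0, n)
B = 1.0 / (1.0 + np.abs(x[:, None] - (x[None, :] + 2.0)))
U, V = compress_block(B)
print(U.shape[1], np.linalg.norm(B - U @ V) / np.linalg.norm(B))  # small rank, ~1e-6 error
```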


2021 ◽  
Author(s):  
Ryohei Sasaki ◽  
Katsumi Konishi ◽  
Tomohiro Takahashi ◽  
Toshihiro Furukawa

Abstract This paper deals with a problem of matrix completion in which each column vector of the matrix belongs to a low-dimensional differentiable manifold (LDDM), with the target matrix being high or full rank. To solve this problem, algorithms based on polynomial mapping and matrix-rank minimization (MRM) have been proposed; such methods assume that each column vector of the target matrix is generated as a vector in a low-dimensional linear subspace (LDLS) and mapped to a p-th order polynomial, and that the rank of a matrix whose column vectors are d-th monomial features of target column vectors is deficient. However, a large number of columns and observed values are needed to strictly solve the MRM problem using this method when p is large; therefore, this paper proposes a new method for obtaining the solution by minimizing the rank of the submatrix without transforming the target matrix, so as to obtain high estimation accuracy even when the number of columns is small. This method is based on the assumption that an LDDM can be approximated locally as an LDLS to achieve high completion accuracy without transforming the target matrix. Numerical examples show that the proposed method has a higher accuracy than other low-rank approaches.
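A minimal numerical sketch of the structural assumption these MRM-based approaches exploit (not the paper's algorithm; the manifold, dimensions, and degree below are illustrative): columns drawn from a low-dimensional manifold form a full-rank data matrix, yet the matrix of their monomial features is rank deficient.

```python
# Illustrative sketch: data on a 1-D manifold in R^3 is full rank as a matrix,
# but its degree-2 monomial feature matrix is rank deficient (cos^2 + sin^2 = 1).
import numpy as np
from itertools import combinations_with_replacement

def monomial_features(X, d=2):
    """Stack all monomials of each column's entries up to total degree d."""
    m, n = X.shape
    feats = [np.ones(n)]
    for k in range(1, d + 1):
        for idx in combinations_with_replacement(range(m), k):
            feats.append(np.prod(X[list(idx), :], axis=0))
    return np.vstack(feats)

t = np.random.rand(50)                         # parameter of a 1-D manifold
X = np.vstack([np.cos(t), np.sin(t), t ** 2])  # 3 x 50, columns on a curve
print(np.linalg.matrix_rank(X))                        # 3: data matrix is full rank
print(np.linalg.matrix_rank(monomial_features(X, 2)))  # 9 < 10: lifted matrix is deficient
```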


2021 ◽  
Vol 35 (11) ◽  
pp. 1266-1267
Author(s):  
John Shaeffer

Basic Linear Algebra Subroutines (BLAS) are well-known low-level workhorse subroutines for linear algebra vector-vector, matrix-vector, and matrix-matrix operations on full-rank matrices. With the advent of block low-rank (Rk) full-wave direct solvers, where most blocks of the system matrix are Rk, an extension of the BLAS III matrix-matrix workhorse routine is needed because of the agony of Rk addition. This note outlines the problem BLAS III poses for Rk LU and solve operations and then outlines an alternative approach, which we will call BLAS IV. This approach utilizes the thrill of Rk matrix-matrix multiplication and uses the Adaptive Cross Approximation (ACA) as a methodology to evaluate sums of Rk terms, thereby circumventing the agony of low-rank addition.
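A hedged numerical sketch of that point (not the paper's BLAS IV; a truncated SVD stands in here for the ACA recompression): multiplying two Rk factorizations never forms a full matrix and keeps the rank bounded by the smaller factor rank, while adding them concatenates the factors, so the rank grows and a recompression step is required.

```python
# The "thrill" of Rk multiply: (A1 B1)(A2 B2) = A1 (B1 A2) B2 stays low rank.
# The "agony" of Rk add: factors concatenate, rank grows to r1 + r2, recompress.
import numpy as np

def rk_multiply(A1, B1, A2, B2):
    return A1 @ (B1 @ A2), B2                   # rank of product <= min(r1, r2)

def rk_add(A1, B1, A2, B2, tol=1e-10):
    A = np.hstack([A1, A2])                     # inner dimension grows to r1 + r2 ...
    B = np.vstack([B1, B2])
    Qa, Ra = np.linalg.qr(A)
    Qb, Rb = np.linalg.qr(B.T)
    U, s, Vt = np.linalg.svd(Ra @ Rb.T)         # ... recompress via a small core
    r = int(np.sum(s > tol * s[0]))
    return Qa @ (U[:, :r] * s[:r]), Vt[:r, :] @ Qb.T

n, r = 1000, 8
A1, B1 = np.random.randn(n, r), np.random.randn(r, n)
A2, B2 = np.random.randn(n, r), np.random.randn(r, n)
Am, Bm = rk_multiply(A1, B1, A2, B2)
Aa, Ba = rk_add(A1, B1, A2, B2)
print(Am.shape[1], Aa.shape[1])                 # product keeps rank 8; sum needs rank 16
```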


2021 ◽  
Vol 7 (1) ◽  
Author(s):  
Yi-Ting Chen ◽  
Collin Farquhar ◽  
Robert M. Parrish

Abstract In this work, we present an efficient rank-compression approach for the classical simulation of Kraus decoherence channels in noisy quantum circuits. The approximation is achieved through iterative compression of the density matrix based on its leading eigenbasis during each simulation step without the need to store, manipulate, or diagonalize the full matrix. We implement this algorithm using an in-house simulator and show that the low-rank algorithm speeds up simulations by more than two orders of magnitude over existing implementations of full-rank simulators, and with negligible error in the noise effect and final observables. Finally, we demonstrate the utility of the low-rank method as applied to representative problems of interest by using the algorithm to speed up noisy simulations of Grover’s search algorithm and quantum chemistry solvers.
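The gist of such a rank-compressed simulation can be sketched as follows (a simplified stand-in, not the authors' simulator; the channel, state, and truncation rule are illustrative assumptions): the density matrix is kept as a thin factor, each Kraus operator acts on that factor, and the result is truncated back to its leading eigenbasis so the full matrix is never stored.

```python
# Sketch: evolve rho ~= L @ L^dagger through a Kraus channel in factored form.
import numpy as np

def apply_channel_low_rank(L, kraus_ops, max_rank):
    """rho' = sum_i K_i rho K_i^dagger, returned as a rank-truncated factor."""
    M = np.hstack([K @ L for K in kraus_ops])   # exact factor of the output state
    U, s, _ = np.linalg.svd(M, full_matrices=False)
    r = min(max_rank, int(np.sum(s > 1e-12)))
    return U[:, :r] * s[:r]                     # rho' ~= L' @ L'^dagger

# Toy example: single-qubit depolarizing channel acting on the pure state |0><0|
p = 0.1
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Y = np.array([[0.0, -1.0j], [1.0j, 0.0]])
Z = np.diag([1.0, -1.0])
kraus = [np.sqrt(1 - 3 * p / 4) * I2, np.sqrt(p / 4) * X,
         np.sqrt(p / 4) * Y, np.sqrt(p / 4) * Z]
L = np.array([[1.0], [0.0]])                    # rank-1 factor of |0><0|
L = apply_channel_low_rank(L, kraus, max_rank=2)
rho = L @ L.conj().T
print(np.round(rho.real, 3), np.trace(rho).real)  # diag(0.95, 0.05), trace 1
```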


2020 ◽  
Vol 39 (3) ◽  
pp. 3401-3412
Author(s):  
Yong Peng ◽  
Leijie Zhang ◽  
Wanzeng Kong ◽  
Feiwei Qin ◽  
Jianhai Zhang

Subspace learning aims to obtain a low-dimensional representation of high-dimensional data in order to facilitate subsequent data storage and processing. Graph-based subspace learning is a class of effective subspace learning methods that model the data manifold with a graph, and it can be included in the general spectral regression (SR) framework. By using the least-squares regression form as the objective function, spectral regression avoids performing eigen-decomposition on dense matrices and has excellent flexibility. Recently, spectral regression has obtained promising performance in diverse applications; however, it does not take the underlying class/task correlation patterns of the data into consideration. In this paper, we propose to improve the performance of spectral regression by exploring the correlation among classes with low-rank modeling. The newly formulated low-rank spectral regression (LRSR) model is obtained by decomposing the projection matrix in SR into two factor matrices, each of which is regularized separately. The LRSR objective function can be handled by the alternating direction optimization framework. Besides some analysis of the differences between LRSR and existing related models, we conduct extensive experiments comparing LRSR with its full-rank counterpart on benchmark data sets, and the results demonstrate its superiority.
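A simplified sketch of the LRSR idea (not the paper's alternating-direction solver; the alternating ridge-style updates, regularization weights, and toy data are assumptions made here for illustration): the projection matrix is factored as W = A B with both factors regularized, and the two factors are updated in turn.

```python
# Sketch: min_{A,B} ||X A B - Y||_F^2 + alpha ||A||_F^2 + beta ||B||_F^2,
# a low-rank factored variant of the spectral-regression least-squares objective.
import numpy as np

def lrsr_fit(X, Y, r, alpha=1e-2, beta=1e-2, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    A = rng.standard_normal((d, r)) * 0.01
    XtX, XtY = X.T @ X, X.T @ Y
    for _ in range(iters):
        # B-step: ridge regression on the r-dimensional features Z = X A
        Z = X @ A
        B = np.linalg.solve(Z.T @ Z + beta * np.eye(r), Z.T @ Y)
        # A-step: solve X^T X A (B B^T) + alpha A = X^T Y B^T column-wise
        # in the eigenbasis of the small r x r matrix B B^T
        lam, V = np.linalg.eigh(B @ B.T)
        R = XtY @ B.T @ V
        A = np.column_stack(
            [np.linalg.solve(lam[j] * XtX + alpha * np.eye(d), R[:, j]) for j in range(r)]
        ) @ V.T
    return A, B

# Toy usage: 100 samples, 20 features, 5 one-hot class indicators, rank-2 projection
n, d, c = 100, 20, 5
X = np.random.randn(n, d)
Y = np.eye(c)[np.random.randint(0, c, n)]
A, B = lrsr_fit(X, Y, r=2)
W = A @ B
print(W.shape, np.linalg.matrix_rank(W))        # (20, 5), rank 2
```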


2020 ◽  
Author(s):  
Ryohei Sasaki ◽  
Katsumi Konishi ◽  
Tomohiro Takahashi ◽  
Toshihiro Furukawa

Abstract This paper deals with a problem of matrix completion in which each column vector of the matrix belongs to a low-dimensional differentiable manifold (LDDM), with the target matrix being high or full rank. To solve this problem, algorithms based on polynomial mapping and matrix-rank minimization (MRM) have been proposed; such methods assume that each column vector of the target matrix is generated as a vector in a low-dimensional linear subspace (LDLS) and mapped to a p-th order polynomial, and that the rank of a matrix whose column vectors are d-th monomial features of target column vectors is deficient. However, a large number of columns and observed values are needed to strictly solve the MRM problem using this method when p is large; therefore, this paper proposes a new method for obtaining the solution by minimizing the rank of the submatrix without transforming the target matrix, so as to obtain high estimation accuracy even when the number of columns is small. This method is based on the assumption that an LDDM can be approximated locally as an LDLS to achieve high completion accuracy without transforming the target matrix. Numerical examples show that the proposed method has a higher accuracy than other low-rank approaches.

