Stationary Iterative Methods
Recently Published Documents

TOTAL DOCUMENTS: 32 (five years: 5)
H-INDEX: 11 (five years: 0)

Author(s): Dominik Sobania, Jonas Schmitt, Harald Köstler, Franz Rothlauf

Abstract: We introduce GPLS (Genetic Programming for Linear Systems) as a GP system that finds mathematical expressions defining an iteration matrix. Stationary iterative methods use this iteration matrix to solve a system of linear equations numerically. GPLS aims to find iteration matrices with a low spectral radius and a high sparsity, since these properties ensure fast error reduction of the numerical solution method and enable an efficient implementation of the methods on parallel computer architectures. We study GPLS for various types of system matrices and find that it easily outperforms classical approaches like the Gauss–Seidel and Jacobi methods. GPLS not only finds iteration matrices with a much lower spectral radius for linear systems, but also finds iteration matrices for problems where classical approaches fail. Additionally, solutions found by GPLS for small problem instances also show good performance on larger instances of the same problem.
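The connection between the iteration matrix and convergence that the abstract relies on can be sketched briefly. This is not GPLS itself, only a minimal illustration of the underlying stationary scheme z_{k+1} = M z_k + f, which converges iff the spectral radius rho(M) < 1; the classical Jacobi method is used as the example, and the test matrix A is an assumed toy instance:

```python
import numpy as np

# A stationary iterative method applies z_{k+1} = M z_k + f and
# converges iff the spectral radius rho(M) < 1. For the classical
# Jacobi method on A z = b, the iteration matrix is M = I - D^{-1} A,
# where D is the diagonal of A.

def jacobi_iteration_matrix(A):
    """Return the Jacobi iteration matrix M and D^{-1}."""
    D_inv = np.diag(1.0 / np.diag(A))
    M = np.eye(A.shape[0]) - D_inv @ A
    return M, D_inv

def spectral_radius(M):
    return max(abs(np.linalg.eigvals(M)))

# A strictly diagonally dominant toy matrix, so Jacobi is guaranteed
# to converge (rho(M) < 1).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])

M, D_inv = jacobi_iteration_matrix(A)
f = D_inv @ b
rho = spectral_radius(M)

z = np.zeros(3)
for _ in range(100):
    z = M @ z + f          # the stationary iteration itself

print(rho < 1, np.allclose(A @ z, b))
```

A lower rho(M) means the error contracts faster per step, which is why GPLS searches for iteration matrices with small spectral radius.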


2021, Vol. 4 (1), pp. 53-61
Author(s): KJ Audu, YA Yahaya, KR Adeboye, UY Abubakar

Given any linear stationary iterative method of the form z^(i+1) = Jz^(i) + f, where J is the iteration matrix, a significant improvement of the iteration matrix decreases the spectral radius and enhances the rate of convergence of the method when solving systems of linear equations of the form Az = b. This motivates us to refine the Extended Accelerated Over-Relaxation (EAOR) method into the Refinement of Extended Accelerated Over-Relaxation (REAOR) method, so as to accelerate its convergence rate. In this paper, a refinement of the Extended Accelerated Over-Relaxation method that reduces the spectral radius, compared to the EAOR method, is proposed. The method is a three-parameter generalization of the refinement of Accelerated Over-Relaxation (RAOR), refinement of Successive Over-Relaxation (RSOR), refinement of Gauss-Seidel (RGS), and refinement of Jacobi (RJ) methods. We investigated the convergence of the method for weak irreducible diagonally dominant matrices and presented some numerical examples to check the performance of the method. The results indicate the superiority of the method over some existing methods.
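Why refinement lowers the spectral radius can be seen in the simplest member of the family above, the refinement of Jacobi (RJ). A minimal sketch, assuming the standard refinement construction of composing one extra sweep (the REAOR iteration matrix itself is more involved; this only illustrates the mechanism):

```python
import numpy as np

# Composing one extra sweep turns z_{k+1} = J z_k + f into
# z_{k+1} = J (J z_k + f) + f = J^2 z_k + (I + J) f,
# so the effective iteration matrix is J^2, and
# rho(J^2) = rho(J)^2 < rho(J) whenever 0 < rho(J) < 1.

A = np.array([[5.0, 2.0, 1.0],
              [2.0, 6.0, 2.0],
              [1.0, 2.0, 7.0]])   # assumed toy matrix, diagonally dominant
b = np.array([1.0, 1.0, 1.0])

D_inv = np.diag(1.0 / np.diag(A))
J = np.eye(3) - D_inv @ A          # Jacobi iteration matrix
f = D_inv @ b

rho = max(abs(np.linalg.eigvals(J)))
rho_refined = max(abs(np.linalg.eigvals(J @ J)))
print(rho_refined < rho)           # refinement shrinks the spectral radius

z = np.zeros(3)
for _ in range(50):
    z = J @ (J @ z + f) + f        # one refined step = two Jacobi sweeps

print(np.allclose(A @ z, b))
```

The same principle, applied to the three-parameter EAOR splitting instead of the Jacobi splitting, is what REAOR exploits.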


2021, Vol. 89, pp. 87-98
Author(s): Ashish Kumar Nandi, Vaibhav Shekhar, Nachiketa Mishra, Debasisha Mishra

Author(s): Jakub Kierzkowski

We present SOR-like methods and highlight some of their known properties. We give the SOR-like method proposed by Z. Woźnicki and propose two similar methods based upon it. All three are stationary iterative methods for solving the Sylvester equation AX - XB = C. We formulate two sufficient conditions under which one of those methods converges. In addition, we present a modification method based on the following fact: if X is a solution of AX - XB = C, it is also a solution of (A - αIm)X - X(B - αIn) = C. We also present numerical experiments to illustrate the theoretical results and some properties of the methods.
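The shift fact quoted above follows because the two α-terms cancel: (A - αI)X - X(B - αI) = AX - XB - αX + αX = AX - XB = C. A short numerical check, assuming randomly generated A, B, C and solving the Sylvester equation via the standard Kronecker-product vectorization (not via the SOR-like methods of the paper):

```python
import numpy as np

# If X solves AX - XB = C, the same X solves
# (A - a I_m) X - X (B - a I_n) = C, since the shifted terms cancel.

rng = np.random.default_rng(0)
m, n = 4, 3
A = rng.standard_normal((m, m))
B = rng.standard_normal((n, n))
C = rng.standard_normal((m, n))

# Solve AX - XB = C by vectorization (column-major vec):
# vec(AX) = (I_n kron A) vec(X),  vec(XB) = (B^T kron I_m) vec(X).
K = np.kron(np.eye(n), A) - np.kron(B.T, np.eye(m))
X = np.linalg.solve(K, C.flatten(order="F")).reshape((m, n), order="F")

alpha = 2.5   # arbitrary shift
lhs_shifted = (A - alpha * np.eye(m)) @ X - X @ (B - alpha * np.eye(n))

print(np.allclose(A @ X - X @ B, C), np.allclose(lhs_shifted, C))
```

This invariance is what lets the modification method pick a shift α that improves the iteration without changing the solution.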


2012, Vol. 4 (04), pp. 473-482
Author(s): Qun Lin, Wujian Peng

Abstract: An acceleration scheme based on stationary iterative methods is presented for solving linear systems of equations. Unlike the Chebyshev semi-iterative method, which requires an accurate estimate of the bounds on the iteration matrix eigenvalues, we use a wide range of Chebyshev-like polynomials for the acceleration process without estimating these bounds. A detailed error analysis is presented and convergence rates are obtained. Numerical experiments are carried out, and comparisons with the classical Jacobi and Chebyshev semi-iterative methods are provided.
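The classical baseline mentioned in the abstract can be sketched to show both its speedup and its drawback. This is the standard Chebyshev semi-iteration (three-term recurrence, Golub-Varga form) applied to the Jacobi splitting of an assumed 1-D Laplacian test problem; note that it needs the spectral bound rho up front, exactly the requirement the proposed scheme avoids:

```python
import numpy as np

# Chebyshev semi-iteration accelerates z_{k+1} = M z_k + f when the
# eigenvalues of M are real and lie in [-rho, rho]; rho must be known.

n = 8
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # 1-D Laplacian
b = np.ones(n)

D_inv = np.diag(1.0 / np.diag(A))
M = np.eye(n) - D_inv @ A                                # Jacobi matrix
f = D_inv @ b
rho = max(abs(np.linalg.eigvals(M)))                     # assumed known

iters = 60

# Plain Jacobi, for comparison.
z = np.zeros(n)
for _ in range(iters + 1):
    z = M @ z + f

# Chebyshev acceleration: three-term recurrence with weights
# omega_{k+1} = 1 / (1 - (rho^2 / 4) * omega_k), starting from omega = 2.
x_prev = np.zeros(n)
x = M @ x_prev + f
omega = 2.0
for _ in range(iters):
    omega = 1.0 / (1.0 - 0.25 * rho**2 * omega)
    x, x_prev = omega * (M @ x + f - x_prev) + x_prev, x

x_true = np.linalg.solve(A, b)
err_jacobi = np.linalg.norm(z - x_true)
err_cheb = np.linalg.norm(x - x_true)
print(err_cheb < err_jacobi)
```

After the same number of matrix-vector products, the accelerated iterate is far closer to the true solution than plain Jacobi, but a poor estimate of rho degrades or destroys this gain, which motivates bound-free acceleration.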

