Collocation polynomial neural forms and domain fragmentation for solving initial value problems

Author(s): Toni Schneidereit, Michael Breuß

Abstract: Several neural network approaches for solving differential equations employ trial solutions with a feedforward neural network. There are different means of incorporating the trial solution in the construction; for instance, one may include it directly in the cost function. Used within the corresponding neural network, the trial solutions define the so-called neural form. Such neural forms represent general, flexible tools by which one may solve various differential equations. In this article, we consider time-dependent initial value problems, which require the neural form framework to be set up adequately. The neural forms presented in the literature so far for such a setting can be considered as first-order polynomials. In this work, we propose to extend the polynomial order of the neural forms. The novel collocation-type construction includes several feedforward neural networks, one for each order. Additionally, we propose the fragmentation of the computational domain into subdomains. The neural forms are solved on each subdomain, with the interfacing grid points overlapping in order to provide initial values over the whole fragmentation. We illustrate in experiments that the combination of higher-order collocation neural forms and domain fragmentation allows initial value problems to be solved over large domains with high accuracy and reliability.
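
As a rough orientation to the neural form idea described above, the sketch below implements the classical first-order construction $u_T(t) = u_0 + t\,N(t)$ for a scalar IVP $u'(t) = f(t, u)$, $u(0) = u_0$, trained on collocation points. The right-hand side, network size and optimizer settings are illustrative assumptions; the article's actual contribution (higher polynomial order with one network per order, plus overlapping domain fragmentation) builds on this basic scheme.

```python
import torch

u0 = 1.0
f = lambda t, u: -2.0 * u                       # illustrative right-hand side (assumption)

# small feedforward network N(t; p)
net = torch.nn.Sequential(
    torch.nn.Linear(1, 20), torch.nn.Tanh(), torch.nn.Linear(20, 1)
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

# collocation (training) points on the domain [0, 2]
t = torch.linspace(0.0, 2.0, 50).reshape(-1, 1).requires_grad_(True)

for _ in range(5000):
    opt.zero_grad()
    u_trial = u0 + t * net(t)                   # first-order neural form, u_trial(0) = u0
    du = torch.autograd.grad(u_trial, t, grad_outputs=torch.ones_like(u_trial),
                             create_graph=True)[0]
    loss = torch.mean((du - f(t, u_trial)) ** 2)  # ODE residual at the collocation points
    loss.backward()
    opt.step()
```

The domain fragmentation described in the abstract would wrap such a training loop in a loop over subdomains, using the trial solution's value at the right end of one subdomain as the initial value $u_0$ of the next.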

Author(s): J. O. Kuboye, O. F. Quadri, O. R. Elusakin

In this work, numerical methods for solving third-order initial value problems of ordinary differential equations are developed. Multi-step collocation is used in deriving the methods, where a power series approximate solution is employed as the basis function. A Gaussian elimination approach is used to find the unknown variables $a_j, j=0(1)8$ in the interpolation and collocation equations, which are substituted into the power series to give the continuous implicit schemes. The discrete schemes and their derivatives are derived by evaluating the continuous schemes at grid and non-grid points. These schemes are arranged in matrix form to produce block methods. The order of the developed methods is found to be six. The numerical results demonstrate the efficiency of the methods over existing methods.
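
For orientation, a hedged sketch of the ingredients named above, with nine unknowns $a_0,\dots,a_8$ (the specific interpolation and collocation points used in the paper are not reproduced here):

$$
y(x) \approx \sum_{j=0}^{8} a_j x^j, \qquad
y'''(x) \approx \sum_{j=3}^{8} j(j-1)(j-2)\, a_j x^{j-3} = f\big(x, y, y', y''\big).
$$

Imposing interpolation conditions on $y$ at selected grid points and collocating $y'''$ at grid and off-grid points yields a $9 \times 9$ linear system in the $a_j$, which is solved by Gaussian elimination; substituting the $a_j$ back into the power series gives the continuous implicit scheme, whose evaluation at grid and off-grid points produces the discrete block methods.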


Mathematics, 2019, Vol. 8(1), pp. 32
Author(s): Ravi Agarwal, Snezhana Hristova, Donal O’Regan

In this paper, we study linear Riemann-Liouville fractional differential equations with a constant delay. The initial condition is set up similarly to the case of the ordinary derivative. Explicit formulas for the solutions are obtained for various initial functions.
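
For reference, the Riemann-Liouville derivative of order $q \in (0,1)$ and a generic linear equation with constant delay $\tau > 0$ read

$$
{}^{RL}D_{0^+}^{q}\, y(t) = \frac{1}{\Gamma(1-q)}\, \frac{d}{dt} \int_{0}^{t} (t-s)^{-q}\, y(s)\, ds,
\qquad
{}^{RL}D_{0^+}^{q}\, y(t) = a\, y(t) + b\, y(t-\tau) + g(t), \quad t > 0,
$$

with $y(t) = \varphi(t)$ on $[-\tau, 0]$. This generic form is given only for orientation; the paper's exact formulation of the problem and of the initial condition should be taken from the paper itself.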


2015, Vol. 2015, pp. 1-12
Author(s): Haidong Qu, Xuan Liu

We present a new method for solving the fractional differential equations of initial value problems by using neural networks which are constructed from cosine basis functions with adjustable parameters. By training the neural networks repeatedly the numerical solutions for the fractional differential equations were obtained. Moreover, the technique is still applicable for the coupled differential equations of fractional order. The computer graphics and numerical solutions show that the proposed method is very effective.
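
A minimal sketch in the spirit of the abstract, not the authors' exact algorithm: the solution of a scalar fractional IVP is represented by a cosine-basis expansion with adjustable coefficients, and the coefficients are adjusted by gradient descent on the equation residual. The Grünwald-Letnikov sum standing in for the fractional derivative, the basis size, the learning rate and the right-hand side are all assumptions.

```python
import numpy as np

# Hedged sketch: D^alpha u(t) = f(t, u), u(0) = 0, 0 < alpha < 1, approximated by a
# cosine-basis "network" u(t) ~ sum_k c_k * (cos(k*w*t) - 1) with adjustable c_k.
alpha, w, K = 0.5, 1.0, 6
f = lambda t, u: np.cos(t) - u                 # illustrative right-hand side (assumption)

t = np.linspace(0.0, 1.0, 101)
h = t[1] - t[0]

# Grunwald-Letnikov weights: g_0 = 1, g_k = g_{k-1} * (1 - (alpha + 1) / k)
g = np.ones(len(t))
for k in range(1, len(t)):
    g[k] = g[k - 1] * (1.0 - (alpha + 1.0) / k)

def gl_derivative(u):
    """Grunwald-Letnikov approximation of D^alpha u on the grid t."""
    return np.array([np.dot(g[:n + 1], u[n::-1]) for n in range(len(t))]) / h ** alpha

B = np.cos(np.outer(t, w * np.arange(1, K + 1))) - 1.0  # each basis function vanishes at t = 0
c = np.zeros(K)

# Jacobian of the residual w.r.t. c (for this f, df/du = -1, hence the "+ B" term)
J = np.stack([gl_derivative(B[:, j]) + B[:, j] for j in range(K)], axis=1)

for _ in range(5000):                          # repeated training = gradient descent on the residual
    u = B @ c
    r = gl_derivative(u) - f(t, u)
    c -= 1e-3 * (J.T @ r) / len(t)
```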


2016, Vol. 9(4), pp. 619-639
Author(s): Zhong-Qing Wang, Jun Mu

Abstract: We introduce a multiple-interval Chebyshev-Gauss-Lobatto spectral collocation method for initial value problems of nonlinear ordinary differential equations (ODEs). The method is easy to implement and possesses high-order accuracy. In addition, it is very stable and suitable for long-time calculations. We also obtain the $hp$-version bound on the numerical error of the multiple-interval collocation method in the $H^1$-norm. Numerical experiments confirm the theoretical expectations.
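
A minimal single-equation sketch of the multiple-interval Chebyshev-Gauss-Lobatto collocation idea, assuming the model problem $u' = \lambda u$, $u(0) = 1$ (the paper treats general nonlinear ODEs and proves the $hp$-version error bound in the $H^1$-norm); the differentiation matrix follows the standard Trefethen construction, and the interval count and polynomial degree are arbitrary choices.

```python
import numpy as np

def cheb(N):
    """Chebyshev-Gauss-Lobatto points x_j = cos(pi*j/N) on [-1, 1] and the
    standard (Trefethen) differentiation matrix D."""
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))
    D -= np.diag(D.sum(axis=1))
    return D, x

lam, T, M, N = -1.0, 10.0, 5, 16              # decay rate, final time, #intervals, degree
edges = np.linspace(0.0, T, M + 1)
D, x = cheb(N)                                # note: x[0] = 1, x[N] = -1

u_left = 1.0                                  # initial value on the first interval
ts, us = [], []
for a, b in zip(edges[:-1], edges[1:]):
    t = 0.5 * (b - a) * (x + 1.0) + a         # map CGL nodes to [a, b]; t[N] = a, t[0] = b
    A = D * 2.0 / (b - a) - lam * np.eye(N + 1)     # collocate u' - lam*u = 0
    rhs = np.zeros(N + 1)
    A[N, :] = 0.0; A[N, N] = 1.0; rhs[N] = u_left   # impose u(a) = u_left at the left endpoint
    u = np.linalg.solve(A, rhs)
    ts.append(t[::-1]); us.append(u[::-1])    # store in ascending time order
    u_left = u[0]                             # value at t = b seeds the next interval

err = np.max(np.abs(np.concatenate(us) - np.exp(lam * np.concatenate(ts))))
print(f"max error over [0, {T}]: {err:.2e}")
```

Passing the computed endpoint value from one interval to the next is what makes long-time integration stable in this setting, which is the behavior the abstract highlights.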

