Solving Riccati differential equations with multilayer neural networks

Author(s):  
Shouling He ◽  
K. Reif ◽  
R. Unbehauen


Author(s):  
Natalia Marchenko ◽  
Ganna Sydorenko ◽  
Roman Rudenko

The article studies methods for the numerical solution of systems of differential equations using neural networks. To achieve this goal, the following interdependent tasks were solved: an overview was given of the fields that require solving systems of differential equations, and a method for solving such systems with neural networks was implemented. It is shown that different types of systems of differential equations can be solved by a single method, which requires only a loss function for optimization that is created directly from the differential equations and does not require solving the equations for the highest derivative. The solution of a system of differential equations obtained with a multilayer neural network is a function given in analytical form, which can be differentiated or integrated analytically. In the course of this work, an improved form of the trial solution of a system of differential equations was found, which satisfies the initial conditions by construction but contributes less to the solution error far from the initial conditions than the standard form of such a solution. A way was also found to modify the calculation of the loss function for cases where the solution process stalls in a local minimum, which is caused by the strong dependence of subsequent function values on the accuracy of the previously found values. Among the results, it can be noted that solving a system of differential equations with artificial neural networks may be more accurate than classical numerical methods, but usually takes much longer to achieve comparable results on small problems. The main advantage of using neural networks to solve systems of differential equations is that the solution is obtained in analytical form and can be found not only for individual values of the equations' parameters, but for all parameter values within a limited range.
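The approach the abstract describes can be sketched as follows. This is a minimal, hypothetical illustration (not the authors' code): a tiny network is trained on the test problem y' = -y, y(0) = 1, where the trial solution y_hat(t) = y0 + t·N(t) satisfies the initial condition by construction and the loss is the mean squared residual of the differential equation itself. The network size, collocation grid, and numerical-gradient training loop are all assumptions chosen for brevity.

```python
import numpy as np

# Hypothetical sketch: solve y' = -y, y(0) = 1 with a one-hidden-layer network.
rng = np.random.default_rng(0)
H = 8                                        # hidden neurons (assumed)
params = rng.normal(scale=0.5, size=3 * H)   # flattened [w, b, v]

def net(p, t):
    w, b, v = p[:H], p[H:2 * H], p[2 * H:]
    return np.tanh(np.outer(t, w) + b) @ v   # N(t), single hidden layer

def trial(p, t, y0=1.0):
    return y0 + t * net(p, t)                # y_hat(0) = y0 exactly

def loss(p, t, dt=1e-4):
    # Residual of y' + y = 0, with dy_hat/dt by central differences.
    dy = (trial(p, t + dt) - trial(p, t - dt)) / (2 * dt)
    return np.mean((dy + trial(p, t)) ** 2)

t = np.linspace(0.0, 2.0, 20)                # collocation points
lr, eps = 0.05, 1e-6
for _ in range(2000):                        # gradient descent with a
    g = np.zeros_like(params)                # numerical gradient over params
    for i in range(params.size):
        e = np.zeros_like(params)
        e[i] = eps
        g[i] = (loss(params + e, t) - loss(params - e, t)) / (2 * eps)
    params -= lr * g

print(np.max(np.abs(trial(params, t) - np.exp(-t))))  # error vs. exact e^{-t}
```

Note that, as the abstract says, the result is an analytical function of t that can be evaluated or differentiated anywhere in the training interval, unlike a classical step-by-step solver.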


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Idris Kharroubi ◽  
Thomas Lim ◽  
Xavier Warin

We study the approximation of backward stochastic differential equations (BSDEs for short) with a constraint on the gains process. We first discretize the constraint by applying a so-called facelift operator at the times of a grid. We show that this discretely constrained BSDE converges to the continuously constrained one as the mesh of the grid goes to zero. We then focus on the approximation of the discretely constrained BSDE. For that we adopt a machine learning approach. We show that the facelift can be approximated by an optimization problem over a class of neural networks under constraints on the neural network and its derivative. We then derive an algorithm converging to the discretely constrained BSDE as the number of neurons goes to infinity. We conclude with numerical experiments.


Author(s):  
C. J. Zúñiga-Aguilar ◽  
J. F. Gómez-Aguilar ◽  
H. M. Romero-Ugalde ◽  
R. F. Escobar-Jiménez ◽  
G. Fernández-Anaya ◽  
...  

Mathematics ◽  
2021 ◽  
Vol 9 (11) ◽  
pp. 1159
Author(s):  
Shyam Sundar Santra ◽  
Omar Bazighifan ◽  
Mihai Postolache

In applications in electrodynamics, neural networks, quantum mechanics, electromagnetism, time-symmetric fields, and fluid dynamics, neutral differential equations appear when modeling many problems and phenomena. It is therefore interesting to study the qualitative behavior of solutions of such equations. In this study, we obtain some new sufficient conditions for the oscillation of solutions of second-order delay differential equations with sublinear neutral terms. The results obtained improve and complement the relevant results in the literature. Finally, we provide an example to validate the main results, and an open problem is included.
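For orientation, a representative form of this class of equations is sketched below. This is an assumed, typical shape for second-order delay differential equations with a sublinear neutral term, not necessarily the exact equation studied in the paper.

```latex
% Representative (assumed) second-order neutral delay differential equation:
\[
  \bigl( r(t)\,\bigl( z'(t) \bigr)^{\alpha} \bigr)'
  + q(t)\, x^{\beta}\!\bigl( \sigma(t) \bigr) = 0,
  \qquad
  z(t) = x(t) + p(t)\, x^{\gamma}\!\bigl( \tau(t) \bigr),
\]
% where 0 < \gamma < 1 makes the neutral term sublinear, and
% \tau(t) \le t, \sigma(t) \le t are the delay arguments.
```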

