Forward Euler
Recently Published Documents


TOTAL DOCUMENTS: 37 (FIVE YEARS: 13)

H-INDEX: 6 (FIVE YEARS: 2)

Author(s):  
Yue Li ◽  
Hongjun Cao

In this paper, a discrete-time Hindmarsh-Rose model is obtained by a nonstandard finite difference (NSFD) scheme. Bifurcation behaviors of the model obtained by the forward Euler scheme and of the model obtained by the NSFD scheme are compared. Analytical and numerical comparisons show that many more bifurcations and dynamical behaviors can be obtained and preserved by the NSFD scheme, and that its integration step size can be chosen relatively large owing to better stability and convergence than the forward Euler scheme offers. This means that the discrete-time model obtained by the NSFD scheme is closer to the original continuous system than the discrete-time model obtained by the forward Euler scheme. These results help guarantee reliable numerical results when investigating complex neuron dynamical systems.
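A minimal sketch of the comparison, assuming standard Hindmarsh-Rose parameter values, a Mickens-type denominator function and a nonlocal treatment of the nonlinear terms; the paper's exact NSFD scheme may differ.

```python
import numpy as np

# Illustrative sketch only: one forward Euler step versus one possible
# NSFD-style step for the Hindmarsh-Rose equations.  The parameters, the
# denominator function phi(h) = 1 - exp(-h) and the nonlocal terms are
# assumptions, not necessarily the scheme analyzed in the paper.
a, b, c, d, s, xR, r, I = 1.0, 3.0, 1.0, 5.0, 4.0, -1.6, 0.006, 3.0

def forward_euler_step(x, y, z, h):
    dx = y - a * x**3 + b * x**2 - z + I
    dy = c - d * x**2 - y
    dz = r * (s * (x - xR) - z)
    return x + h * dx, y + h * dy, z + h * dz

def nsfd_step(x, y, z, h):
    phi = 1.0 - np.exp(-h)                      # Mickens-type denominator function
    # Nonlocal discretization of the nonlinear/stiff terms keeps the update explicit.
    x_new = (x + phi * (y + b * x**2 - z + I)) / (1.0 + phi * a * x**2)
    y_new = (y + phi * (c - d * x**2)) / (1.0 + phi)
    z_new = (z + phi * r * s * (x - xR)) / (1.0 + phi * r)
    return x_new, y_new, z_new

def integrate(step, h, n_steps, state=(-1.6, 4.0, 2.0)):
    x, y, z = map(np.float64, state)
    for _ in range(n_steps):
        x, y, z = step(x, y, z, h)
    return x, y, z

print(integrate(forward_euler_step, h=0.02, n_steps=5000))
print(integrate(nsfd_step, h=0.02, n_steps=5000))
```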


2020 ◽  
Vol 48 (4) ◽  
pp. 987-1003
Author(s):  
Hans Georg Bock ◽  
Jürgen Gutekunst ◽  
Andreas Potschka ◽  
María Elena Suaréz Garcés

Just as the damped Newton method for the numerical solution of nonlinear algebraic problems can be interpreted as forward Euler timestepping on the Newton flow equations, the damped Gauß–Newton method for nonlinear least squares problems is equivalent to forward Euler timestepping on the corresponding Gauß–Newton flow equations. We highlight the advantages of the Gauß–Newton flow and the Gauß–Newton method from a statistical and a numerical perspective in comparison with the Newton method, steepest descent, and the Levenberg–Marquardt method, which are respectively equivalent to Newton flow forward Euler, gradient flow forward Euler, and gradient flow backward Euler. We finally show an unconditional descent property for a generalized Gauß–Newton flow, which is linked to Krylov–Gauß–Newton methods for large-scale nonlinear least squares problems. We provide numerical results for large-scale problems: an academic generalized Rosenbrock function and a real-world bundle adjustment problem from 3D reconstruction based on 2D images.
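A minimal sketch of the equivalence stated above: damped Gauss-Newton as forward Euler time stepping on the Gauss-Newton flow x'(t) = -(J^T J)^{-1} J^T r(x). The residual below (the classic two-dimensional Rosenbrock function in least-squares form) is an illustrative stand-in, not the paper's generalized Rosenbrock or bundle adjustment problems.

```python
import numpy as np

# Rosenbrock residual r(x) and Jacobian J(x); the objective is 0.5 * ||r(x)||^2.
def residual(x):
    return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

def jacobian(x):
    return np.array([[-20.0 * x[0], 10.0],
                     [-1.0,          0.0]])

def damped_gauss_newton(x0, step=0.5, n_iter=100):
    """Forward Euler on the Gauss-Newton flow; `step` is the damping factor."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        r, J = residual(x), jacobian(x)
        dx = np.linalg.solve(J.T @ J, -J.T @ r)   # Gauss-Newton direction
        x = x + step * dx                          # explicit Euler step of size `step`
    return x

print(damped_gauss_newton([-1.2, 1.0]))   # approaches the minimizer (1, 1)
```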


2020 ◽  
Vol 20 (4) ◽  
pp. 717-725 ◽  
Author(s):  
Vidar Thomée

For a spatially periodic convection-diffusion problem, we analyze a time stepping method based on Lie splitting of a spatially semidiscrete finite element solution on time steps of length k, using the backward Euler method for the diffusion part and a stabilized explicit forward Euler approximation on m ≥ 1 intervals of length k/m for the convection part. This complements earlier work on time splitting of the problem in a finite difference context.
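A toy one-dimensional finite difference sketch of this splitting pattern: one backward Euler step for diffusion and m forward Euler substeps of length k/m for convection per time step. The periodic finite differences and the first-order upwinding used as a simple stabilization are assumptions for illustration; the paper's finite element semidiscretization and its stabilization are not reproduced.

```python
import numpy as np

N, L = 128, 2.0 * np.pi
h = L / N
x = np.arange(N) * h
a, nu = 1.0, 0.05            # convection speed and diffusion coefficient (assumed)
k, m = 0.05, 4               # time step length k and number of convection substeps m

# Periodic second-difference matrix and the backward Euler diffusion operator.
shift_p = np.roll(np.eye(N), 1, axis=1)     # picks out u_{i+1}
shift_m = np.roll(np.eye(N), -1, axis=1)    # picks out u_{i-1}
D2 = (shift_p - 2.0 * np.eye(N) + shift_m) / h**2
A_diff = np.eye(N) - k * nu * D2            # (I - k*nu*D2) u^{n+1} = u*

def convection_substeps(u):
    """m forward Euler substeps of length k/m with first-order upwinding (a > 0)."""
    for _ in range(m):
        u = u - (k / m) * a * (u - np.roll(u, 1)) / h
    return u

u = np.exp(-10.0 * (x - np.pi) ** 2)        # smooth periodic initial profile
for _ in range(200):
    u = np.linalg.solve(A_diff, convection_substeps(u))  # one Lie splitting step
```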


2020 ◽  
Vol 0 (0) ◽  
Author(s):  
Martina Bukač ◽  
Catalin Trenchea

We propose a BOundary Update using Resolvent (BOUR) partitioned method, second-order accurate in time and unconditionally stable, for the interaction between a viscous, incompressible fluid and a thin structure. The method is algorithmically similar to the sequential Backward Euler - Forward Euler implementation of the midpoint quadrature rule: (i) the structure and fluid sub-problems are first solved using a Backward Euler scheme, (ii) the velocities of fluid and structure are updated on the boundary via a second-order consistent resolvent operator, and (iii) the structure and fluid sub-problems are solved again, using a Forward Euler scheme. A stability analysis based on energy estimates shows that the scheme is unconditionally stable. Error analysis of the semi-discrete problem yields second-order convergence in time. Two numerical examples confirm the theoretical convergence results and show excellent agreement between the proposed partitioned scheme and the monolithic scheme.
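The full fluid-structure interaction scheme is beyond a short snippet; the sketch below only illustrates the building block named above, the sequential Backward Euler / Forward Euler realization of the midpoint rule, on an assumed scalar test equation y' = lam * y rather than on the coupled problem.

```python
import numpy as np

lam, k, T = -2.0 + 3.0j, 0.05, 5.0   # assumed test problem y' = lam*y, y(0) = 1
n_steps = int(T / k)

def midpoint_via_be_fe(y0):
    y = y0
    for _ in range(n_steps):
        y_mid = y / (1.0 - 0.5 * k * lam)    # Backward Euler half step (implicit solve)
        y = y_mid + 0.5 * k * lam * y_mid    # Forward Euler half step from the midpoint
    return y

y_num = midpoint_via_be_fe(1.0 + 0.0j)
print(abs(y_num - np.exp(lam * T)))          # error decays at second order in k
```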


Entropy ◽  
2020 ◽  
Vol 22 (3) ◽  
pp. 352
Author(s):  
Fengnan Liu ◽  
Yasuhide Fukumoto ◽  
Xiaopeng Zhao

A stable explicit difference scheme based on the forward Euler method is proposed for the Richards equation. To avoid the degeneracy of the Richards equation, we add a perturbation to the functional coefficient of the parabolic term. In addition, we introduce an extra term in the difference scheme to relax the time step restriction and improve the stability condition. With these augmented terms, we prove stability by induction. Numerical experiments demonstrate the validity, accuracy, and efficiency of the scheme.
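A hedged sketch, not the paper's scheme: it only shows the perturbation idea on a simpler degenerate diffusion model u_t = (D(u) u_x)_x with D(u) = u, where the diffusivity is replaced by D(u) + eps and a plain forward Euler update is used under the usual explicit restriction dt <= h^2 / (2 max(D + eps)). The paper's extra stabilization term, which relaxes this restriction, is not reproduced here.

```python
import numpy as np

N, eps = 101, 1e-3
h = 1.0 / (N - 1)
grid = np.linspace(0.0, 1.0, N)
u = np.maximum(0.0, 1.0 - 5.0 * np.abs(grid - 0.5))   # compactly supported hump

def step(u, dt):
    D = u + eps                                  # perturbed (non-degenerate) diffusivity
    D_half = 0.5 * (D[:-1] + D[1:])              # face-centered diffusivity
    flux = D_half * (u[1:] - u[:-1]) / h         # D(u) u_x at the cell faces
    u_new = u.copy()                             # boundary values stay at zero
    u_new[1:-1] += dt * (flux[1:] - flux[:-1]) / h
    return u_new

dt = 0.4 * h**2 / (np.max(u) + eps)              # forward Euler stability bound
for _ in range(2000):
    u = step(u, dt)
```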


2020 ◽  
Vol 36 (1) ◽  
pp. 45-47
Author(s):  
CHEN TAO ◽  
HUANG NAN-JING ◽  
XIAO YI-BIN

In this paper, we establish existence and uniqueness of the solution for a class of parabolic evolutionary quasivariational inequalities in contact mechanics under mild conditions. We also derive an error estimate for the parabolic evolutionary quasivariational inequality by employing the forward Euler difference scheme in time and an element-free Galerkin spatial approximation.
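For intuition only, a projected forward Euler scheme for a much simpler obstacle-type parabolic variational inequality on (0, 1) with homogeneous Dirichlet data; the contact-mechanics quasivariational setting and the element-free Galerkin discretization of the paper are not reproduced.

```python
import numpy as np

N = 101
h = 1.0 / (N - 1)
x = np.linspace(0.0, 1.0, N)
psi = 0.2 - (x - 0.5) ** 2                 # obstacle from below (assumed)
f = -1.0 * np.ones(N)                      # forcing that pushes the solution down
u = np.maximum(np.sin(np.pi * x), psi)     # feasible initial datum
dt = 0.4 * h ** 2                          # forward Euler (explicit) time step restriction

for _ in range(5000):
    lap = np.zeros(N)
    lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h ** 2
    u_free = u + dt * (lap + f)            # unconstrained forward Euler step
    u = np.maximum(u_free, psi)            # project back onto the constraint set
    u[0] = u[-1] = 0.0                     # homogeneous Dirichlet boundary values
```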


Mathematics ◽  
2019 ◽  
Vol 7 (10) ◽  
pp. 935 ◽  
Author(s):  
Simone Fiori

The present paper recalls a formulation of non-conservative system dynamics via the Lagrange–d’Alembert principle, expressed as a generalized Euler–Poincaré form of the system equation on a Lie group. The paper illustrates applications of the generalized Euler–Poincaré equations on the rotation groups to a gyrostat satellite and a quadcopter drone. The numerical solution of the dynamical equations on the rotation groups is tackled via a generalized forward Euler method and an explicit Runge–Kutta integration method tailored to Lie groups.
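A minimal sketch of a Lie-group forward Euler step on the rotation group, R_{k+1} = R_k expm(h * hat(omega_k)); the torque-free rigid body and the inertia values below are assumptions for illustration, not the gyrostat or quadcopter models (or the Lie-group Runge–Kutta integrator) of the paper.

```python
import numpy as np
from scipy.linalg import expm

J = np.diag([1.0, 2.0, 3.0])               # principal moments of inertia (assumed)

def hat(w):
    """Map a vector in R^3 to the corresponding skew-symmetric matrix."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def step(R, w, h):
    w_dot = np.linalg.solve(J, np.cross(J @ w, w))  # Euler's equations, zero torque
    R_new = R @ expm(h * hat(w))                    # group update: R stays on SO(3)
    return R_new, w + h * w_dot                     # ordinary forward Euler on omega

R, w, h = np.eye(3), np.array([0.1, 1.0, 0.2]), 0.01
for _ in range(1000):
    R, w = step(R, w, h)
print(np.linalg.norm(R.T @ R - np.eye(3)))          # orthogonality preserved up to expm accuracy
```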


Author(s):  
Amir Gholaminejad ◽  
Kurt Keutzer ◽  
George Biros

Residual neural networks can be viewed as the forward Euler discretization of an Ordinary Differential Equation (ODE) with a unit time step. This has recently motivated researchers to explore other discretization approaches and to train ODE-based networks. However, an important challenge of neural ODEs is their prohibitive memory cost during gradient backpropagation. Recently, a method proposed in arXiv:1806.07366 claimed that this memory overhead can be reduced from O(L Nt), where Nt is the number of time steps and L is the depth of the network, down to O(L) by solving the forward ODE backwards in time. However, we will show that this approach may lead to several problems: (i) it may be numerically unstable for ReLU/non-ReLU activations and general convolution operators, and (ii) the proposed optimize-then-discretize approach may lead to divergent training due to inconsistent gradients for small time step sizes. We discuss the underlying problems, and to address them we propose ANODE, a neural ODE framework that avoids the numerical instability issues noted above. ANODE has a memory footprint of O(L) + O(Nt), with the same computational cost as the reverse ODE solve. We furthermore discuss a memory-efficient algorithm that can further reduce this footprint at the cost of additional computation. We show results on the CIFAR-10/100 datasets using ResNet and SqueezeNext neural networks.
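A minimal sketch of the identification in the first sentence: a residual block x_{k+1} = x_k + f(x_k; theta_k) is a forward Euler step of x'(t) = f(x, t) with unit step size. The two-layer map and random weights are an illustrative stand-in, not the ResNet/SqueezeNext architectures or the ANODE framework itself.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, depth = 16, 8
weights = [(0.1 * rng.standard_normal((dim, dim)),
            0.1 * rng.standard_normal((dim, dim))) for _ in range(depth)]

def residual_block(x, params):
    W1, W2 = params
    return np.maximum(0.0, x @ W1) @ W2             # small two-layer map f(x; theta)

def resnet_forward(x, step_size=1.0):
    """Depth plays the role of time; step_size = 1.0 is the unit forward Euler step."""
    for params in weights:
        x = x + step_size * residual_block(x, params)
    return x

x0 = rng.standard_normal((4, dim))                  # a batch of four feature vectors
print(resnet_forward(x0).shape)                     # (4, 16)
```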

