Generalized moment estimation for uncertain differential equations

2021 ◽ Vol 392 ◽ pp. 125724
Author(s): Z. Liu

2021 ◽ Vol 24 (5) ◽ pp. 1445-1476
Author(s): Alberto Lastra ◽ Sławomir Michalik ◽ Maria Suwińska

Abstract Generalized summability results are obtained for formal solutions of certain families of linear moment integro-differential equations with time-variable coefficients. The main result leans on knowledge of the behavior of the moment derivatives of the elements involved in the problem. A refinement of the main result is also provided, giving rise to more accurate statements that remain valid for wide families of problems of practical interest, such as fractional integro-differential equations.
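For context, the "moment derivatives" referred to here follow the standard definition from the moment-summability literature; the sketch below is background and is not quoted from the abstract itself. For a moment sequence m = (m(p))_{p≥0}, the moment derivative acts on formal power series as follows:

```latex
% Standard definition of the moment derivative \partial_{m,z} for a
% moment sequence m = (m(p))_{p \ge 0} (background, not quoted from
% the abstract):
\[
  \partial_{m,z}\Bigl(\sum_{p \ge 0} \frac{a_p}{m(p)}\, z^{p}\Bigr)
  := \sum_{p \ge 0} \frac{a_{p+1}}{m(p)}\, z^{p}.
\]
% Taking m(p) = p! recovers the usual derivative d/dz, while
% m(p) = Gamma(1 + alpha p) gives an operator closely related to the
% Caputo fractional derivative, which is how fractional
% integro-differential equations fall under the same framework.
```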


Author(s): Han Tang

The previous uncertain chemical reaction equation describes the time evolution of a single reaction. In many practical cases, however, a substance is consumed through several different reaction pathways. Motivated by this, this paper extends the discussion to multiple reactions. Specifically, taking the decomposition of C2H5OH as an example, parallel reactions with one reactant are analyzed with a multifactor uncertain differential equation. The derived equation is called the multifactor uncertain chemical reaction equation. The parameters in the multifactor uncertain chemical reaction equation are then estimated by generalized moment estimation. Based on the multifactor uncertain chemical reaction equation, the half-life of the reaction is investigated. Finally, a numerical example is presented to illustrate the usefulness of the multifactor uncertain chemical reaction equation.
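As a rough illustration of how generalized moment estimation proceeds (a sketch under assumptions, not the method as specified in either paper): for a hypothetical single-driver decay equation dX_t = -k X_t dt + sigma X_t dC_t driven by a Liu process C_t, the Euler-scheme residuals should behave like samples of a standard normal uncertain variable, so one can match their sample moments against the theoretical moments obtained from the inverse uncertainty distribution. The equation, toy data, and all names below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical sketch of generalized moment estimation for a decay-type
# uncertain differential equation  dX_t = -k X_t dt + sigma X_t dC_t,
# where C_t is a Liu process. Equation, data, and names are assumptions,
# not taken from the papers above.

def std_normal_uncertain_moment(j, n_grid=10_000):
    """j-th moment of the standard normal uncertain variable, computed as
    E[xi^j] = int_0^1 (Phi^{-1}(a))^j da, with inverse uncertainty
    distribution Phi^{-1}(a) = (sqrt(3)/pi) * ln(a / (1 - a))."""
    a = (np.arange(n_grid) + 0.5) / n_grid      # midpoint rule on (0, 1)
    inv = np.sqrt(3) / np.pi * np.log(a / (1 - a))
    return np.mean(inv ** j)

N_MOMENTS = 4
TARGET = [std_normal_uncertain_moment(j) for j in range(1, N_MOMENTS + 1)]

def residuals(theta, t, x):
    """Euler-scheme residuals; under the true parameters they should
    behave like samples of a standard normal uncertain variable."""
    k, sigma = theta
    dt = np.diff(t)
    return (np.diff(x) + k * x[:-1] * dt) / (sigma * x[:-1] * dt)

def objective(theta, t, x):
    """Sum of squared gaps between sample and theoretical moments."""
    h = residuals(theta, t, x)
    return sum((np.mean(h ** j) - TARGET[j - 1]) ** 2
               for j in range(1, N_MOMENTS + 1))

# Toy concentration record: exponential decay with a small perturbation.
t = np.linspace(0.0, 5.0, 51)
x = 2.0 * np.exp(-0.8 * t) * (1.0 + 0.01 * np.sin(7.0 * t))

fit = minimize(objective, x0=[0.5, 0.05], args=(t, x),
               method="L-BFGS-B", bounds=[(1e-6, None), (1e-6, None)])
k_hat, sigma_hat = fit.x
print(f"k = {k_hat:.3f}, sigma = {sigma_hat:.3f}, "
      f"half-life = {np.log(2) / k_hat:.3f}")
```

For a first-order decay of this form, the drift alone suggests the familiar half-life expression t_1/2 = ln 2 / k, which is what the final print statement reports; the multifactor case discussed in the abstract would add further Liu-process driver terms and parameters to the same residual-matching machinery.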


Author(s): Li He ◽ Qi Meng ◽ Wei Chen ◽ Zhi-Ming Ma ◽ Tie-Yan Liu

Asynchronous stochastic gradient descent (ASGD) is a popular parallel optimization algorithm in machine learning. Most theoretical analyses of ASGD take a discrete view and prove upper bounds for its convergence rate. However, the discrete view has intrinsic limitations: there is no characterization of the optimization path, and the proof techniques are induction-based and thus usually complicated. Inspired by the recent successful adoption of stochastic differential equations (SDE) in the theoretical analysis of SGD, in this paper we study the continuous approximation of ASGD using stochastic differential delay equations (SDDE). We introduce the approximation method and study the approximation error. We then conduct a theoretical analysis of the convergence rate of the ASGD algorithm based on the continuous approximation. Two methods can be used to analyze the convergence rate: moment estimation and energy function minimization. Moment estimation depends on the specific form of the loss function, whereas energy function minimization only leverages the convexity of the loss function and does not depend on its specific form. In addition to the convergence analysis, the continuous view also helps us derive better convergence rates. All of this clearly shows the advantage of taking the continuous view of gradient descent algorithms.
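The correspondence between ASGD and an SDDE can be made concrete with a short simulation. Below is a minimal, self-contained sketch (not the authors' code): an Euler-Maruyama discretization of dX(t) = -grad f(X(t - tau)) dt + sigma dW(t), where reading a delayed iterate plays the role of a stale gradient from an asynchronous worker. The quadratic loss, step size, delay, and noise scale are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sketch: Euler-Maruyama simulation of the stochastic
# differential delay equation  dX(t) = -grad_f(X(t - tau)) dt + sigma dW(t),
# the kind of continuous-time model used to approximate ASGD.

def grad_f(x):
    """Gradient of the toy quadratic loss f(x) = 0.5 * ||x||^2."""
    return x

dim, eta, sigma = 2, 0.01, 0.1      # dimension, step size, noise scale
delay_steps = 20                    # staleness tau expressed in steps
n_steps = 5000

x = np.zeros((n_steps + 1, dim))
x[0] = rng.normal(size=dim) * 3.0   # start away from the optimum

for n in range(n_steps):
    stale = max(n - delay_steps, 0)          # read a delayed iterate,
    drift = -grad_f(x[stale]) * eta          # as an async worker would
    noise = sigma * np.sqrt(eta) * rng.normal(size=dim)
    x[n + 1] = x[n] + drift + noise          # Euler-Maruyama step

print(f"||x_0|| = {np.linalg.norm(x[0]):.3f}, "
      f"||x_T|| = {np.linalg.norm(x[-1]):.3f}")
```

With delay_steps = 0 the loop reduces to the standard SDE approximation of SGD, which makes the role of the delay term in the continuous view explicit.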

