The maximum principle for a type of hereditary semilinear differential equation

Author(s):  
Feiyue He

An optimal control problem governed by a class of delay semilinear differential equations is studied. The existence of an optimal control is proved, and a maximum principle and approximating schemes are obtained. As applications, three examples are discussed.
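The abstract does not give the delay equation itself, so the following is only an illustrative sketch of how a hereditary (delay) differential equation can be integrated numerically: the classical method of steps with a forward Euler discretization, keeping past values in a buffer. The equation `x'(t) = -x(t-1)` and the constant history are assumptions chosen for the example.

```python
def solve_delay_ode(f, history, tau, t_end, dt=1e-3):
    """Integrate x'(t) = f(x(t), x(t - tau)) by forward Euler,
    keeping past values in a buffer (method of steps)."""
    lag = int(round(tau / dt))          # number of steps in one delay
    # seed the buffer with the history function on [-tau, 0]
    xs = [history(-tau + k * dt) for k in range(lag + 1)]
    n_steps = int(round(t_end / dt))
    for _ in range(n_steps):
        x_now = xs[-1]
        x_lag = xs[-1 - lag]            # value at t - tau
        xs.append(x_now + dt * f(x_now, x_lag))
    return xs[lag:]                     # values on [0, t_end]

# Example: x'(t) = -x(t - 1) with constant history x(t) = 1 on [-1, 0].
# On [0, 1] the exact solution is x(t) = 1 - t, and x(2) = -1/2.
traj = solve_delay_ode(lambda x, xlag: -xlag, lambda t: 1.0, tau=1.0, t_end=2.0)
```

The buffer index `xs[-1 - lag]` is what distinguishes the hereditary case from an ordinary ODE step: the right-hand side reads a state one delay in the past rather than the current one.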

Author(s):  
Shahla Rasulzade

A specific optimal control problem with distributed parameters of Moskalenko type with a multipoint quality functional is considered. To date, the theory of first-order necessary optimality conditions, such as the Pontryagin maximum principle and its consequences, is well developed for optimal control problems described by ordinary differential equations, i.e. for problems with lumped parameters. Many controlled processes, however, are described by partial differential equations (processes with distributed parameters). Such problems have features of their own, so deriving necessary optimality conditions for them involves non-trivial difficulties; in particular, fundamental difficulties arise when studying cases in which the established necessary conditions degenerate. In the present work, we study an optimal control problem described by a system of first-order partial differential equations with a controlled initial condition, under the assumption that the initial function is a solution of a Cauchy problem for ordinary differential equations. The objective functional (quality criterion) is multipoint, which makes it necessary to introduce an unconventional adjoint equation in integral, rather than differential (classical), form. Using a version of the increment method together with explicit linearization of the original system, we prove a necessary optimality condition in the form of an analogue of the Pontryagin maximum principle. The Pontryagin maximum principle is known to be the strongest necessary optimality condition for various optimal control problems; however, being a first-order condition, it often degenerates. Such cases are called singular, and the corresponding controls are called singular controls. Motivated by this, we study the case of degeneration of the Pontryagin maximum principle for the problem under consideration. To this end, a second-order increment formula for the quality functional is constructed; by introducing auxiliary matrix functions, we obtain a second-order increment formula of a constructive nature. A necessary optimality condition for singular controls, in the sense of the Pontryagin maximum principle, is proved. The necessary optimality conditions obtained are explicit.
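The degeneration described above can be seen on a much simpler model problem than the one in the paper. The following sketch uses an assumed toy problem (not the authors' system): minimize J(u) = ∫₀¹ x(t)² dt subject to x' = u, x(0) = 0. For the candidate control u ≡ 0 the adjoint vanishes identically, so the first-order (maximum principle) term of the cost increment is zero and the increment is governed by second-order terms, which is exactly why a second-order increment formula is needed.

```python
def trajectory(u, x0=0.0, dt=1e-3):
    # forward Euler for x'(t) = u(t)
    xs = [x0]
    for uk in u:
        xs.append(xs[-1] + dt * uk)
    return xs

def cost(u, dt=1e-3):
    # J(u) = integral of x(t)^2 dt (left rectangle rule)
    xs = trajectory(u, dt=dt)
    return sum(x * x for x in xs[:-1]) * dt

N = 1000
u0 = [0.0] * N                     # candidate control: identically zero
for eps in (1e-1, 1e-2, 1e-3):
    dJ = cost([eps] * N) - cost(u0)
    # dJ shrinks like eps**2 (here dJ ~ eps**2 / 3): the first-order
    # term of the increment vanishes, i.e. the maximum principle
    # degenerates and u0 is a singular control
    print(eps, dJ)
```

The printed increments fall by a factor of about 100 when eps falls by 10, the numerical signature of a vanishing first variation.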


2019 ◽  
Vol 25 (1) ◽  
pp. 1 ◽  
Author(s):  
Carlos Campos ◽  
Cristiana J. Silva ◽  
Delfim F. M. Torres

We provide easy and readable GNU Octave/MATLAB code for the simulation of mathematical models described by ordinary differential equations and for the solution of optimal control problems through Pontryagin’s maximum principle. For that, we consider a normalized HIV/AIDS transmission dynamics model based on the one proposed in our recent contribution (Silva, C.J.; Torres, D.F.M. A SICA compartmental model in epidemiology with application to HIV/AIDS in Cape Verde. Ecol. Complex. 2017, 30, 70–75), given by a system of four ordinary differential equations. An HIV initial value problem is solved numerically using the ode45 GNU Octave function and three standard methods implemented by us in Octave/MATLAB: the Euler method and the second- and fourth-order Runge–Kutta methods. Afterwards, a control function is introduced into the normalized HIV model and an optimal control problem is formulated, where the goal is to find the optimal HIV prevention strategy that maximizes the fraction of uninfected HIV individuals with the least HIV new infections and cost associated with the control measures. The optimal control problem is characterized analytically using the Pontryagin Maximum Principle, and the extremals are computed numerically by implementing a forward-backward fourth-order Runge–Kutta method. Complete algorithms, for both the uncontrolled initial value problem and the optimal control problem, developed under the free GNU Octave software and compatible with MATLAB, are provided along with the article.


2020 ◽  
Vol 28 (1) ◽  
pp. 1-18
Author(s):  
Dahbia Hafayed ◽  
Adel Chala

In this paper, we are concerned with an optimal control problem where the system is driven by a backward doubly stochastic differential equation with a risk-sensitive performance functional. We generalize the result of Chala [A. Chala, Pontryagin’s risk-sensitive stochastic maximum principle for backward stochastic differential equations with application, Bull. Braz. Math. Soc. (N. S.) 48 2017, 3, 399–411] to a backward doubly stochastic differential equation, following the approach of Djehiche, Tembine and Tempone in [B. Djehiche, H. Tembine and R. Tempone, A stochastic maximum principle for risk-sensitive mean-field type control, IEEE Trans. Automat. Control 60 2015, 10, 2640–2649]. As a preliminary step, we use the risk-neutral model, for which an optimal solution exists. This extends the initial control system for this type of problem, in which the set of admissible controls is convex. We establish necessary as well as sufficient optimality conditions for the risk-sensitive performance functional control problem. We illustrate the paper with two different examples: a linear-quadratic system, and a numerical application as a second example.
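A risk-sensitive performance functional is typically of the form J_θ(u) = (1/θ) log E[exp(θ Φ)], which penalizes the variance of the cost Φ for θ > 0. The sketch below is not the paper's backward doubly stochastic setup, only a Monte Carlo illustration of this criterion on an assumed Gaussian cost, where the exact value μ + θσ²/2 is known in closed form.

```python
import math, random

def risk_sensitive_cost(sample_cost, theta, n=100_000, seed=0):
    """Monte Carlo estimate of (1/theta) * log E[exp(theta * Phi)],
    the risk-sensitive criterion (theta > 0 penalises cost variance)."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n):
        acc += math.exp(theta * sample_cost(rng))
    return math.log(acc / n) / theta

# Toy cost Phi ~ N(mu, sigma^2): the exact value is mu + theta*sigma**2/2,
# i.e. the mean plus a variance penalty scaled by the risk parameter.
mu, sigma, theta = 1.0, 0.5, 0.8
est = risk_sensitive_cost(lambda rng: rng.gauss(mu, sigma), theta)
```

Expanding the logarithm for small θ recovers E[Φ] + (θ/2) Var(Φ) + O(θ²), which is why this functional is called risk-sensitive: the controller trades expected cost against its dispersion.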


Author(s):  
Tatyana Komleva ◽  
Liliya Plotnikova ◽  
Natalia Skripnik ◽  
Andrej Plotnikov

The article presents some definitions of derivatives for set-valued mappings and their properties. A linear set-valued differential equation is considered and conditions for the existence of basic solutions are given. Subsequently, an optimal control problem is considered in which the system behavior is described by linear set-valued differential equations.


2008 ◽  
Vol 08 (01) ◽  
pp. 23-33 ◽  
Author(s):  
Laurent Mazliak ◽  
Ivan Nourdin

In this note, we consider an optimal control problem associated to a differential equation driven by a Hölder continuous function g of index β > 1/2. We split our study into two cases. If the coefficient of dgt does not depend on the control process, we prove an existence theorem for a slightly generalized control problem; that is, we obtain a literal extension of the corresponding situation for ordinary differential equations. If the coefficient of dgt depends on the control process, we also prove an existence theorem, but here we are obliged to restrict the set of controls to sufficiently regular functions.

