Feedback Stabilization of Quasi-Integrable Hamiltonian Systems
A procedure for designing a feedback control to asymptotically stabilize, with probability one, quasi-integrable Hamiltonian systems is proposed. First, a set of averaged Itô stochastic differential equations for the controlled first integrals is derived from the given equations of motion of the system by using the stochastic averaging method for quasi-integrable Hamiltonian systems. Second, a dynamical programming equation for an infinite-horizon performance index with an unknown cost function is established based on the stochastic dynamical programming principle. Third, the asymptotic stability with probability one of the optimally controlled system is analyzed by evaluating the largest Lyapunov exponent of the fully averaged Itô equations for the first integrals. Finally, the cost function and feedback control law are determined by the requirement of stabilizing the system. An example is worked out in detail to illustrate the application of the proposed procedure and the effect of optimal control on the stability of the system.
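The stability criterion in the third step rests on evaluating the largest Lyapunov exponent of the averaged Itô equations for the first integrals. As a minimal numerical sketch of how such an exponent can be estimated, the code below simulates a single hypothetical averaged Itô equation dH = m(H) dt + σ(H) dB(t) by the Euler–Maruyama scheme and approximates the exponent as λ ≈ (1/2T) ln[H(T)/H(0)] for a small initial value; the linear coefficients m(H) = aH and σ(H) = bH are illustrative assumptions, not the averaged equations derived in the paper.

```python
import numpy as np

def largest_lyapunov_exponent(m, sigma, H0=1e-3, dt=1e-4, T=200.0, seed=0):
    """Estimate the largest Lyapunov exponent of a scalar averaged
    Ito equation  dH = m(H) dt + sigma(H) dB(t)  via Euler-Maruyama,
    using the approximation  lambda ~ (1/(2T)) * ln(H(T)/H(0)).
    """
    rng = np.random.default_rng(seed)
    n_steps = int(T / dt)
    H = H0
    sqrt_dt = np.sqrt(dt)
    for _ in range(n_steps):
        H += m(H) * dt + sigma(H) * sqrt_dt * rng.standard_normal()
        H = max(H, 1e-300)  # the Hamiltonian (energy) stays nonnegative
    return np.log(H / H0) / (2.0 * T)

# Hypothetical linear coefficients: m(H) = a*H, sigma(H) = b*H.
# For this model ln H grows like (a - b**2/2) t, so the exact
# exponent under the definition above is (a - b**2/2)/2 = -0.29.
a, b = -0.5, 0.4
lam = largest_lyapunov_exponent(lambda H: a * H, lambda H: b * H)
print(lam)  # negative: the trivial solution H = 0 is stable w.p. 1
```

A negative estimate indicates asymptotic stability with probability one of the trivial solution, which is the condition the proposed procedure imposes when determining the cost function and feedback control law.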