Chapter 8 focuses on nonlinear optimal control and its applications. The chapter begins by introducing the fundamentals of optimal control and prototypical problem formulations. This is followed by a treatment of first-order necessary conditions, including the Pontryagin minimum principle, dynamic programming, and the Hamilton–Jacobi–Bellman equation. Singular arcs and bang–bang controls arise in the solution of many minimum-time and minimum-fuel problems, so these issues are discussed with the help of examples worked out in detail. The chapter then turns to direct and indirect numerical methods suitable for solving large-scale optimal control problems. The chapter concludes with an example on calculating an optimal track curvature estimate from global positioning system (GPS) data.
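For orientation, the prototypical problem and the two first-order characterizations mentioned above can be stated in standard form (notation is generic, not necessarily that used in the chapter): minimize a Bolza-type cost over admissible controls subject to the system dynamics,

```latex
\min_{u(\cdot)} \; J = \varphi\bigl(x(t_f)\bigr) + \int_{t_0}^{t_f} L\bigl(x(t),u(t),t\bigr)\,\mathrm{d}t
\quad \text{s.t.} \quad \dot{x} = f(x,u,t), \qquad x(t_0) = x_0 .
```

With Hamiltonian $H(x,u,\lambda,t) = L(x,u,t) + \lambda^{\top} f(x,u,t)$, the Pontryagin minimum principle gives the first-order necessary conditions

```latex
\dot{x} = \frac{\partial H}{\partial \lambda}, \qquad
\dot{\lambda} = -\frac{\partial H}{\partial x}, \qquad
u^{*}(t) = \arg\min_{u \in \mathcal{U}} H\bigl(x^{*}(t),u,\lambda^{*}(t),t\bigr),
```

while dynamic programming characterizes the optimal cost-to-go $V(x,t)$ through the Hamilton–Jacobi–Bellman partial differential equation

```latex
-\frac{\partial V}{\partial t} = \min_{u \in \mathcal{U}} \left[ L(x,u,t) + \left(\frac{\partial V}{\partial x}\right)^{\!\top} f(x,u,t) \right],
\qquad V\bigl(x,t_f\bigr) = \varphi(x).
```

Bang–bang and singular arcs arise from the minimization over $u$ when $H$ is affine in the control: the minimizer sits on a constraint boundary except where the switching function vanishes over an interval.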