Mathematical Methods in Optimization of Differential Systems
[Book]
by Viorel Barbu.
Dordrecht: Springer Netherlands, 1994.
x, 262 pages.
Mathematics and Its Applications, 310.
I: Generalized Gradients and Optimality -- 1. Fundamentals of Convex Analysis -- 2. Generalized Gradients -- 3. The Ekeland Variational Principle -- References.
II: Optimal Control of Ordinary Differential Systems -- 1. Formulation of the Problem and Existence -- 2. The Maximum Principle -- 3. Applications of the Maximum Principle -- References.
III: The Dynamic Programming Method -- 1. The Dynamic Programming Equation -- 2. Variational and Viscosity Solutions to the Equation of Dynamic Programming -- 3. Constructive Approaches to Synthesis Problem -- References.
IV: Optimal Control of Parameter Distributed Systems -- 1. General Description of Parameter Distributed Systems -- 2. Optimal Convex Control Problems -- 3. The H∞-Control Problem -- 4. Optimal Control of Nonlinear Parameter Distributed Systems -- References.
This volume is concerned with optimal control problems governed by ordinary differential systems and by partial differential equations. The emphasis is on first-order necessary conditions of optimality and on the construction of optimal controllers in feedback form. These subjects are treated using concepts and techniques of modern optimization theory, such as Clarke's generalized gradient, Ekeland's variational principle, viscosity solutions to the Hamilton-Jacobi equation, and smoothing processes for optimal control problems governed by variational inequalities. A substantial part of the book is devoted to applications and examples. A background in advanced calculus will enable readers to understand most of the book, including the statement of the Pontryagin maximum principle and many of the applications. The work will be of interest to graduate students in mathematics and engineering and to researchers in applied mathematics, control theory, and systems theory.
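As a minimal orientation for readers unfamiliar with the subject, the type of problem the blurb refers to can be sketched as follows; the notation (x, u, L, φ, f, p, V) is chosen here purely for illustration and is not taken from the book's text.

% Generic finite-horizon optimal control problem (illustrative notation only):
\begin{equation*}
  \min_{u(\cdot)} \; \varphi(x(T)) + \int_0^T L(t, x(t), u(t))\, dt
  \quad \text{subject to} \quad x'(t) = f(t, x(t), u(t)), \qquad x(0) = x_0 .
\end{equation*}
% The Pontryagin maximum principle (first-order necessary conditions) introduces
% an adjoint state p and the Hamiltonian
\begin{equation*}
  H(t, x, u, p) = \langle p, f(t, x, u) \rangle - L(t, x, u),
\end{equation*}
% and an optimal pair (x^*, u^*) satisfies, for a suitable adjoint p,
\begin{equation*}
  p'(t) = -\partial_x H(t, x^*(t), u^*(t), p(t)), \qquad
  H(t, x^*(t), u^*(t), p(t)) = \max_{u} H(t, x^*(t), u, p(t)).
\end{equation*}
% The dynamic programming approach instead characterizes the value function V
% as a (viscosity) solution of the Hamilton-Jacobi equation
\begin{equation*}
  \partial_t V(t, x) + \inf_{u} \big\{ \langle \nabla_x V(t,x), f(t,x,u) \rangle + L(t,x,u) \big\} = 0,
  \qquad V(T, x) = \varphi(x).
\end{equation*}

In this sketch, the maximum principle yields open-loop necessary conditions along a fixed trajectory, while the dynamic programming equation is the route to controllers in feedback form, the two themes emphasized in the description above.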