The theory of adaptive control is concerned with constructing strategies under which the controlled system behaves in a desirable way, without assuming complete knowledge of the system. The models considered in this comprehensive book are of Markovian type. Both the partial-observation and the partial-information cases are analyzed. While the book focuses on discrete-time models, continuous-time models are treated in the final chapters. The book provides a novel perspective by summarizing results on adaptive control obtained in the Soviet Union, which are not well known in the West. Comments on the interplay between Russian and Western methods are also included.
Contents:
- Basic Notions and Definitions
- Real-Valued HPIV with Finite Number of Controls: Automaton Approach
- Stochastic Approximation
- Minimax Adaptive Control
- Controlled Finite Homogeneous Markov Chains
- Control of Partially Observable Markov Chains and Regenerative Processes
- Control of Markov Processes with Discrete Time and Semi-Markov Processes
- Control of Stationary Processes
- Finite-Converging Procedures for Control Problems with Inequalities
- Control of Linear Difference Equations
- Control of Ordinary Differential Equations
- Control of Stochastic Differential Equations
Readership: Graduate students, researchers, and academics in mathematical control theory.
“This book is addressed both to students with a good mathematical background and to researchers and specialists in adaptive control who may find the book inspirational.”
Mathematical Reviews