IEEE Transactions on Automatic Control, Vol. 47, No. 11, pp. 1909–1914, 2002
Optimal control of discrete-time uncertain systems with imperfect measurement
We consider a discrete-time optimal control problem with uncertainties in both the dynamics and the measurement. An optimal (in the min-max sense) output-feedback controller is derived from a discrete Bellman equation. We propose a formulation of this equation in the estimation space, which admits approximation by a reduced-order equation corresponding to a prescribed collection of estimation sets. The approach is illustrated, including numerical experiments, by a pursuit-evasion game in the plane with incomplete information about the current state of the evader.
Keywords: AMS classification 90D50, 93B52, 93C55; Bellman–Isaacs equation; discrete-time differential game; dynamic programming; imperfect measurement; optimal control; uncertain systems
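The min-max dynamic programming idea underlying the abstract can be illustrated on a toy model. The sketch below is a hypothetical, drastically simplified instance (a 1-D pursuit-evasion game on a clipped integer grid with full state observation, not the paper's imperfect-measurement formulation): the value function is computed by backward induction, with the pursuer minimizing and the evader maximizing the terminal distance at each Bellman backup.

```python
from itertools import product

def minmax_value(horizon, grid=range(-5, 6), moves=(-1, 0, 1)):
    """Backward min-max (upper-value) Bellman recursion for a toy 1-D
    pursuit-evasion game: the pursuer at xp minimizes and the evader at
    xe maximizes the terminal distance |xp - xe|.  Both players move by
    one step per stage; positions are clipped to the grid.  This is an
    illustrative assumption, not the formulation used in the paper."""
    lo, hi = min(grid), max(grid)

    def clip(x):
        return max(lo, min(hi, x))

    # Terminal cost: distance between pursuer and evader.
    V = {(xp, xe): abs(xp - xe) for xp, xe in product(grid, grid)}

    # One Bellman backup per stage: pursuer commits a move u, then the
    # evader answers with the worst-case move v (upper value).
    for _ in range(horizon):
        V = {(xp, xe): min(max(V[clip(xp + u), clip(xe + v)]
                               for v in moves)
                           for u in moves)
             for xp, xe in product(grid, grid)}
    return V
```

For example, on the grid {-1, 0, 1} with one stage to go, starting from coincident positions (0, 0) the evader can always restore a distance of 1 after the pursuer commits, so the value there is 1. A reduced-order approximation in the spirit of the paper would replace the full state grid by a prescribed collection of estimation sets.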