SIAM Journal on Control and Optimization, Vol. 34, No. 4, pp. 1342-1364, 1996
Partially Observed Differential Games, Infinite-Dimensional Hamilton-Jacobi-Isaacs Equations, and Nonlinear H-Infinity Control
This paper presents new results for partially observed nonlinear differential games. Using the concept of an information state, we solve this problem in terms of an infinite-dimensional partial differential equation, which turns out to be the Hamilton-Jacobi-Isaacs (HJI) equation for partially observed differential games. We give definitions of smooth and viscosity solutions and prove that the value function is a viscosity solution of the HJI equation. We prove a verification theorem, which implies that the optimal controls are separated in that they depend on the observations only through the information state. This constitutes a separation principle for partially observed differential games. We also present some new results concerning the certainty equivalence principle under certain standard assumptions. Our results are applied to a nonlinear output feedback H-infinity robust control problem.
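As a rough guide to the setting, a typical output feedback nonlinear H-infinity problem of the kind addressed here can be sketched as follows; the symbols $f$, $g$, $h$, $\ell$, $\gamma$, $p_t$, $W$, and $F$ are illustrative and need not match the paper's notation.
\begin{align*}
  \dot{x}_t &= f(x_t, u_t) + g(x_t)\, w_t, &&\text{(state dynamics, disturbance $w$)}\\
  y_t &= h(x_t) + v_t, &&\text{(measured output, measurement disturbance $v$)}\\
  z_t &= \ell(x_t, u_t), &&\text{(performance output)}
\end{align*}
with the controller allowed to depend only on the past observations $y_s$, $s \le t$, and required to achieve a finite $L^2$-gain bound
\[
  \int_0^T |z_t|^2 \, dt \;\le\; \gamma^2 \int_0^T \big(|w_t|^2 + |v_t|^2\big)\, dt + \beta(x_0)
  \qquad \text{for all } T \ge 0 \text{ and all admissible disturbances.}
\]
In the information state approach, the observation record is summarized by a function $p_t(\cdot)$ on the state space evolving according to a controlled first-order PDE, $\partial_t p_t = F(p_t, u_t, y_t)$, and the value function $W$ is defined on information states. Schematically, the infinite-dimensional HJI equation then takes the form
\[
  \inf_{u}\, \sup_{y}\; \nabla W(p)\big[F(p, u, y)\big] = 0,
\]
so that an optimal control $u_t = u^*(p_t)$ is separated: it depends on the observations only through $p_t$.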