IEEE Transactions on Automatic Control, vol. 62, no. 9, pp. 4467–4482, 2017
Modeling and Control of Stochastic Systems With Poorly Known Dynamics
This paper is concerned with the control of poorly known systems, for which only a simplified and rough model is available for control design. Many systems cannot reasonably be probed for the sake of identification, yet they are important in areas such as economics, population dynamics, and medicine. The ideas are developed around an alternative way to account for the coarse modeling in a stochastic setting, and to enhance the control features of the resulting modified model. The mathematical framework for the optimal control reveals important features, such as the emergence of a precautionary feedback policy of "keep the action unchanged" (inaction, for short) on a certain region of the state space. This feature is not seen in the robust-control approach, but it has been pointed out in, and permeates part of, the economics literature. The control problem relies on the viscosity solution of the Hamilton-Jacobi-Bellman equation, and the value function of the problem is shown to be convex. When specialized to the quadratic problem with discounted cost, the exact solution inside the inaction region is given by a Lyapunov-type equation and, asymptotically for large state values, by a Riccati-like equation. This scenario bridges to the stochastic stability analysis of the controlled model. The single-input control is developed in full, part analytically, part numerically, for the scalar case, and an approximation is tested in the multidimensional case. The advantage of the precautionary policy is substantial in some situations.
Keywords: Diffusion processes; modeling; stochastic optimal control; stochastic systems; uncertain systems
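For orientation, the sketch below gives a generic form of the discounted-cost Hamilton-Jacobi-Bellman equation referred to in the abstract; the symbols ρ (discount rate), ℓ (running cost), b (drift), σ (diffusion), and U (action set) are standard textbook placeholders, assumed here rather than taken from the paper's own notation. Loosely, the inaction region is the set of states on which the minimizing action coincides with the one currently applied.

% Generic discounted HJB equation for a controlled diffusion
% dX_t = b(X_t, u_t) dt + sigma(X_t, u_t) dW_t  (placeholder dynamics, an assumption)
\[
  \rho\, V(x) \;=\; \min_{u \in U} \Big\{ \ \ell(x,u)
    \;+\; b(x,u)^{\top} \nabla V(x)
    \;+\; \tfrac{1}{2}\, \operatorname{tr}\!\big( \sigma(x,u)\, \sigma(x,u)^{\top}\, \nabla^{2} V(x) \big) \ \Big\}.
\]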