SIAM Journal on Control and Optimization, Vol. 40, No. 6, pp. 1724-1745, 2002
Generalized solutions in nonlinear stochastic control problems
An optimal stochastic control problem is considered for systems with unbounded controls satisfying an integral constraint. It is shown that an optimal control exists within the class of generalized controls, which give rise to impulsive actions. By applying a time-transformation approach developed recently for deterministic systems, the original control problem is shown to be equivalent to an optimal stopping problem. Moreover, generalized solutions are described in terms of stochastic differential equations governed by a measure.
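To fix ideas, the following is a rough sketch, in generic notation, of the type of problem the abstract refers to; the symbols f, f_0, g, B, K, w, and the measure \mu are illustrative placeholders and do not reproduce the paper's own model or notation. A controlled diffusion with an unbounded control u subject to an integral constraint may be written as

\[
dx_t = f(x_t, u_t)\,dt + g(x_t)\,dw_t, \qquad \int_0^T \lvert u_t \rvert\,dt \le K,
\]

and generalized (impulsive) controls can then be represented by a measure \mu, leading to a measure-driven equation of the form

\[
dx_t = f_0(x_t)\,dt + g(x_t)\,dw_t + B(x_{t-})\,\mu(dt).
\]

Roughly speaking, the time-transformation idea introduces a new time scale s with ds = (1 + \lvert u_t \rvert)\,dt, under which the control becomes bounded and impulsive actions correspond to intervals in which the original time stands still.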