Applied Mathematics and Optimization, Vol. 40, No. 1, pp. 79–103, 1999
Maximum principle for implicit control systems
In this paper we derive optimality conditions in the form of a maximum principle for an optimal control problem governed by nonlinear implicit, or descriptor, control systems. The regularity conditions imposed on the system allow us to reduce the optimal control problem to an equivalent nonsmooth variational one. Nonsmooth analysis techniques, together with sliding variations of relaxed controls from optimal control theory, are used to obtain the necessary conditions.
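For orientation, an implicit (descriptor) optimal control problem of the kind referred to here is commonly written in the following generic form; the notation ($F$, $f_0$, $U$, $[t_0,t_1]$, $x_0$) is illustrative and should not be read as the paper's exact formulation:
\[
\min_{u}\; J(u) = \int_{t_0}^{t_1} f_0\bigl(t, x(t), u(t)\bigr)\,dt
\quad \text{subject to} \quad
F\bigl(t, x(t), \dot{x}(t), u(t)\bigr) = 0 \ \text{ a.e. } t \in [t_0, t_1],
\qquad u(t) \in U, \qquad x(t_0) = x_0 .
\]
When $F$ cannot be solved explicitly for $\dot{x}$ (for instance, when the system has the semi-explicit form $E\dot{x} = f(t,x,u)$ with a singular matrix $E$), the classical Pontryagin framework does not apply directly, which motivates the regularity assumptions and the reduction to a nonsmooth variational problem described in the abstract.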