IEEE Transactions on Automatic Control, Vol.40, No.12, 2076-2088, 1995
Discrete-Time LQ-Optimal Control Problems for Infinite Markov Jump Parameter Systems
Optimal control problems for discrete-time linear systems subject to Markovian jumps in the parameters are considered for the case in which the Markov chain takes values in a countably infinite set. Two situations are considered: the noiseless case and the case in which an additive noise is appended to the model. The cost functionals to be minimized are the usual infinite time horizon quadratic cost and the long-run average cost, respectively. The solution for these problems relies, in part, on the study of a countably infinite set of coupled algebraic Riccati equations (ICARE). Conditions for existence and uniqueness of a positive semidefinite solution to the ICARE are obtained via the extended concepts of stochastic stabilizability (SS) and stochastic detectability (SD), which turn out to be equivalent to the spectral radius of certain infinite dimensional linear operators in a Banach space being less than one. For the long-run average cost, SS and SD guarantee existence and uniqueness of a stationary measure and consequently existence of an optimal stationary control policy. Furthermore, an extension of a Lyapunov equation result is derived for the countably infinite Markov state-space case. Finally, a discussion of the case in which we have only partial observation of the Markovian parameter is carried out in the Appendix.
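To make the coupled-Riccati structure concrete, the following is a minimal illustrative sketch, not the paper's method: it truncates the countably infinite mode set to finitely many modes and runs a plain fixed-point iteration on the standard coupled algebraic Riccati equations for Markov jump linear systems, X_i = Q_i + A_i' E_i A_i - A_i' E_i B_i (R_i + B_i' E_i B_i)^{-1} B_i' E_i A_i with E_i = sum_j p_ij X_j. The function name, the toy two-mode data, and the iteration scheme are all assumptions for illustration; the paper's existence and uniqueness results concern the full infinite system under SS and SD.

```python
import numpy as np

def solve_icare_truncated(A, B, Q, R, P, iters=2000, tol=1e-12):
    """Fixed-point iteration for the coupled algebraic Riccati equations
    of a Markov jump linear system, truncated to finitely many modes.

    A, B, Q, R are lists of per-mode matrices; P is the mode transition
    matrix (P[i][j] = probability of jumping from mode i to mode j).
    This is an illustrative sketch, not the algorithm from the paper.
    """
    N = len(A)
    X = [Qi.copy() for Qi in Q]  # start from the state-weight matrices
    for _ in range(iters):
        # Coupling term: E_i = sum_j p_ij X_j mixes the mode solutions.
        E = [sum(P[i][j] * X[j] for j in range(N)) for i in range(N)]
        Xn = []
        for i in range(N):
            S = R[i] + B[i].T @ E[i] @ B[i]
            # Mode-dependent feedback gain K_i = S^{-1} B_i' E_i A_i.
            K = np.linalg.solve(S, B[i].T @ E[i] @ A[i])
            Xn.append(Q[i] + A[i].T @ E[i] @ A[i]
                      - A[i].T @ E[i] @ B[i] @ K)
        if max(np.abs(Xn[i] - X[i]).max() for i in range(N)) < tol:
            return Xn
        X = Xn
    return X

# Toy two-mode scalar example (hypothetical data): one mode is stable,
# the other unstable, but the pair is stochastically stabilizable.
A = [np.array([[0.9]]), np.array([[1.1]])]
B = [np.array([[1.0]]), np.array([[1.0]])]
Q = [np.eye(1), np.eye(1)]
R = [np.eye(1), np.eye(1)]
P = [[0.7, 0.3], [0.4, 0.6]]

X = solve_icare_truncated(A, B, Q, R, P)
```

Under SS and SD the iteration converges monotonically from X = Q for a finite truncation; the delicate part of the paper is precisely what survives of this picture when the mode set is countably infinite.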