IEEE Transactions on Automatic Control, Vol. 61, No. 6, pp. 1619-1624, 2016
Almost Sure Exponential Stabilization by Discrete-Time Stochastic Feedback Control
Given an unstable linear scalar differential equation $\dot{x}(t) = \alpha x(t)$ ($\alpha > 0$), we will show that the discrete-time stochastic feedback control $\sigma x([t/\tau]\tau)\,dB(t)$ can stabilize it. That is, we will show that the stochastically controlled system $dx(t) = \alpha x(t)\,dt + \sigma x([t/\tau]\tau)\,dB(t)$ is almost surely exponentially stable when $\sigma^2 > 2\alpha$ and $\tau > 0$ is sufficiently small, where $B(t)$ is a Brownian motion and $[t/\tau]$ is the integer part of $t/\tau$. We will also discuss the nonlinear stabilization problem by a discrete-time stochastic feedback control. The reason we consider discrete-time stochastic feedback control is that the state of the given system is in fact observed only at discrete times, say $0, \tau, 2\tau, \ldots$, where $\tau > 0$ is the duration between two consecutive observations. Accordingly, the stochastic feedback control should be designed based on these discrete-time observations; that is, it should be of the form $\sigma x([t/\tau]\tau)\,dB(t)$. From the point of view of control cost, it is cheaper to observe the state less frequently. It is therefore useful to obtain a lower bound on $\tau$ that is as large as possible.
Keywords: Almost sure exponential stability; Brownian motion; difference equations; discrete-time feedback control; stochastic differential delay equations; stochastic stabilization
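To illustrate the claim numerically, the following is a minimal Euler-Maruyama simulation sketch of the controlled system $dx(t) = \alpha x(t)\,dt + \sigma x([t/\tau]\tau)\,dB(t)$, in which the diffusion term is frozen at the most recent discrete-time observation. The parameter values, step sizes, and the Lyapunov-exponent estimate are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical simulation sketch (not from the paper): Euler-Maruyama
# discretisation of  dx(t) = alpha*x(t) dt + sigma*x([t/tau]*tau) dB(t).
# The diffusion term uses the most recent discrete-time observation of the state.
rng = np.random.default_rng(0)

alpha, sigma = 1.0, 2.0   # sigma**2 = 4 > 2*alpha = 2, as the stability condition requires
tau = 0.01                # observation interval, assumed "sufficiently small"
substeps = 10             # simulation substeps per observation interval
dt = tau / substeps
T = 50.0                  # time horizon

x = 1.0                   # initial state x(0)
x_obs = x                 # latest observation x([t/tau]*tau)

for k in range(int(T / dt)):
    if k % substeps == 0:  # refresh the observation only at multiples of tau
        x_obs = x
    dB = rng.normal(0.0, np.sqrt(dt))
    x += alpha * x * dt + sigma * x_obs * dB

# Sample Lyapunov exponent log|x(T)|/T; a negative value (roughly
# alpha - sigma**2/2 = -1 for the continuous-time feedback) indicates
# almost sure exponential decay along this sample path.
print("estimated Lyapunov exponent:", np.log(abs(x)) / T)
```

With these assumed parameters the printed exponent is negative, consistent with the condition $\sigma^2 > 2\alpha$ and $\tau$ small; increasing $\tau$ too far would be expected to destroy the stabilizing effect.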