Applied Mathematics and Optimization, Vol. 43, No. 2, 169-186, 2001
An auxiliary equation for the Bellman equation in a one-dimensional ergodic control
In this paper we consider the Bellman equation in one-dimensional ergodic control. Our aim is to show the existence and uniqueness of its solution under general assumptions. For this purpose we introduce an auxiliary equation whose solution gives the invariant measure of the diffusion corresponding to an optimal control. Using this solution, we construct a solution to the Bellman equation. Our method of using this auxiliary equation has two advantages in the one-dimensional case. First, we can solve the Bellman equation under general assumptions. Second, the auxiliary equation gives an optimal Markov control explicitly in many examples.
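For orientation, a minimal sketch of the standard form of the Bellman equation in one-dimensional ergodic control is written below; the drift b, diffusion coefficient sigma, running cost f, control set U, ergodic cost lambda, and relative value function v are generic notation assumed here, not taken from the paper, whose precise formulation and hypotheses may differ.

\[
  \lambda \;=\; \min_{u \in U}\Bigl[\tfrac{1}{2}\,\sigma^{2}(x)\,v''(x) \;+\; b(x,u)\,v'(x) \;+\; f(x,u)\Bigr],
  \qquad x \in \mathbb{R},
\]

In this standard setting the unknowns are the constant \(\lambda\) (the optimal ergodic cost) and the function \(v\) (determined up to an additive constant); when the minimum is attained, a feedback \(u^{*}(x)\) minimizing the bracket at each \(x\) defines a Markov control, and the invariant measure of the resulting diffusion is the object addressed by the auxiliary equation of the paper.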