Optimal ergodic control of nonlinear stochastic systems
Abstract
We study a class of ergodic stochastic control problems for diffusion processes. We describe the basic ideas concerning the associated Hamilton-Jacobi-Bellman equation. For a given class of control problems, we establish existence and uniqueness of the invariant measure. We then present a numerical approximation of the optimal feedback control based on a discretization of the infinitesimal generator by finite difference schemes. Finally, we apply these techniques to the control of semi-active suspensions for road vehicles.
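For orientation, a standard form of the ergodic Hamilton-Jacobi-Bellman equation referred to above is sketched below, assuming a controlled diffusion $dX_t = b(X_t,u_t)\,dt + \sigma(X_t,u_t)\,dW_t$ with control set $U$ and running cost $f$; the notation $b$, $\sigma$, $f$, $\lambda$, $V$ is generic and the paper's precise assumptions may differ:
\[
\inf_{u \in U}\Big\{ \tfrac{1}{2}\,\mathrm{tr}\!\big(\sigma(x,u)\sigma(x,u)^{\top} D^{2}V(x)\big) + b(x,u)\cdot \nabla V(x) + f(x,u) \Big\} = \lambda , \qquad x \in \mathbb{R}^{n},
\]
where $\lambda$ is the optimal long-run average cost and $V$ is the relative value function, determined up to an additive constant. A candidate optimal feedback is then any measurable selection $u^{*}(x)$ attaining the infimum in the braces.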