Finding control policy for one discrete-time Markov chain on [0, 1] with a given invariant measure

A discrete-time Markov chain on the interval [0, 1] with two possible transitions (left or right) at each step is considered. The probability of a transition towards 0 (and towards 1) is a function of the current value of the chain. Having chosen a direction, the chain moves to a randomly chosen point in the corresponding subinterval. The authors assume that the transition probabilities depend on the current value of the chain only through a finite number of real-valued parameters. Under this assumption, they seek the transition probabilities that minimize the L2 distance between the stationary density of the Markov chain and a given invariant measure on [0, 1]. Since there is no reward function in this problem, it does not fit into the Markov decision process (MDP) framework. The authors follow a sensitivity-based approach and propose a gradient- and simulation-based method for estimating the parameters of the transition probabilities. Numerical results are presented that show the performance of the method for various transition probabilities and invariant measures on [0, 1]. © 2018 Federal Research Center Computer Science and Control of Russian Academy of Sciences.
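The abstract does not fix a specific parametrization or optimization routine, so the sketch below only illustrates the setup it describes, under several stated assumptions: the probability of moving towards 0 is modelled as a logistic function of a polynomial in the current state with coefficient vector theta (a hypothetical parametric family, not the authors' one), the jump within the chosen subinterval is uniform, the L2 distance is estimated from a histogram of a simulated trajectory, and the parameters are updated by crude finite-difference gradient steps rather than the paper's sensitivity-based derivative estimates. The names left_prob, simulate, l2_distance, and target are illustrative.

```python
import numpy as np
from scipy.stats import beta

def left_prob(x, theta):
    # Hypothetical parametric family (an assumption): logistic of a
    # polynomial in the current state x with coefficient vector theta.
    z = sum(t * x**k for k, t in enumerate(theta))
    return 1.0 / (1.0 + np.exp(-z))

def simulate(theta, n_steps=50_000, x0=0.5, seed=0):
    # One trajectory of the chain: with probability left_prob(x, theta)
    # jump uniformly into [0, x), otherwise uniformly into (x, 1]
    # (uniform jumps within the chosen subinterval are an assumption).
    rng = np.random.default_rng(seed)
    x, traj = x0, np.empty(n_steps)
    for i in range(n_steps):
        if rng.random() < left_prob(x, theta):
            x = rng.uniform(0.0, x)   # move towards 0
        else:
            x = rng.uniform(x, 1.0)   # move towards 1
        traj[i] = x
    return traj

def l2_distance(traj, target_pdf, bins=100, burn_in=5_000):
    # Histogram estimate of the stationary density and its L2 distance
    # to the target density; on [0, 1] the integral of the squared
    # difference is approximated by the mean over equal-width bins.
    hist, edges = np.histogram(traj[burn_in:], bins=bins,
                               range=(0.0, 1.0), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return np.sqrt(np.mean((hist - target_pdf(centers)) ** 2))

# Target density on [0, 1]: Beta(2, 2), chosen only for this example.
target = lambda x: beta.pdf(x, 2, 2)

# Crude finite-difference gradient descent on theta; the same seed is
# reused within each iteration (common random numbers) to reduce noise.
theta, eps, lr = np.zeros(3), 0.1, 0.5
for it in range(20):
    base = l2_distance(simulate(theta, seed=it), target)
    grad = np.zeros_like(theta)
    for j in range(len(theta)):
        th = theta.copy()
        th[j] += eps
        grad[j] = (l2_distance(simulate(th, seed=it), target) - base) / eps
    theta -= lr * grad

print("fitted parameters:", theta)
print("final L2 distance:", l2_distance(simulate(theta, seed=123), target))
```

The finite-difference step is only a stand-in: the paper's point is precisely that the derivatives can be estimated by a sensitivity-based, simulation-driven procedure instead of repeated perturbed runs.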

Authors
Konovalov M.G. 1, Razumchik R.V. 1, 2
Publisher
Federal Research Center "Computer Science and Control" of the Russian Academy of Sciences
Issue
3
Language
Russian
Pages
2-13
Status
Published
Volume
12
Year
2018
Affiliations
  • 1 Institute of Informatics Problems, Federal Research Center “Computer Science and Control”, Russian Academy of Sciences, 44-2 Vavilov Str., Moscow, 119333, Russian Federation
  • 2 Peoples' Friendship University of Russia (RUDN University), 6 Miklukho-Maklaya Str., Moscow, 117198, Russian Federation
Keywords
Continuous state space; Control; Derivative estimation; Markov chain; Sensitivity-based approach
Date created
04.02.2019
Date modified
04.02.2019
Permanent link
https://repository.rudn.ru/ru/records/article/record/36526/