Refinement of Jensen’s inequality and estimation of f- and Rényi divergence via Montgomery identity

Jensen’s inequality is important for obtaining inequalities for divergences between probability distributions. By applying a refinement of Jensen’s inequality (Horváth et al. in Math. Inequal. Appl. 14:777–791, 2011) and introducing a new functional based on an f-divergence functional, we obtain estimates for the new functionals, the f-divergence, and the Rényi divergence. Some inequalities for Rényi and Shannon estimates are constructed. The Zipf–Mandelbrot law is used to illustrate the results. In addition, we generalize the refinement of Jensen’s inequality and obtain new inequalities for the Rényi and Shannon entropies for an m-convex function using the Montgomery identity. It is also shown that maximizing the Shannon entropy yields a transition from the Zipf–Mandelbrot law to a hybrid Zipf–Mandelbrot law. © 2018, The Author(s).
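For context, the standard discrete definitions underlying the abstract (not the paper’s refined functionals themselves) can be sketched as follows; the symbols p_i, q_i, n, N, q, s and alpha are generic notation assumed here, not necessarily the paper’s own.

\[
D_f(\mathbf{p}\,\|\,\mathbf{q}) \;=\; \sum_{i=1}^{n} q_i\, f\!\left(\frac{p_i}{q_i}\right),
\qquad f \text{ convex on } (0,\infty),\; f(1)=0,
\]
\[
D_\alpha(\mathbf{p}\,\|\,\mathbf{q}) \;=\; \frac{1}{\alpha-1}\,\log\!\left(\sum_{i=1}^{n} p_i^{\alpha} q_i^{\,1-\alpha}\right),
\qquad \alpha>0,\ \alpha\neq 1,
\]
\[
H(\mathbf{p}) \;=\; -\sum_{i=1}^{n} p_i \log p_i,
\qquad
f(i;N,q,s) \;=\; \frac{(i+q)^{-s}}{\sum_{j=1}^{N} (j+q)^{-s}},\quad i=1,\dots,N,
\]
where the last expression is the Zipf–Mandelbrot law, and Jensen’s inequality for a convex function \(\varphi\) and weights \(p_i\ge 0\), \(\sum_i p_i=1\) reads \(\varphi\!\left(\sum_i p_i x_i\right)\le\sum_i p_i\,\varphi(x_i)\). As \(\alpha\to 1\), the Rényi divergence reduces to the Kullback–Leibler divergence, which is the f-divergence with \(f(t)=t\log t\).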

Authors
Khan K.A.1, Niaz T.1,2, Pečarić Đ.3, Pečarić J.4
Publisher
Springer International Publishing
Language
English
Status
Published
Article number
318
Volume
2018
Year
2018
Affiliations
  • 1 Department of Mathematics, University of Sargodha, Sargodha, Pakistan
  • 2 Department of Mathematics, The University of Lahore, Sargodha, Pakistan
  • 3 Catholic University of Croatia, Zagreb, Croatia
  • 4 RUDN University, Moscow, Russian Federation
Keywords
Entropy; f- and Rényi divergence; Jensen’s inequality; m-convex function; Montgomery identity
Date created
04.02.2019
Date modified
04.02.2019
Permanent link
https://repository.rudn.ru/ru/records/article/record/36465/