Entropy Dimension Reduction Method for Randomized Machine Learning Problems

The direct and inverse projections (DIP) method is proposed for reducing the feature space to a given dimension; it is oriented to randomized machine learning problems and is based on a procedure of “direct” and “inverse” projection. The “projector” matrices are determined by maximizing the relative entropy. The information losses are estimated by the absolute error calculated with the Kullback–Leibler function (SRC method). An example illustrating these methods is given. © 2018, Pleiades Publishing, Ltd.
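Since this record contains only the abstract, the sketch below is merely an illustration of the general idea in plain NumPy, not the authors' DIP method: a “direct” projector maps features to a lower dimension, an “inverse” projector maps them back, and the information loss of the round trip is scored with the Kullback–Leibler function. The random projector pair and all helper names (random_projection_pair, information_loss, kl_divergence) are assumptions made for this sketch; the paper instead determines the projectors by maximizing the relative entropy.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence between two discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def random_projection_pair(n_features, n_reduced, seed=0):
    """Draw a 'direct' projector B (n_reduced x n_features) and use its
    pseudo-inverse as the 'inverse' projector; a random stand-in for the
    entropy-optimal projectors of the paper."""
    rng = np.random.default_rng(seed)
    B = rng.standard_normal((n_reduced, n_features))
    B_inv = np.linalg.pinv(B)          # shape (n_features, n_reduced)
    return B, B_inv

def information_loss(X, B, B_inv):
    """Absolute reconstruction error measured with the KL function,
    applied to the normalized (non-negative) feature profiles."""
    Z = X @ B.T                        # direct projection to the reduced space
    X_hat = Z @ B_inv.T                # inverse projection back to full space
    # shift both to a non-negative range so rows can be treated as histograms
    shift = min(X.min(), X_hat.min())
    losses = [kl_divergence(x - shift, x_hat - shift)
              for x, x_hat in zip(X, X_hat)]
    return float(np.mean(losses))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.random((100, 20))          # 100 samples, 20 features
    B, B_inv = random_projection_pair(n_features=20, n_reduced=5)
    print("mean KL information loss:", information_loss(X, B, B_inv))
```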

Authors
Popkov Y.S. (1, 2, 3), Dubnov Y.A. (1, 3, 4), Popkov A.Y. (1, 5)
Publisher
Maik Nauka Publishing / Springer SBM
Issue
11
Language
English
Pages
2038-2051
Status
Published
Volume
79
Year
2018
Affiliations
  • 1 Institute for Systems Analysis, Russian Academy of Sciences, Federal Research Center “Informatics and Control,” Moscow, Russian Federation
  • 2 Braude College of Haifa University, Carmiel, Israel
  • 3 National Research University “Higher School of Economics,” Moscow, Russian Federation
  • 4 Moscow Institute of Physics and Technology, Moscow, Russian Federation
  • 5 Peoples’ Friendship University, Moscow, Russian Federation
Keywords
direct and inverse projections; entropy; gradient method; matrix derivatives; projection operators; relative entropy
Date created
04.02.2019
Date modified
04.02.2019
Permanent link
https://repository.rudn.ru/ru/records/article/record/36263/