Entropy Dimension Reduction Method for Randomized Machine Learning Problems

The direct and inverse projections (DIP) method is proposed for reducing the feature space to a given dimension; it is oriented toward problems of randomized machine learning and is based on a procedure of "direct" and "inverse" projection. The "projector" matrices are determined by maximizing the relative entropy. The information losses are estimated by the absolute error calculated with the Kullback–Leibler function (SRC method). An example illustrating these methods is given. © 2018, Pleiades Publishing, Ltd.
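Since the abstract only outlines the scheme, the following is a minimal NumPy sketch of the general idea, not the authors' algorithm: a "direct" projector P and an "inverse" projector Q are assumed (chosen here via SVD purely for illustration, whereas the paper determines them by maximizing relative entropy), and the Kullback–Leibler function is used as the information-loss estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 samples with 10 features, made positive so that each feature
# column can be normalized into a discrete distribution for the KL estimate.
X = np.abs(rng.normal(size=(200, 10))) + 1e-6


def kl_divergence(p, q):
    """Kullback-Leibler divergence between two discrete distributions."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log(p / q)))


# Hypothetical "direct" projector P (10 -> 3) and "inverse" projector Q (3 -> 10).
# Here they come from an SVD-based choice purely for illustration; the paper
# instead determines the projector matrices by maximizing relative entropy.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
P = Vt[:3].T                    # direct projection matrix, shape (10, 3)
Q = P.T                         # inverse projection matrix, shape (3, 10)

Y = X @ P                       # reduced-dimension representation
X_rec = np.abs(Y @ Q) + 1e-6    # reconstruction via the inverse projection

# Information loss estimated feature-wise with the KL function, then averaged.
losses = [kl_divergence(X[:, j], X_rec[:, j]) for j in range(X.shape[1])]
print("mean KL-based information loss:", np.mean(losses))
```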

Authors
Popkov Y.S. 1, 2, 3; Dubnov Y.A. 1, 3, 4; Popkov A.Y. 1, 5
Publisher
Maik Nauka Publishing / Springer SBM
Number of issue
11
Language
English
Pages
2038-2051
Status
Published
Volume
79
Year
2018
Organizations
  • 1 Institute for Systems Analysis, Russian Academy of Sciences, Federal Research Center “Informatics and Control,” Moscow, Russian Federation
  • 2 Braude College of Haifa University, Carmiel, Israel
  • 3 National Research University “Higher School of Economics,” Moscow, Russian Federation
  • 4 Moscow Institute of Physics and Technology, Moscow, Russian Federation
  • 5 Peoples’ Friendship University, Moscow, Russian Federation
Keywords
direct and inverse projections; entropy; gradient method; matrix derivatives; projection operators; relative entropy
