Generalized informational entropy and noncanonical distribution in equilibrium statistical mechanics

Based on the Jaynes principle of maximum informational entropy, we find a generalized probability distribution and construct a generalized equilibrium statistical mechanics (ESM) for a wide class of objects to which the usual (canonical) ESM cannot be applied. We consistently consider the case of a continuous, not discrete, random variable characterizing the state of the object. For large values of the argument, the resulting distribution is characterized by a power-law, not exponential, asymptotic behavior, and the corresponding power-law asymptotic expression agrees with the empirical laws established for these objects. The ε-deformed Boltzmann-Gibbs-Shannon functional, which satisfies the requirements of the entropy axiomatics and leads to the canonical ESM for ε = 0, is used as the original entropy functional. We also consider nonlinear transformations of this functional. We show that depending on how the averages of the dynamical characteristics of the object are defined, different (Tsallis, Renyi, and Hardy-Littlewood-Pólya) versions of the generalized ESM can be used, and we give a comparative analysis of them. We find conditions under which the Gibbs-Helmholtz thermodynamic relations hold and the Legendre transformation can be applied to the generalized entropy and the Massieu-Planck function. We consider the Tsallis and Renyi ESM versions in detail for the case of a one-dimensional probabilistic object with a single dynamical characteristic whose role is played by a generalized positive "energy" with monotonic power growth. We obtain constraints on the Renyi index under which the equilibrium distribution belongs to a definite class of stable Gaussian or Levy-Khinchin distributions.
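For orientation, the standard textbook forms of the two generalized entropy functionals named in the abstract can be written in the common q-notation; the canonical case is recovered as q → 1, analogous to the paper's ε = 0, although this parametrization is an assumption here and need not coincide with the paper's ε-deformed functional. For a continuous random variable x with probability density p(x),

\[
S_T^{(q)}[p] = \frac{1}{q-1}\left(1 - \int p^{q}(x)\,dx\right),
\qquad
S_R^{(q)}[p] = \frac{1}{1-q}\,\ln\!\int p^{q}(x)\,dx ,
\]

both of which reduce to the Boltzmann-Gibbs-Shannon functional $-\int p(x)\ln p(x)\,dx$ as $q \to 1$. Maximizing such a functional subject to normalization and a mean-"energy" constraint yields, schematically, a q-exponential distribution

\[
p(x) \;\propto\; \bigl[\,1 - (1-q)\,\beta\,\varepsilon(x)\,\bigr]_{+}^{1/(1-q)},
\]

whose large-argument behavior for $q > 1$ is the power law $p(x) \sim \varepsilon(x)^{-1/(q-1)}$ rather than the exponential Gibbs factor $e^{-\beta\varepsilon(x)}$ obtained at $q = 1$. The precise form of the maximizer depends on how the averages of $\varepsilon(x)$ are defined, which is one of the points the paper compares across the Tsallis, Renyi, and Hardy-Littlewood-Pólya versions.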

Authors
Issue number: 1
Language: English
Pages: 451-496
Status: Published
Volume: 135
Year: 2003
Organizations
  • 1 Peoples' Friendship Univ. of Russia, Moscow, Russian Federation
Keywords
Equilibrium statistical mechanics; Jaynes maximum entropy principle; Levy-Khinchin distribution; Renyi entropy; Shannon entropy; Tsallis entropy