Gibbs algorithm
In statistical mechanics, the Gibbs algorithm, introduced by J. Willard Gibbs in 1902, is a criterion for choosing a probability distribution for the statistical ensemble of microstates of a thermodynamic system by minimizing the average log probability
<math> \langle\ln p_i\rangle = \sum_i p_i \ln p_i \, </math>
subject to the probability distribution <math>p_i</math> satisfying a set of constraints (usually expectation values) corresponding to the known macroscopic quantities.[1] In 1948, Claude Shannon interpreted the negative of this quantity, which he called information entropy, as a measure of the uncertainty in a probability distribution.[1] In 1957, E. T. Jaynes realized that this quantity could be interpreted as missing information about any system, and generalized the Gibbs algorithm to non-equilibrium systems with the principle of maximum entropy and maximum entropy thermodynamics.[1]
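The procedure above can be sketched numerically. The following minimal example (an illustration, not part of the original article; the function name and the bisection search are choices made here) finds the maximum entropy distribution over a discrete set of energy levels subject to a single constraint, a fixed average energy. The solution has the Boltzmann form <math>p_i \propto e^{-\beta E_i}</math>, with the Lagrange multiplier β tuned so the constraint holds.

```python
import math

def gibbs_distribution(energies, mean_energy, lo=-50.0, hi=50.0):
    """Maximum-entropy distribution over discrete states subject to a
    fixed average energy: p_i proportional to exp(-beta * E_i), with beta
    found by bisection so that sum_i p_i * E_i equals mean_energy.
    (Illustrative sketch; names and the solver are choices made here.)"""
    def avg_energy(beta):
        weights = [math.exp(-beta * e) for e in energies]
        z = sum(weights)  # partition function for this beta
        return sum(w * e for w, e in zip(weights, energies)) / z

    # avg_energy(beta) decreases monotonically as beta grows, so bisect.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if avg_energy(mid) > mean_energy:
            lo = mid  # average too high: need larger beta
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)
    return [w / z for w in weights], beta

# Three levels with energies 0, 1, 2 and a target average energy of 0.5:
p, beta = gibbs_distribution([0.0, 1.0, 2.0], mean_energy=0.5)
entropy = -sum(pi * math.log(pi) for pi in p)
```

Because the target average (0.5) lies below the uniform-distribution average (1.0), the resolved β is positive and the probabilities decrease with energy; any other distribution meeting the same constraint has lower entropy.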
Physicists call the result of applying the Gibbs algorithm the Gibbs distribution for the given constraints; the most notable case is Gibbs's grand canonical ensemble for open systems, obtained when both the average energy and the average number of particles are constrained (see also partition function).
The general result of the Gibbs algorithm is a maximum entropy probability distribution. Statisticians identify such distributions as belonging to exponential families.
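The link to exponential families can be made explicit. Carrying out the constrained minimization with Lagrange multipliers (a standard derivation, summarized here rather than quoted from the article) for constraints <math>\langle f_k \rangle = F_k</math> gives

```latex
p_i = \frac{1}{Z(\lambda_1,\dots,\lambda_m)}
      \exp\!\Bigl(-\sum_{k=1}^{m} \lambda_k f_k(i)\Bigr),
\qquad
Z(\lambda_1,\dots,\lambda_m) = \sum_i \exp\!\Bigl(-\sum_{k=1}^{m} \lambda_k f_k(i)\Bigr),
```

which is precisely the exponential-family form. Taking <math>f_1(i) = E_i</math> and <math>f_2(i) = N_i</math> recovers the grand canonical ensemble, with the multipliers identified (up to factors of <math>k_B T</math>) with inverse temperature and chemical potential.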
References