Gauss–Kuzmin distribution


In mathematics, the Gauss–Kuzmin distribution is a discrete probability distribution that arises as the limit probability distribution of the coefficients in the continued fraction expansion of a random variable uniformly distributed in (0, 1).[1] The distribution is named after Carl Friedrich Gauss, who derived it around 1800,[2] and Rodion Kuzmin, who gave a bound on the rate of convergence in 1928.[3][4] It is given by the probability mass function

<math> p(k) = - \log_2 \left( 1 - \frac{1}{(1+k)^2}\right)~.</math>
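
The mass function is straightforward to evaluate numerically. A minimal Python sketch (the helper name gauss_kuzmin_pmf is an illustrative choice, not standard notation) that tabulates the first few probabilities and checks that they sum to 1:

<syntaxhighlight lang="python">
import math

def gauss_kuzmin_pmf(k: int) -> float:
    """Probability that a continued-fraction coefficient equals k (k >= 1)."""
    return -math.log2(1.0 - 1.0 / (1.0 + k) ** 2)

# First few probabilities: p(1) ~ 0.4150, p(2) ~ 0.1699, p(3) ~ 0.0931, ...
print([round(gauss_kuzmin_pmf(k), 4) for k in range(1, 6)])

# The partial sum telescopes to log2(2(K+1)/(K+2)), so it approaches 1.
print(sum(gauss_kuzmin_pmf(k) for k in range(1, 100001)))
</syntaxhighlight>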

Gauss–Kuzmin theorem

Let

<math> x = \cfrac{1}{k_1 + \cfrac{1}{k_2 + \cdots}} </math>

be the continued fraction expansion of a random number x uniformly distributed in (0, 1). Then

<math> \lim_{n \to \infty} \mathbb{P} \left\{ k_n = k \right\} = - \log_2\left(1 - \frac{1}{(k+1)^2}\right)~.</math>

Equivalently, let

<math> x_n = \cfrac{1}{k_{n+1} + \cfrac{1}{k_{n+2} + \cdots}}~; </math>

then

<math> \Delta_n(s) = \mathbb{P} \left\{ x_n \leq s \right\} - \log_2(1+s) </math>

tends to zero as n tends to infinity.
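
The theorem can be illustrated numerically: sample x uniformly, apply the Gauss map x ↦ {1/x} repeatedly to reach the n-th coefficient, and compare the empirical frequencies with the limiting law. The sketch below is an illustrative Monte Carlo check, not part of the theorem; the helper nth_cf_coefficient, the depth n = 8 and the sample size are arbitrary choices, and it assumes double precision remains adequate after a few Gauss-map steps:

<syntaxhighlight lang="python">
import math
import random
from collections import Counter

def nth_cf_coefficient(x: float, n: int) -> int:
    """n-th continued-fraction coefficient of x in (0, 1), via repeated Gauss maps."""
    for _ in range(n - 1):
        x = 1.0 / x
        x -= int(x)          # Gauss map: x -> fractional part of 1/x
    return int(1.0 / x)      # assumes x never hits exactly 0 (essentially certain here)

random.seed(0)
N = 200_000
n = 8                        # deep enough to be near the limit, shallow enough for float64
counts = Counter(nth_cf_coefficient(random.random(), n) for _ in range(N))

for k in range(1, 6):
    empirical = counts[k] / N
    limit = -math.log2(1.0 - 1.0 / (1.0 + k) ** 2)
    print(f"k={k}: empirical {empirical:.4f}   Gauss-Kuzmin {limit:.4f}")
# The two columns should agree to roughly two decimal places at this sample size.
</syntaxhighlight>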

Rate of convergence

In 1928, Kuzmin showed that, for some constants C > 0 and α > 0,

<math> |\Delta_n(s)| \leq C \exp(-\alpha \sqrt{n})~. </math>

In 1929, Paul Lévy[5] improved it to

<math> |\Delta_n(s)| \leq C \, 0.7^n~. </math>

Later, Eduard Wirsing showed[6] that, for λ = 0.30366... (the Gauss–Kuzmin–Wirsing constant), the limit

<math> \Psi(s) = \lim_{n \to \infty} \frac{\Delta_n(s)}{(-\lambda)^n} </math>

exists for every s in [0, 1], and the function Ψ(s) is analytic and satisfies Ψ(0) = Ψ(1) = 0. Further bounds were proved by K. I. Babenko.[7]
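
The first few Δ_n can be computed directly. Writing G_n(s) = P{x_n ≤ s}, the relation x_{n+1} = 1/x_n − k_{n+1} leads to Gauss's recursion G_{n+1}(s) = Σ_{k≥1} [G_n(1/k) − G_n(1/(k+s))] with G_0(s) = s, and G_1(s) = ψ(1+s) + γ in terms of the digamma function. The sketch below is a numerical illustration using NumPy/SciPy; the test point s = 1/2 and the truncation depth K are arbitrary choices. It evaluates Δ_0, Δ_1 and Δ_2 and their successive ratios:

<syntaxhighlight lang="python">
import numpy as np
from scipy.special import digamma

def G1(u):
    """Distribution function of x_1: G_1(u) = psi(1 + u) + Euler-Mascheroni gamma."""
    return digamma(1.0 + u) + np.euler_gamma

S = 0.5
K = 10**6                                   # truncation of the sum over k; tail error ~ s/K
k = np.arange(1, K + 1, dtype=np.float64)
gauss = np.log2(1.0 + S)                    # limiting (Gauss) measure of [0, s]

delta0 = S - gauss                          # Delta_0(s): uniform start
delta1 = G1(S) - gauss                      # Delta_1(s): after one step of the recursion
delta2 = np.sum(G1(1.0 / k) - G1(1.0 / (k + S))) - gauss   # Delta_2(s): one more step

print(f"Delta_0(1/2) = {delta0:+.6f}")
print(f"Delta_1(1/2) = {delta1:+.6f}   ratio {delta1 / delta0:+.4f}")
print(f"Delta_2(1/2) = {delta2:+.6f}   ratio {delta2 / delta1:+.4f}")
# |Delta_n| shrinks roughly geometrically; the ratios head toward -lambda ~ -0.30366,
# well inside Levy's 0.7^n bound.
</syntaxhighlight>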
