Cramér's theorem (large deviations)

Cramér's theorem is a fundamental result in the theory of large deviations, a subdiscipline of probability theory. It determines the rate function of a sequence of independent and identically distributed (iid) random variables. A weak version of this result was first shown by Harald Cramér in 1938.

Statement

The logarithmic moment generating function (which is the cumulant-generating function) of a random variable <math>X_1</math> is defined as:

<math> \Lambda(t)=\log \operatorname E [\exp(tX_1)]. </math>
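
For instance, if <math>X_1</math> is a standard normal random variable, then <math>\operatorname E [\exp(tX_1)] = \exp(t^2/2)</math>, so <math>\Lambda(t) = t^2/2</math>, which is finite for every <math>t \in \mathbb R</math>.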

Let <math> X_1, X_2, \dots </math> be a sequence of iid real random variables with finite logarithmic moment generating function, i.e. <math> \Lambda(t) < \infty </math> for all <math> t \in \mathbb R </math>.

Then the Legendre transform of <math> \Lambda </math>:

<math> \Lambda^*(x):= \sup_{t \in \mathbb R} \left(tx-\Lambda(t) \right) </math>

satisfies

<math> \lim_{n \to \infty} \frac 1n \log \left(P\left(\sum_{i=1}^n X_i \geq nx \right)\right) = -\Lambda^*(x) </math>

for all <math> x > \operatorname E[X_1]. </math>
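
Continuing the standard normal example, <math>\Lambda^*(x) = \sup_{t \in \mathbb R}\left(tx - \tfrac{t^2}{2}\right) = \tfrac{x^2}{2}</math> (the supremum is attained at <math>t = x</math>), so for every <math>x > 0 = \operatorname E[X_1]</math> the theorem yields

<math> P\left(\sum_{i=1}^n X_i \geq nx \right) = e^{-n x^2/2 + o(n)}. </math>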

In the terminology of the theory of large deviations, the result can be reformulated as follows:

If <math> X_1, X_2, \dots </math> is a sequence of iid random variables, then the distributions <math> \left(\mathcal L ( \tfrac 1n \sum_{i=1}^n X_i) \right)_{n \in \mathbb N}</math> satisfy a large deviation principle with rate function <math> \Lambda^* </math>.
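
The limit can also be checked numerically in simple cases. The sketch below is an illustration added here, not part of the article: the helper names Lambda and Lambda_star are arbitrary, and NumPy and SciPy are assumed to be available. It takes fair Bernoulli variables, computes <math>\Lambda^*(x)</math> by numerically maximising <math>tx - \Lambda(t)</math>, and compares <math>\tfrac 1n \log P\left(\sum_{i=1}^n X_i \geq nx \right)</math>, evaluated exactly from the binomial distribution, with <math>-\Lambda^*(x)</math>.

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import binom

def Lambda(t):
    # Logarithmic moment generating function of a fair Bernoulli variable:
    # log E[exp(t X_1)] = log((1 + e^t) / 2).
    return np.log((1.0 + np.exp(t)) / 2.0)

def Lambda_star(x):
    # Legendre transform sup_t (t x - Lambda(t)), computed numerically.
    res = minimize_scalar(lambda t: Lambda(t) - t * x,
                          bounds=(-50.0, 50.0), method="bounded")
    return -res.fun

x = 0.6  # threshold above the mean E[X_1] = 1/2
print(f"-Lambda*(x)          = {-Lambda_star(x):.5f}")
for n in (100, 1_000, 10_000):
    # P(sum_{i=1}^n X_i >= n x): the sum is Binomial(n, 1/2), so the tail
    # probability can be evaluated exactly instead of by simulation.
    tail = binom.sf(int(np.ceil(n * x)) - 1, n, 0.5)
    print(f"n = {n:>6}: (1/n) log P = {np.log(tail) / n:.5f}")
</syntaxhighlight>

Evaluating the binomial tail exactly avoids the usual difficulty that plain Monte Carlo cannot resolve probabilities that are exponentially small in <math>n</math>; the agreement with <math>-\Lambda^*(x)</math> improves only slowly as <math>n</math> grows, because of subexponential prefactors.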
