Bennett's inequality

In probability theory, Bennett's inequality provides an upper bound on the probability that the sum of independent random variables deviates from its expected value by more than any specified amount. Bennett's inequality was proved by George Bennett of the University of New South Wales in 1962.[1]

Statement

Let <math>X_1, \dots, X_n</math> be independent random variables with finite variance. Further assume <math>X_i \leq a</math> almost surely for all <math>i</math>, and define <math> S_n = \sum_{i = 1}^n \left[X_i - \operatorname{E}(X_i)\right] </math> and <math> \sigma^2 = \sum_{i=1}^n \operatorname{E}(X_i-\operatorname{E} X_i)^2.</math> Then for any <math>t \geq 0</math>,

<math>\Pr\left( S_n > t \right) \leq \exp\left( - \frac{\sigma^2}{a^2} h\left(\frac{at}{\sigma^2} \right)\right),</math>

where <math>h(u) = (1+u)\log(1+u) - u</math> and <math>\log</math> denotes the natural logarithm.[2][3]
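As a concrete illustration (not part of the original article), the following minimal Python sketch evaluates the right-hand side of the bound for given values of <math>\sigma^2</math>, <math>a</math> and <math>t</math>; the function name and the sample numbers are illustrative assumptions.

<syntaxhighlight lang="python">
import math

def bennett_bound(sigma2, a, t):
    """Evaluate exp(-(sigma^2 / a^2) * h(a*t / sigma^2)) with h(u) = (1+u)log(1+u) - u."""
    u = a * t / sigma2
    h = (1 + u) * math.log(1 + u) - u
    return math.exp(-(sigma2 / a**2) * h)

# Illustrative example: 100 summands, each with variance 0.09 and almost-sure bound a = 1
print(bennett_bound(sigma2=100 * 0.09, a=1.0, t=15.0))  # upper bound on Pr(S_n > 15)
</syntaxhighlight>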

Generalizations and comparisons to other bounds

For generalizations, see Freedman (1975)[4] for a martingale version of Bennett's inequality and Fan, Grama and Liu (2012)[5] for its improvement.

Hoeffding's inequality only assumes the summands are bounded almost surely, while Bennett's inequality offers some improvement when the variances of the summands are small compared to their almost sure bounds. However, Hoeffding's inequality entails sub-Gaussian tails, whereas Bennett's inequality in general has Poissonian tails.

Bennett's inequality is most similar to the Bernstein inequalities, the first of which also gives concentration in terms of the variance and almost sure bound on the individual terms. Bennett's inequality is stronger than this bound, but more complicated to compute.[3]
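For reference, the first Bernstein inequality referred to here can be written in the notation of the statement above as (a standard form, supplied here for comparison rather than taken from the original article):

<math>\Pr\left( S_n > t \right) \leq \exp\left( - \frac{t^2}{2\left(\sigma^2 + at/3\right)} \right).</math>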

In both inequalities, unlike some other inequalities or limit theorems, there is no requirement that the component variables have identical or similar distributions.

Example

Suppose that each <math>X_i</math> is an independent binary random variable with probability <math>p</math>. Then Bennett's inequality says that:

<math>\Pr\left( \sum_{i = 1}^n X_i > pn + t \right) \leq \exp\left( - np\, h\left(\frac{t}{np}\right)\right).</math>

For <math>t \geq 10 np</math>, <math>h\left(\frac{t}{np}\right) \geq \frac{t}{2np} \log \frac{t}{np},</math> so

<math>\Pr\left( \sum_{i = 1}^n X_i > pn + t \right) \leq \exp\left( -\frac{t}{2} \log \frac{t}{np}\right) = \left(\frac{t}{np}\right)^{-t/2}</math> for <math>t \geq 10 np</math>.

By contrast, Hoeffding's inequality gives a bound of <math>\exp(-2 t^2/n)</math> and the first Bernstein inequality gives a bound of <math>\exp(-\frac{t^2}{2np + 2t/3})</math>. For <math>t \gg np</math>, Hoeffding's inequality gives <math>\exp(-\Theta(t^2/n))</math>, Bernstein gives <math>\exp(-\Theta(t))</math>, and Bennett gives <math>\exp(-\Theta(t \log \frac{t}{np}))</math>.
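To make this comparison concrete, the following minimal Python sketch evaluates the three bounds side by side; the particular values of n, p and t are illustrative assumptions, not from the original article.

<syntaxhighlight lang="python">
import math

def bennett(n, p, t):
    # exp(-np * h(t / np)) with h(u) = (1 + u) log(1 + u) - u
    u = t / (n * p)
    return math.exp(-n * p * ((1 + u) * math.log(1 + u) - u))

def hoeffding(n, t):
    # exp(-2 t^2 / n)
    return math.exp(-2 * t**2 / n)

def bernstein(n, p, t):
    # exp(-t^2 / (2np + 2t/3))
    return math.exp(-t**2 / (2 * n * p + 2 * t / 3))

# Illustrative regime with t much larger than np = 10
n, p, t = 1000, 0.01, 100
print(bennett(n, p, t), hoeffding(n, t), bernstein(n, p, t))
</syntaxhighlight>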

References

