Circular uniform distribution


In probability theory and directional statistics, a circular uniform distribution is a probability distribution on the unit circle whose density is uniform for all angles.

Description

Definition

The probability density function (pdf) of the circular uniform distribution, e.g. with <math>\theta\in[0,2\pi)</math>, is:

<math>f_{UC}(\theta)=\frac{1}{2\pi}.</math>

Moments with respect to a parametrization

We consider the circular variable <math>z=e^{i\theta}</math> with <math>z=1</math> at base angle <math>\theta=0</math>. In these terms, the circular moments of the circular uniform distribution are all zero, except for <math>m_0</math>:

<math>\langle z^n\rangle=\delta_n</math>

where <math>\delta_n</math> is the Kronecker delta symbol.

Descriptive statistics

The mean angle is undefined, and the length of the mean resultant is zero:

<math>R=|\langle z^n\rangle|=0.</math>

Distribution of the mean

[Figure: CircUniformDistOfMean.svg — a 10,000-point Monte Carlo simulation of the distribution of the sample mean of a circular uniform distribution for N = 3.]

[Figure: probability densities <math>P_N(\overline{R})</math> for the circular mean magnitude at small values of <math>N</math>. Densities for <math>N>3</math> are normalised to the maximum density; those for <math>N=1</math> and <math>N=2</math> are scaled to aid visibility.]

The sample mean of a set of N measurements <math>z_n=e^{i\theta_n}</math> drawn from a circular uniform distribution is defined as:

<math>\overline{z} = \frac{1}{N}\sum_{n=1}^N z_n = \overline{C}+i\overline{S} = \overline{R}e^{i\overline{\theta}}</math>

where the average sine and cosine are:[1]

<math>\overline{C}=\frac{1}{N}\sum_{n=1}^N \cos(\theta_n)\qquad\qquad\overline{S}=\frac{1}{N}\sum_{n=1}^N \sin(\theta_n)</math>

and the average resultant length is:

<math>\overline{R}^2=|\overline{z}|^2=\overline{C}^2+\overline{S}^2</math>

and the mean angle is:

<math>\overline{\theta}=\mathrm{Arg}(\overline{z}).</math>
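The quantities above translate directly into code. The following is a minimal sketch (assuming NumPy; the helper name `circular_sample_mean` is invented for illustration) that computes <math>\overline{C}</math>, <math>\overline{S}</math>, <math>\overline{R}</math> and <math>\overline{\theta}</math> from a set of angles:

```python
import numpy as np

def circular_sample_mean(theta):
    """Return (C_bar, S_bar, R_bar, theta_bar) for an array of angles in radians."""
    C = np.mean(np.cos(theta))      # average cosine, C_bar
    S = np.mean(np.sin(theta))      # average sine, S_bar
    R = np.hypot(C, S)              # mean resultant length, sqrt(C_bar^2 + S_bar^2)
    mean_angle = np.arctan2(S, C)   # mean angle, Arg(z_bar)
    return C, S, R, mean_angle
```

For identical angles the resultant length is 1; for angles spread evenly around the circle it is 0, reflecting that <math>\overline{R}</math> measures concentration rather than spread.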

The sample mean for the circular uniform distribution will be concentrated about zero, becoming more concentrated as N increases. The distribution of the sample mean for the uniform distribution is given by:[2]

<math>\frac{1}{(2\pi)^N}\int_\Gamma \prod_{n=1}^N d\theta_n = P(\overline{R})P(\overline{\theta})\,d\overline{R}\,d\overline{\theta}</math>

where <math>\Gamma</math> consists of intervals of <math>2\pi</math> in the variables, subject to the constraint that <math>\overline{R}</math> and <math>\overline{\theta}</math> are constant, or, alternatively, that <math>\overline{C}</math> and <math>\overline{S}</math> are constant. The distribution of the angle <math>P(\overline{\theta})</math> is uniform:

<math>P(\overline{\theta})=\frac{1}{2\pi}</math>

and the distribution of <math>\overline{R}</math> is given by:[2]

<math>P_N(\overline{R})=N^2\overline{R}\int_0^\infty J_0(N\overline{R}\,t)J_0(t)^N t\,dt</math>

where <math>J_0</math> is the zeroth-order Bessel function of the first kind. There is no known general analytic solution for the above integral, and it is difficult to evaluate numerically because of the rapid oscillation of the integrand. A 10,000-point Monte Carlo simulation of the distribution of the mean for N = 3 is shown in the figure.
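A simulation along the lines of the one in the figure is straightforward to reproduce. This is a sketch only (NumPy assumed; the seed and bin count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
N, trials = 3, 10_000                          # N angles per sample mean, 10,000 trials
theta = rng.uniform(0.0, 2.0 * np.pi, size=(trials, N))
z_bar = np.mean(np.exp(1j * theta), axis=1)    # complex sample means
R_bar = np.abs(z_bar)                          # mean resultant lengths

# A density-normalised histogram of R_bar approximates P_3(R_bar).
hist, edges = np.histogram(R_bar, bins=50, range=(0.0, 1.0), density=True)
```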

For certain special cases, the above integral can be evaluated in closed form; for example, for <math>N=2</math>:

<math>P_2(\overline{R})=\frac{2}{\pi \sqrt{1-\overline{R}^2}}.</math>
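This density integrates to the closed-form CDF <math>(2/\pi)\arcsin(\overline{R})</math>, which gives a simple consistency check against simulation (a Python sketch, NumPy assumed; the evaluation point <math>r=0.5</math> is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(2)
theta = rng.uniform(0.0, 2.0 * np.pi, size=(200_000, 2))
R_bar = np.abs(np.mean(np.exp(1j * theta), axis=1))  # resultant lengths for N = 2

# Empirical CDF at r = 0.5 versus the analytic value (2/pi)*arcsin(0.5) = 1/3.
r = 0.5
empirical = np.mean(R_bar <= r)
analytic = (2.0 / np.pi) * np.arcsin(r)
```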

For large N, the distribution of the mean can be determined from the central limit theorem for directional statistics. Since the angles are uniformly distributed, the individual sines and cosines of the angles will be distributed as:

<math>P(u)\,du=\frac{1}{\pi}\,\frac{du}{\sqrt{1-u^2}}</math>

where <math>u=\cos\theta_n</math> or <math>\sin\theta_n</math>. It follows that they have zero mean and a variance of 1/2. By the central limit theorem, in the limit of large N, <math>\overline{C}</math> and <math>\overline{S}</math>, each being the mean of a large number of i.i.d. variables, will be normally distributed with mean zero and variance <math>1/(2N)</math>. The mean resultant length <math>\overline{R}</math>, being the square root of the sum of squares of two independent normally distributed variables, will then be chi-distributed with two degrees of freedom (i.e. Rayleigh-distributed):

<math>\lim_{N\rightarrow\infty}P_N(\overline{R})=2N\overline{R}\,e^{-N\overline{R}^2}.</math>
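The Rayleigh limit can likewise be checked numerically: its CDF is <math>1-e^{-N\overline{R}^2}</math>, and for moderately large <math>N</math> the empirical CDF of simulated <math>\overline{R}</math> values should already be close. A sketch under those assumptions (NumPy; the choices N = 200 and 20,000 trials are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
N, trials = 200, 20_000
theta = rng.uniform(0.0, 2.0 * np.pi, size=(trials, N))
R_bar = np.abs(np.mean(np.exp(1j * theta), axis=1))

# Compare the empirical CDF at r = 1/sqrt(N) with the Rayleigh-limit value
# 1 - exp(-N r^2) = 1 - 1/e.
r = 1.0 / np.sqrt(N)
empirical = np.mean(R_bar <= r)
analytic = 1.0 - np.exp(-N * r**2)
```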

Entropy

The differential information entropy of the uniform distribution is simply

<math>H_U=-\int_\Gamma \frac{1}{2\pi}\ln\left(\frac{1}{2\pi}\right)\,d\theta = \ln(2\pi)</math>

where <math>\Gamma</math> is any interval of length <math>2\pi</math>. This is the maximum entropy any circular distribution may have.


References
