Detrended fluctuation analysis

In stochastic processes, chaos theory and time series analysis, detrended fluctuation analysis (DFA) is a method for determining the statistical self-affinity of a signal. It is useful for analysing time series that appear to be long-memory processes (diverging correlation time, e.g. power-law decaying autocorrelation function) or 1/f noise.

The obtained exponent is similar to the Hurst exponent, except that DFA may also be applied to signals whose underlying statistics (such as mean and variance) or dynamics are non-stationary (changing with time). It is related to measures based upon spectral techniques such as autocorrelation and Fourier transform.

Peng et al. introduced DFA in 1994 in a paper that has been cited over 3,000 times as of 2022[1] and represents an extension of the (ordinary) fluctuation analysis (FA), which is affected by non-stationarities.

Definition

File:Detrended fluctuation analysis, illustrated with Brownian motion.png
DFA on a Brownian motion process, with increasing values of <math>n</math>.

Algorithm

Given: a time series <math>x_1, x_2, ..., x_N</math>.

Compute its average value <math>\langle x\rangle = \frac 1N \sum_{t=1}^N x_t</math>.

Sum it into a process <math>X_t=\sum_{i=1}^t (x_i-\langle x\rangle)</math>. This is the cumulative sum, or profile, of the original time series. For example, the profile of an i.i.d. white noise is a standard random walk.
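For illustration, a minimal NumPy sketch of this step (the function name profile is illustrative, not taken from any particular library):

<syntaxhighlight lang="python">
import numpy as np

def profile(x):
    """Cumulative sum of deviations from the mean (the profile X_t of the series)."""
    x = np.asarray(x, dtype=float)
    return np.cumsum(x - x.mean())
</syntaxhighlight>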

Select a set <math>T = \{n_1, ..., n_k\}</math> of integers, such that <math>n_1 < n_2 < \cdots < n_k</math>, the smallest <math>n_1 \approx 4</math>, the largest <math>n_k \approx N</math>, and the sequence is roughly distributed evenly in log-scale: <math>\log(n_2) - \log(n_1) \approx \log(n_3) - \log(n_2) \approx \cdots</math>. In other words, it is approximately a geometric progression.[2]
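One possible way to build such a set in practice is sketched below; the bounds 4 and N//4 are common practical choices, while the description above allows the largest window to approach N:

<syntaxhighlight lang="python">
import numpy as np

def window_sizes(N, n_min=4, n_max=None, num=20):
    """Approximately geometrically spaced integer window lengths."""
    if n_max is None:
        n_max = N // 4  # a common practical upper bound; windows up to ~N are also allowed
    return np.unique(np.geomspace(n_min, n_max, num=num).astype(int))
</syntaxhighlight>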

For each <math>n \in T</math>, divide the sequence <math>X_t</math> into consecutive segments of length <math>n</math>. Within each segment, compute the least squares straight-line fit (the local trend). Let <math>Y_{1,n}, Y_{2,n}, ..., Y_{N,n}</math> be the resulting piecewise-linear fit.
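The local trend within one segment is an ordinary least-squares line fit, for example (illustrative sketch):

<syntaxhighlight lang="python">
import numpy as np

def local_trend(segment):
    """Least-squares straight-line fit to one segment of the profile."""
    t = np.arange(len(segment))
    slope, intercept = np.polyfit(t, segment, deg=1)
    return slope * t + intercept
</syntaxhighlight>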

Compute the root-mean-square deviation from the local trend (the local fluctuation):<math display="block">F( n, i) = \sqrt{\frac{1}{n}\sum_{t = (i-1)n+1}^{in} \left( X_t - Y_{t, n} \right)^2}.</math>Their root-mean-square over all segments is the total fluctuation:

<math>F( n ) = \sqrt{\frac{1}{N/n}\sum_{i = 1}^{N/n} F(n, i)^2}.</math>

(If <math>N</math> is not divisible by <math>n</math>, then one can either discard the remainder of the sequence, or repeat the procedure on the reversed sequence, then take their root-mean-square.[3])
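A sketch of the fluctuation computation for a single window length n, handling a possible remainder by averaging a forward and a reversed pass as described above (function names are illustrative):

<syntaxhighlight lang="python">
import numpy as np

def fluctuation(X, n):
    """Total fluctuation F(n) of a profile X for window length n."""
    def squared_residuals(Y):
        t = np.arange(n)
        out = []
        for i in range(len(Y) // n):             # complete segments only
            seg = Y[i * n:(i + 1) * n]
            coef = np.polyfit(t, seg, deg=1)     # local linear trend
            out.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        return out
    # forward pass plus a pass over the reversed profile, so that the remainder
    # of the sequence is not simply discarded
    sq = squared_residuals(X) + squared_residuals(X[::-1])
    return np.sqrt(np.mean(sq))
</syntaxhighlight>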

Plot <math>\log F(n)</math> against <math>\log n</math> (the log-log plot).[4][5]
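Putting the steps together, a self-contained minimal DFA sketch, not a reference implementation: N//4 is used as a practical upper window bound, and the exponent is obtained by a least-squares fit on the log-log plot.

<syntaxhighlight lang="python">
import numpy as np

def dfa(x, n_min=4, num=20):
    """Return window sizes n, fluctuations F(n), and the fitted exponent alpha."""
    x = np.asarray(x, dtype=float)
    X = np.cumsum(x - x.mean())                       # profile
    N = len(X)
    ns = np.unique(np.geomspace(n_min, N // 4, num=num).astype(int))
    F = []
    for n in ns:
        t = np.arange(n)
        sq = []
        for Y in (X, X[::-1]):                        # forward and reversed passes
            for i in range(N // n):
                seg = Y[i * n:(i + 1) * n]
                coef = np.polyfit(t, seg, deg=1)      # local linear detrending
                sq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(sq)))
    alpha = np.polyfit(np.log(ns), np.log(F), deg=1)[0]   # slope of the log-log plot
    return ns, np.array(F), alpha
</syntaxhighlight>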

Interpretation

A straight line of slope <math>\alpha</math> on the log-log plot indicates a statistical self-affinity of form <math>F(n) \propto n^{\alpha}</math>. Since <math>F(n)</math> monotonically increases with <math>n</math>, we always have <math>\alpha > 0</math>.

The scaling exponent <math>\alpha</math> is a generalization of the Hurst exponent, with the precise value giving information about the series self-correlations:

  • <math>\alpha<1/2</math>: anti-correlated
  • <math>\alpha \simeq 1/2</math>: uncorrelated, white noise
  • <math>\alpha>1/2</math>: correlated
  • <math>\alpha\simeq 1</math>: 1/f-noise, pink noise
  • <math>\alpha>1</math>: non-stationary, unbounded
  • <math>\alpha\simeq 3/2</math>: Brownian noise

Because the expected displacement in an uncorrelated random walk of length N grows like <math>\sqrt{N}</math>, an exponent of <math>\tfrac{1}{2}</math> would correspond to uncorrelated white noise. When the exponent is between 0 and 1, the result is fractional Gaussian noise.
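As a quick numerical sanity check of these regimes, using the dfa sketch from the Algorithm section above (exact values vary with the random realization):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
white = rng.standard_normal(10_000)   # uncorrelated white noise
walk = np.cumsum(white)               # Brownian noise (integrated white noise)

_, _, a_white = dfa(white)            # expect alpha close to 0.5
_, _, a_walk = dfa(walk)              # expect alpha close to 1.5
print(round(a_white, 2), round(a_walk, 2))
</syntaxhighlight>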

Pitfalls in interpretation

Although the DFA algorithm always produces a positive number <math>\alpha</math> for any time series, this does not by itself imply that the time series is self-similar. Self-similarity requires the log-log graph to be sufficiently linear over a wide range of <math>n</math>. Furthermore, a combination of techniques including MLE, rather than least squares, has been shown to better approximate the scaling, or power-law, exponent.[6]

Also, there are many scaling exponent-like quantities that can be measured for a self-similar time series, including the divider dimension and Hurst exponent. Therefore, the DFA scaling exponent <math>\alpha</math> is not a fractal dimension, and does not have certain desirable properties that the Hausdorff dimension has, though in certain special cases it is related to the box-counting dimension for the graph of a time series.

Generalizations

Generalization to polynomial trends (higher order DFA)

The standard DFA algorithm given above removes a linear trend in each segment. If we remove a degree-n polynomial trend in each segment, it is called DFAn, or higher order DFA.[7]

Since <math>X_t</math> is a cumulative sum of <math>x_t-\langle x\rangle </math>, a linear trend in <math>X_t</math> is a constant trend in <math>x_t-\langle x\rangle </math>, which is a constant trend in <math>x_t </math> (visible as short sections of "flat plateaus"). In this regard, DFA1 removes the mean from segments of the time series <math>x_t </math> before quantifying the fluctuation.

Similarly, a degree n trend in <math>X_t</math> is a degree (n-1) trend in <math>x_t </math>. For example, DFA2 removes linear trends from segments of the time series <math>x_t </math> before quantifying the fluctuation, DFA3 removes parabolic trends from <math>x_t </math>, and so on.
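In code, the only change DFAn requires relative to the linear case is the degree of the polynomial fitted in each segment; a hedged sketch (the function name is illustrative):

<syntaxhighlight lang="python">
import numpy as np

def local_fluctuation(segment, order=1):
    """RMS residual of one profile segment after removing a degree-`order`
    polynomial trend (order=1 is standard DFA1, order=m gives DFAm)."""
    t = np.arange(len(segment))
    coef = np.polyfit(t, segment, deg=order)
    residual = segment - np.polyval(coef, t)
    return np.sqrt(np.mean(residual ** 2))
</syntaxhighlight>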

The Hurst R/S analysis removes constant trends in the original sequence and is thus, in its detrending, equivalent to DFA1.

Generalization to different moments (multifractal DFA)

DFA can be generalized by computing<math display="block">F_q( n ) = \left(\frac{1}{N/n}\sum_{i = 1}^{N/n} F(n, i)^q\right)^{1/q},</math>then making the log-log plot of <math>\log F_q(n)</math> against <math>\log n</math>. If that plot is strongly linear, its slope is <math>\alpha(q)</math>.[8] Standard DFA is the special case <math>q=2</math>.

Multifractal systems scale as a function <math>F_q(n) \propto n^{\alpha(q)}</math>: a single exponent is not sufficient, because the scaling exponent may depend on the moment order <math>q</math>. In particular, ordinary DFA measures the scaling behaviour of the second-moment fluctuations.
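A minimal sketch of the generalized fluctuation function, applied to the per-segment values F(n, i) defined in the Algorithm section (the case q = 0, which requires a logarithmic average, is omitted):

<syntaxhighlight lang="python">
import numpy as np

def fq(segment_fluctuations, q):
    """q-th order fluctuation F_q(n) from the per-segment values F(n, i);
    q = 2 recovers ordinary DFA."""
    f = np.asarray(segment_fluctuations, dtype=float)
    return np.mean(f ** q) ** (1.0 / q)
</syntaxhighlight>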

Kantelhardt et al. intended this scaling exponent as a generalization of the classical Hurst exponent. The classical Hurst exponent corresponds to <math>H=\alpha(2)</math> for stationary cases, and <math>H=\alpha(2)-1</math> for nonstationary cases.[8][9][10]

Applications

The DFA method has been applied to many systems, e.g. DNA sequences,[11][12] neuronal oscillations,[10] speech pathology detection,[13] heartbeat fluctuation in different sleep stages,[14] and animal behavior pattern analysis.[15]

The effect of trends on DFA has been studied.[16]

Relations to other methods, for specific types of signal

For signals with power-law-decaying autocorrelation

In the case of power-law decaying auto-correlations, the correlation function decays with an exponent <math>\gamma</math>: <math>C(L)\sim L^{-\gamma}\!\ </math>. In addition the power spectrum decays as <math>P(f)\sim f^{-\beta}\!\ </math>. The three exponents are related by:[11]

  • <math>\gamma=2-2\alpha</math>
  • <math>\beta=2\alpha-1</math> and
  • <math>\gamma=1-\beta</math>.

The relations can be derived using the Wiener–Khinchin theorem. The relation of DFA to the power spectrum method has been well studied.[17]

Thus, <math>\alpha</math> is tied to the slope of the power spectrum <math>\beta</math> and is used to describe the color of noise by this relationship: <math>\alpha = (\beta+1)/2</math>.
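These relations are straightforward to apply; a small illustrative helper (the function name is not from any library):

<syntaxhighlight lang="python">
def exponents_from_alpha(alpha):
    """Autocorrelation exponent gamma and spectral exponent beta implied by a
    DFA exponent alpha, for C(L) ~ L**(-gamma) and P(f) ~ f**(-beta)."""
    gamma = 2 - 2 * alpha
    beta = 2 * alpha - 1
    return gamma, beta

# Example: alpha = 0.75 gives beta = 0.5 and gamma = 0.5 (consistent with gamma = 1 - beta).
</syntaxhighlight>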

For fractional Gaussian noise

For fractional Gaussian noise (FGN), we have <math> \beta \in [-1,1] </math>, and thus <math>\alpha \in [0,1]</math>, and <math>\beta = 2H-1</math>, where <math>H</math> is the Hurst exponent. <math>\alpha</math> for FGN is equal to <math>H</math>.[18]

For fractional Brownian motion

For fractional Brownian motion (FBM), we have <math> \beta \in [1,3] </math>, and thus <math>\alpha \in [1,2]</math>, and <math>\beta = 2H+1</math>, where <math>H</math> is the Hurst exponent. <math>\alpha</math> for FBM is equal to <math>H+1</math>.[9] In this context, FBM is the cumulative sum or the integral of FGN, thus, the exponents of their power spectra differ by 2.

See also

References

External links