Hurst exponent
The Hurst exponent is used as a measure of long-term memory of time series. It relates to the autocorrelations of the time series, and the rate at which these decrease as the lag between pairs of values increases. Studies involving the Hurst exponent were originally developed in hydrology for the practical matter of determining optimum dam sizing for the Nile river's volatile rain and drought conditions that had been observed over a long period of time.[1][2] The name "Hurst exponent", or "Hurst coefficient", derives from Harold Edwin Hurst (1880–1978), who was the lead researcher in these studies; the use of the standard notation H for the coefficient also relates to his name.
In fractal geometry, the generalized Hurst exponent has been denoted by H or Hq in honor of both Harold Edwin Hurst and Ludwig Otto Hölder (1859–1937) by Benoît Mandelbrot (1924–2010).[3] H is directly related to fractal dimension, D, and is a measure of a data series' "mild" or "wild" randomness.[4]
The Hurst exponent is referred to as the "index of dependence" or "index of long-range dependence". It quantifies the relative tendency of a time series either to regress strongly to the mean or to cluster in a direction.[5] A value H in the range 0.5–1 indicates a time series with long-term positive autocorrelation, meaning that the decay in autocorrelation is slower than exponential, following a power law; for the series it means that a high value tends to be followed by another high value and that future excursions to more high values do occur. A value in the range 0 – 0.5 indicates a time series with long-term switching between high and low values in adjacent pairs, meaning that a single high value will probably be followed by a low value and that the value after that will tend to be high, with this tendency to switch between high and low values lasting a long time into the future, also following a power law. A value of H=0.5 indicates short-memory, with (absolute) autocorrelations decaying exponentially quickly to zero.
Definition
The Hurst exponent, H, is defined in terms of the asymptotic behaviour of the rescaled range as a function of the time span of a time series as follows;[6][7]
<math display="block">\mathbb{E} \left [ \frac{R(n)}{S(n)} \right ]=C n^H \text{ as } n \to \infty \, ,</math> where
- <math>R(n)</math> is the range of the first <math>n</math> cumulative deviations from the mean
- <math>S(n)</math> is the standard deviation of the first <math>n</math> values of the series
- <math>\mathbb{E} \left [x \right ] \,</math> is the expected value
- <math>n</math> is the time span of the observation (number of data points in a time series)
- <math>C</math> is a constant.
Relation to fractal dimension
For self-similar time series, H is directly related to fractal dimension, D, where 1 < D < 2, such that D = 2 - H. The values of the Hurst exponent vary between 0 and 1, with higher values indicating a smoother trend, less volatility, and less roughness.[8]
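As a brief numerical illustration of this relation (the values of H here are hypothetical, chosen only for the example): a relatively smooth self-similar series with <math>H = 0.75</math> has fractal dimension <math display="block">D = 2 - H = 2 - 0.75 = 1.25,</math> while a rougher, antipersistent series with <math>H = 0.25</math> has <math>D = 1.75</math>.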
For more general time series or multi-dimensional process, the Hurst exponent and fractal dimension can be chosen independently, as the Hurst exponent represents structure over asymptotically longer periods, while fractal dimension represents structure over asymptotically shorter periods.[9]
Estimating the exponent
A number of estimators of long-range dependence have been proposed in the literature. The oldest and best-known is the so-called rescaled range (R/S) analysis popularized by Mandelbrot and Wallis[3][10] and based on previous hydrological findings of Hurst.[1] Alternatives include DFA, Periodogram regression,[11] aggregated variances,[12] local Whittle's estimator,[13] wavelet analysis,[14][15] both in the time domain and frequency domain.
Rescaled range (R/S) analysis
To estimate the Hurst exponent, one must first estimate the dependence of the rescaled range on the time span n of observation.[7] A time series of full length N is divided into a number of nonoverlapping shorter time series of length n, where n takes values N, N/2, N/4, ... (in the convenient case that N is a power of 2). The average rescaled range is then calculated for each value of n.
For each such time series of length <math>n</math>, <math>X=X_1,X_2,\dots, X_n \, </math>, the rescaled range is calculated as follows:[6][7]
- Calculate the mean; <math display="block">m=\frac{1}{n} \sum_{i=1}^{n} X_i \,.</math>
- Create a mean-adjusted series; <math display="block">Y_t=X_{t}-m \quad \text{ for } t=1,2, \dots ,n \,. </math>
- Calculate the cumulative deviate series <math>Z</math>; <math display="block">Z_t = \sum_{i=1}^{t} Y_{i} \quad \text{ for } t=1,2, \dots ,n \,. </math>
- Compute the range <math>R</math>; <math display="block"> R(n) =\operatorname{max}\left (Z_1, Z_2, \dots, Z_n \right )-
\operatorname{min}\left (Z_1, Z_2, \dots, Z_n \right ). </math>
- Compute the standard deviation <math>S</math>; <math display="block">S(n)= \sqrt{\frac{1}{n} \sum_{i=1}^{n}\left ( X_{i} - m \right )^{2}}. </math>
- Calculate the rescaled range <math>R(n)/S(n)</math> and average over all the partial time series of length <math>n.</math>
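Expressed in code, the procedure above might look like the following minimal Python sketch (assuming NumPy; the function names rescaled_range and hurst_rs are illustrative, not from any standard library). It averages the rescaled range over non-overlapping windows of length n = N, N/2, N/4, … and estimates H from the slope of the log–log fit discussed below.

```python
import numpy as np

def rescaled_range(x):
    """R/S statistic for a single window x (1-D array)."""
    m = x.mean()                       # step 1: mean
    y = x - m                          # step 2: mean-adjusted series
    z = np.cumsum(y)                   # step 3: cumulative deviate series
    r = z.max() - z.min()              # step 4: range R(n)
    s = x.std()                        # step 5: standard deviation S(n), 1/n normalisation
    return r / s if s > 0 else np.nan

def hurst_rs(x, min_window=8):
    """Estimate H by regressing log(mean R/S) on log(n)."""
    x = np.asarray(x, dtype=float)
    n_total = len(x)
    window_sizes, rs_values = [], []
    n = n_total
    while n >= min_window:
        # average R/S over all non-overlapping windows of length n
        rs = [rescaled_range(x[i:i + n]) for i in range(0, n_total - n + 1, n)]
        window_sizes.append(n)
        rs_values.append(np.nanmean(rs))
        n //= 2                        # n = N, N/2, N/4, ...
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_values), 1)
    return slope                       # slope of the log-log fit is the H estimate

# Example: for white noise the estimate should be near 0.5
# (up to the small-sample bias discussed below).
rng = np.random.default_rng(0)
print(hurst_rs(rng.standard_normal(4096)))
```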
The Hurst exponent is estimated by fitting the power law <math>\mathbb{E} [ R(n)/S(n)] = C n^H</math> to the data. This can be done by plotting <math>\log[R(n)/S(n)]</math> as a function of <math>\log n</math> and fitting a straight line; the slope of the line gives <math>H</math> (such a graph is called a pox plot). A more principled approach is to fit the power law in a maximum-likelihood fashion.[16] The simple graphical fit, however, is known to produce biased estimates of the power-law exponent: for small <math>n</math> there is a significant deviation from the 0.5 slope. Anis and Lloyd[17] estimated the theoretical (i.e., for white noise) values of the R/S statistic to be:
<math display="block">\mathbb{E} [ R(n)/S(n) ] = \begin{cases} \frac{\Gamma(\frac{n-1}{2})}{\sqrt{\pi} \Gamma(\frac{n}{2})}
\sum\limits_{i=1}^{n-1} \sqrt{\frac{n-i}{i}}, & \text{for }n\le 340 \\ \frac{1}{\sqrt{n\frac{\pi}{2}}} \sum\limits_{i=1}^{n-1} \sqrt{\frac{n-i}{i}}, & \text{for }n>340
\end{cases}</math>
where <math>\Gamma</math> is the Euler gamma function. The Anis–Lloyd corrected R/S Hurst exponent is calculated as 0.5 plus the slope of <math>R(n)/S(n) - \mathbb{E}[R(n)/S(n)]</math>.
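The theoretical correction above can be evaluated numerically, for instance as in the following illustrative sketch (assuming SciPy's log-gamma function for numerical stability; the function name expected_rs is hypothetical):

```python
import numpy as np
from scipy.special import gammaln

def expected_rs(n):
    """Anis-Lloyd expected value of R(n)/S(n) for white noise."""
    i = np.arange(1, n)                       # i = 1, ..., n-1
    s = np.sum(np.sqrt((n - i) / i))
    if n <= 340:
        # Gamma((n-1)/2) / (sqrt(pi) * Gamma(n/2)), via log-gamma to avoid overflow
        front = np.exp(gammaln((n - 1) / 2) - gammaln(n / 2)) / np.sqrt(np.pi)
    else:
        front = 1.0 / np.sqrt(n * np.pi / 2)
    return front * s
```

Subtracting this expectation from the empirical <math>R(n)/S(n)</math> before the log–log fit yields the corrected estimate described above.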
Confidence intervals
No asymptotic distribution theory has been derived for most of the Hurst exponent estimators so far. However, Weron[18] used bootstrapping to obtain approximate functional forms for confidence intervals of the two most popular methods, i.e., for the Anis-Lloyd[17] corrected R/S analysis:
| Level | Lower bound | Upper bound |
|---|---|---|
| 90% | 0.5 − exp(−7.35 log(log M) + 4.06) | exp(−7.07 log(log M) + 3.75) + 0.5 |
| 95% | 0.5 − exp(−7.33 log(log M) + 4.21) | exp(−7.20 log(log M) + 4.04) + 0.5 |
| 99% | 0.5 − exp(−7.19 log(log M) + 4.34) | exp(−7.51 log(log M) + 4.58) + 0.5 |
and for DFA:
| Level | Lower bound | Upper bound |
|---|---|---|
| 90% | 0.5 − exp(−2.99 log M + 4.45) | exp(−3.09 log M + 4.57) + 0.5 |
| 95% | 0.5 − exp(−2.93 log M + 4.45) | exp(−3.10 log M + 4.77) + 0.5 |
| 99% | 0.5 − exp(−2.67 log M + 4.06) | exp(−3.19 log M + 5.28) + 0.5 |
Here <math>M = \log_2 N</math> and <math>N</math> is the series length. In both cases only subseries of length <math>n > 50</math> were considered for estimating the Hurst exponent; subseries of smaller length lead to a high variance of the R/S estimates.
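As an illustration, the 95% bounds for the corrected R/S estimate can be evaluated for a given series length as in the sketch below (coefficients taken directly from the R/S table above; the function name is hypothetical):

```python
import numpy as np

def rs_confidence_interval_95(N):
    """Weron's empirical 95% bounds for the Anis-Lloyd corrected R/S estimate."""
    M = np.log2(N)                                        # M = log2(series length)
    lower = 0.5 - np.exp(-7.33 * np.log(np.log(M)) + 4.21)
    upper = np.exp(-7.20 * np.log(np.log(M)) + 4.04) + 0.5
    return lower, upper

print(rs_confidence_interval_95(2 ** 16))                 # e.g. N = 65536
```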
Generalized exponent
The basic Hurst exponent can be related to the expected size of changes, as a function of the lag between observations, as measured by <math>\mathbb{E}\left[\left|X_{t+\tau}-X_{t}\right|^{2}\right]</math>. For the generalized form of the coefficient, the exponent here is replaced by a more general term, denoted by q.
A variety of techniques exist for estimating H; however, assessing the accuracy of the estimation can be a complicated issue. Mathematically, in one technique, the generalized Hurst exponent[19][20] <math display="block">H_q = H(q),</math> for a time series <math>g(t)</math> (<math>t = 1, 2, \dots</math>), may be defined by the scaling properties of its structure functions <math>S_q(\tau)</math>: <math display="block">S_q(\tau) = \left\langle \left|g(t + \tau) - g(t)\right|^q \right\rangle_t \sim \tau^{qH(q)}, </math> where <math>q > 0</math>, <math>\tau</math> is the time lag, and averaging is over the time window <math>t \gg \tau</math>, usually the largest time scale of the system.
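The scaling relation above suggests a direct way to estimate <math>H(q)</math> numerically: compute the structure functions over a range of lags and regress <math>\log S_q(\tau)</math> on <math>\log \tau</math>. A minimal Python sketch follows (assuming NumPy; the function name and the lag range are illustrative choices, not prescribed by the method):

```python
import numpy as np

def generalized_hurst(g, q=2, lags=range(2, 20)):
    """Estimate H(q) from the scaling S_q(tau) ~ tau^(q*H(q))."""
    g = np.asarray(g, dtype=float)
    log_tau, log_Sq = [], []
    for tau in lags:
        increments = np.abs(g[tau:] - g[:-tau])   # |g(t+tau) - g(t)|
        Sq = np.mean(increments ** q)              # time average over t
        log_tau.append(np.log(tau))
        log_Sq.append(np.log(Sq))
    slope, _ = np.polyfit(log_tau, log_Sq, 1)      # slope of log-log fit = q * H(q)
    return slope / q

# Example: a Brownian path (cumulative sum of white noise) should give H(2) near 0.5.
rng = np.random.default_rng(1)
print(generalized_hurst(np.cumsum(rng.standard_normal(10000)), q=2))
```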
Practically, in nature, there is no limit to time, and thus H is non-deterministic as it may only be estimated based on the observed data; e.g., the most dramatic daily move upwards ever seen in a stock market index can always be exceeded during some subsequent day.[21]
In the above mathematical estimation technique, the function <math>H(q)</math> contains information about averaged generalized volatilities at scale <math>\tau</math> (only <math>q = 1, 2</math> are used to define the volatility). In particular, the <math>H_1</math> exponent indicates persistent (<math>H_1 > \tfrac{1}{2}</math>) or antipersistent (<math>H_1 < \tfrac{1}{2}</math>) behavior of the trend.
For the BRW (brown noise, <math>1/f^2</math>) one gets <math display="block">H_q = \frac{1}{2},</math> and for pink noise (<math>1/f</math>) <math display="block">H_q = 0.</math>
The Hurst exponent for white noise is dimension dependent,[22] and for 1D and 2D it is <math display="block">H^{1D}_q = \frac{1}{2} , \quad H^{2D}_q = -1.</math>
For the popular Lévy stable processes and truncated Lévy processes with parameter α it has been found that
<math display="block">H_q = q/\alpha,</math> for <math>q < \alpha</math>, and <math>H_q = 1</math> for <math>q \geq \alpha</math>. Multifractal detrended fluctuation analysis[23] is one method to estimate <math>H(q)</math> from non-stationary time series. When <math>H(q)</math> is a non-linear function of q the time series is a multifractal system.
Note
In the above definition two separate requirements are mixed together as if they were one.[24] Here are the two independent requirements: (i) stationarity of the increments, <math>x(t+T) - x(t) = x(T) - x(0)</math> in distribution. This is the condition that yields longtime autocorrelations. (ii) Self-similarity of the stochastic process then yields variance scaling, but is not needed for longtime memory. E.g., both Markov processes (i.e., memory-free processes) and fractional Brownian motion scale at the level of 1-point densities (simple averages), but neither scales at the level of pair correlations or, correspondingly, the 2-point probability density.
An efficient market requires a martingale condition, and unless the variance is linear in time this produces nonstationary increments, <math>x(t+T) - x(t) \neq x(T) - x(0)</math>. Martingales are Markovian at the level of pair correlations, meaning that pair correlations cannot be used to beat a martingale market. Stationary increments with nonlinear variance, on the other hand, induce the longtime pair memory of fractional Brownian motion that would make the market beatable at the level of pair correlations. Such a market would necessarily be far from "efficient".
An analysis of economic time series by means of the Hurst exponent, using rescaled range and detrended fluctuation analysis, was conducted by the econophysicist A. F. Bariviera.[25] This paper studies the time-varying character of long-range dependence and, thus, of informational efficiency.
The Hurst exponent has also been applied to the investigation of long-range dependence in DNA[26] and in photonic band gap materials.[27]
See also
Implementations
- Matlab code for computing R/S, DFA, periodogram regression and wavelet estimates of the Hurst exponent and their corresponding confidence intervals is available from RePEc: https://ideas.repec.org/s/wuu/hscode.html
- Implementation of R/S in Python: https://github.com/Mottl/hurst and of DFA and MFDFA in Python: https://github.com/LRydin/MFDFA
- Matlab code for computing real Hurst and complex Hurst: https://www.mathworks.com/matlabcentral/fileexchange/49803-calculate-complex-hurst
- An Excel sheet for calculating the Hurst exponent is also available: https://www.researchgate.net/publication/272792633_Excel_Hurst_Calculator
References
- ↑ 1,0 1,1 Шаблон:Cite journal
- ↑ Шаблон:Cite book
- ↑ 3,0 3,1 Шаблон:Cite journal
- ↑ Шаблон:Cite journal
- ↑ Torsten Kleinow (2002). Testing Continuous Time Models in Financial Markets, Doctoral thesis, Berlin.
- ↑ 6,0 6,1 Шаблон:Cite conference
- ↑ 7,0 7,1 7,2 Шаблон:Cite book
- ↑ Шаблон:Cite journal
- ↑ Шаблон:Cite journal
- ↑ Шаблон:Cite journal
- ↑ Шаблон:Cite journal
- ↑ J. Beran. Statistics For Long-Memory Processes. Chapman and Hall, 1994.
- ↑ Шаблон:Cite journal
- ↑ Шаблон:Cite journal
- ↑ R. H. Riedi. Multifractal processes. In P. Doukhan, G. Oppenheim, and M. S. Taqqu, editors, Theory and Applications of Long-Range Dependence, pages 625–716. Birkhäuser, 2003.
- ↑ Шаблон:Cite journal
- ↑ 17,0 17,1 Шаблон:Cite journal
- ↑ Шаблон:Cite journal
- ↑ Шаблон:Cite journal
- ↑ Шаблон:Cite journal
- ↑ Mandelbrot, Benoît B., The (Mis)Behavior of Markets, A Fractal View of Risk, Ruin and Reward (Basic Books, 2004), pp. 186-195
- ↑ Шаблон:Cite journal
- ↑ Шаблон:Cite journal
- ↑ Joseph L McCauley, Kevin E Bassler, and Gemunu H. Gunaratne (2008) "Martingales, Detrending Data, and the Efficient Market Hypothesis", Physica, A37, 202, Open access preprint: arXiv:0710.2583
- ↑ Шаблон:Cite journal
- ↑ Шаблон:Cite journal
- ↑ Шаблон:Cite journal