Spectral entropy of a time series
Computes the spectral entropy of a univariate time series from its normalized spectral density, estimated using an AR model.
entropy(x)
x: a univariate time series
The spectral entropy equals the Shannon entropy of the spectral density f_x(\lambda) of a stationary process x_t:
H_s(x_t) = -\int_{-\pi}^{\pi} f_x(\lambda) \log f_x(\lambda) \, d\lambda,
where the density is normalized such that
\int_{-\pi}^{\pi} f_x(\lambda) \, d\lambda = 1.
An estimate of f_x(\lambda) can be obtained using spec.ar with the "burg" method.
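As a rough illustration, this estimate can be approximated directly in R by normalizing the AR spectrum returned by spec.ar and taking the Shannon entropy of the resulting discrete density. The sketch below follows that recipe under these assumptions; the helper spectral_entropy_sketch is hypothetical and is not the packaged implementation.

## Illustrative approximation of H_s(x_t); not the packaged implementation.
spectral_entropy_sketch <- function(x) {
  sp <- stats::spec.ar(x, plot = FALSE, method = "burg")  # AR spectral estimate
  fx <- sp$spec / sum(sp$spec)   # normalize so the discrete density sums to one
  -sum(fx * log(fx))             # Shannon entropy of the normalized density
}
spectral_entropy_sketch(lynx)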
A non-negative real value for the spectral entropy H_s(x_t).
Rob J Hyndman
Jerry D. Gibson and Jaewoo Jung (2006). “The Interpretation of Spectral Entropy Based Upon Rate Distortion Functions”. IEEE International Symposium on Information Theory, pp. 277-281.
Goerg, G. M. (2013). “Forecastable Component Analysis”. Journal of Machine Learning Research (JMLR) W&CP 28 (2): 64-72. Available at http://jmlr.org/proceedings/papers/v28/goerg13.html.
entropy(rnorm(1000))
entropy(lynx)
entropy(sin(1:20))