
entropy.MillerMadow

Miller-Madow Entropy Estimator


Description

entropy.MillerMadow estimates the Shannon entropy H of the random variable Y from the corresponding observed counts y, applying the Miller-Madow bias correction to the empirical entropy estimate.

Usage

entropy.MillerMadow(y, unit=c("log", "log2", "log10"))

Arguments

y

vector of counts.

unit

the unit in which entropy is measured. The default unit="log" returns the entropy in nats (natural units); set unit="log2" for bits or unit="log10" for base-10 units.

Details

The Miller-Madow entropy estimator (Miller 1955) is the bias-corrected empirical entropy estimate: the empirical (maximum-likelihood) entropy is increased by (m-1)/(2n), where m is the number of non-empty bins and n is the total number of counts.

Note that the Miller-Madow estimator is not a plug-in estimator, hence there are no explicit underlying bin frequencies.
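As a rough illustration, the correction can be written down in a few lines of R. The sketch below is not the package implementation, and the function name miller.madow.sketch is made up for this example; it simply adds (m-1)/(2n) to the empirical entropy and, under these assumptions, should agree with entropy.MillerMadow(y).

miller.madow.sketch <- function(y, unit=c("log", "log2", "log10"))
{
  unit <- match.arg(unit)
  n <- sum(y)                          # total number of counts
  m <- sum(y > 0)                      # number of non-empty bins
  freqs <- y[y > 0]/n                  # observed bin frequencies
  H <- -sum(freqs*log(freqs))          # empirical entropy in nats
  H <- H + (m - 1)/(2*n)               # Miller-Madow bias correction
  if (unit == "log2")  H <- H/log(2)   # nats -> bits
  if (unit == "log10") H <- H/log(10)  # nats -> base-10 units
  return(H)
}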

Value

entropy.MillerMadow returns an estimate of the Shannon entropy.

Author(s)

Korbinian Strimmer (http://www.strimmerlab.org).

References

Miller, G. A. 1955. Note on the bias of information estimates. Info. Theory Psychol. Prob. Methods II-B: 95-100.

See Also

entropy.empirical.
Examples

# load entropy library 
library("entropy")

# observed counts for each bin
y = c(4, 2, 3, 0, 2, 4, 0, 0, 2, 1, 1)  

# estimate entropy using Miller-Madow method
entropy.MillerMadow(y)

# compare to empirical estimate
entropy.empirical(y)
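
# report the estimate in bits rather than nats via the documented unit argument
entropy.MillerMadow(y, unit="log2")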

entropy: Estimation of Entropy, Mutual Information and Related Quantities

Version: 1.3.0
License: GPL (>= 3)
Authors: Jean Hausser and Korbinian Strimmer
Initial release: 2021-04-25
