
entropy

Estimation of Entropy, Mutual Information and Related Quantities

Implements various estimators of entropy for discrete random variables, including the shrinkage estimator by Hausser and Strimmer (2009), the maximum-likelihood and Miller-Madow estimators, various Bayesian estimators, and the Chao-Shen estimator. It also offers an R interface to the NSB estimator. Furthermore, the package provides functions for estimating the Kullback-Leibler divergence, the chi-squared divergence, mutual information, and the chi-squared divergence of independence. It also computes the G statistic and the chi-squared statistic with their corresponding p-values, and includes functions for discretizing continuous random variables.
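As a brief sketch of how these estimators are typically called, the following assumes the package's `entropy()` function, which takes a vector of observed counts and a `method` argument selecting the estimator (the specific counts here are made up for illustration):

```r
library(entropy)

# Observed counts for a discrete variable with four categories
y <- c(4, 2, 3, 1)

# Maximum-likelihood (plug-in) estimate of Shannon entropy, in nats
H.ml <- entropy(y, method = "ML")

# Shrinkage estimate (Hausser and Strimmer 2009),
# usually preferable for small samples
H.shrink <- entropy(y, method = "shrink")
```

Both calls return the entropy in nats; other estimators mentioned above (e.g. Miller-Madow, Chao-Shen, NSB) are selected the same way via `method`.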

Version: 1.3.0
License: GPL (>= 3)
Authors: Jean Hausser and Korbinian Strimmer
Initial release: 2021-04-25
