Plug-In Estimator of Mutual Information and of the Chi-Squared Statistic of Independence
mi.plugin computes the mutual information of two discrete random variables from the specified joint probability mass function. chi2indep.plugin computes the chi-squared divergence of independence.
mi.plugin(freqs2d, unit=c("log", "log2", "log10"))
chi2indep.plugin(freqs2d, unit=c("log", "log2", "log10"))
freqs2d: matrix of joint bin frequencies (joint probability mass function).

unit: the unit in which entropy is measured. The default is "nats" (natural units). For computing entropy in "bits" set unit="log2".
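As a quick illustration of the unit argument, here is a small sketch using a toy 2x2 joint pmf (the matrix p is an assumption, not from the package documentation); dividing the default nats value by log(2) should reproduce the unit="log2" result.

library("entropy")
p = rbind( c(0.4, 0.1), c(0.1, 0.4) )  # toy joint pmf (sums to 1)
mi.plugin(p)                # default unit="log", i.e. nats
mi.plugin(p, unit="log2")   # same quantity in bits
mi.plugin(p)/log(2)         # manual nats-to-bits conversion, agrees with the above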
The mutual information of two random variables X and Y is the Kullback-Leibler divergence between the joint density/probability mass function and the product density of the two marginals (the independence model).
It can also be defined in terms of entropies as MI = H(X) + H(Y) - H(X, Y).
Similarly, the chi-squared divergence of independence is the chi-squared divergence between the joint density and the product density. It is a second-order approximation of twice the mutual information.
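To make these two definitions concrete, the following minimal base-R sketch (the joint pmf p is an assumed toy example) computes the MI as the Kullback-Leibler divergence KL(p || q) and the chi-squared divergence of independence directly, where q is the product of the marginals; half the chi-squared value approximates the MI.

p = rbind( c(0.2, 0.1, 0.15), c(0.1, 0.2, 0.25) )  # toy joint pmf
q = outer(rowSums(p), colSums(p))                  # product of the marginals
sum( p * log(p/q) )        # MI = KL(p || q)
sum( (p-q)^2 / q )         # chi-squared divergence of independence
0.5 * sum( (p-q)^2 / q )   # second-order approximation of the MI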
mi.plugin returns the mutual information. chi2indep.plugin returns the chi-squared divergence of independence.
Korbinian Strimmer (http://www.strimmerlab.org).
# load entropy library
library("entropy")

# joint distribution of two discrete variables
freqs2d = rbind( c(0.2, 0.1, 0.15), c(0.1, 0.2, 0.25) )

# corresponding mutual information
mi.plugin(freqs2d)

# MI computed via entropy
H1 = entropy.plugin(rowSums(freqs2d))
H2 = entropy.plugin(colSums(freqs2d))
H12 = entropy.plugin(freqs2d)
H1+H2-H12

# and corresponding (half) chi-squared divergence of independence
0.5*chi2indep.plugin(freqs2d)
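As a short follow-up check (the marginals below are assumptions chosen for illustration), a joint pmf that is exactly the product of its marginals should give zero for both quantities, since the joint and product densities then coincide.

p.indep = outer( c(0.45, 0.55), c(0.3, 0.3, 0.4) )  # exactly independent joint pmf
mi.plugin(p.indep)          # 0 (up to floating-point error)
chi2indep.plugin(p.indep)   # 0 as well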