Kullback-Leibler divergence (relative entropy)
Calculates the Kullback-Leibler divergence (relative entropy) between unweighted theoretical component distributions. The divergence is calculated as int f(x) (log f(x) - log g(x)) dx for distributions with densities f() and g().
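As a point of reference only, the integral can be approximated numerically on a regular grid of evaluation points. The short sketch below illustrates the quantity being estimated; it is not the package implementation, and the grid, densities, and spacing are chosen arbitrarily.
x <- seq(-6, 6, length = 1000)        # evaluation grid (arbitrary)
f <- dnorm(x)                         # density f()
g <- dt(x, df = 10)                   # density g()
dx <- diff(x)[1]                      # grid spacing
sum(f * (log(f) - log(g))) * dx       # Riemann-sum approximation of the integral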
kl.divergence(object, eps = 10^-4, overlap = TRUE)
object | Matrix or data.frame object with >= 2 columns.
eps | Probabilities below this threshold are replaced by this threshold for numerical stability.
overlap | Logical; if TRUE, the KL divergence is not computed for pairs in which, at every point, at least one of the densities is smaller than eps.
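Read together, eps and overlap suggest that densities below eps are raised to eps before the logs are taken, and that pairs whose densities never jointly exceed eps are skipped. The sketch below only illustrates this assumed behavior for a single pair of discretized densities; it is not the package's internal code.
eps <- 1e-4
x <- seq(-6, 6, length = 1000)
p <- dnorm(x); q <- dt(x, df = 10)
if (all(p < eps | q < eps)) {
  NA                                        # overlap = TRUE: pair is skipped
} else {
  p <- pmax(p, eps); q <- pmax(q, eps)      # values below eps replaced by eps
  sum(p * (log(p) - log(q))) * diff(x)[1]   # discretized divergence
}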
A matrix of pairwise Kullback-Leibler divergence indices.
Jeffrey S. Evans <jeffrey_evans@tnc.org>
Kullback, S., and R. A. Leibler (1951) On information and sufficiency. The Annals of Mathematical Statistics 22(1):79-86.
x <- seq(-3, 3, length=200)
y <- cbind(n=dnorm(x), t=dt(x, df=10))
matplot(x, y, type='l')
kl.divergence(y)

# extract value for last column
kl.divergence(y[,1:2])[3:3]