KL.divergence

Kullback-Leibler Divergence


Description

Compute Kullback-Leibler divergence.

Usage

KL.divergence(X, Y, k = 10, algorithm = c("kd_tree", "cover_tree", "brute"))
KLx.divergence(X, Y, k = 10, algorithm = "kd_tree")

Arguments

X

An input data matrix.

Y

An input data matrix.

k

The maximum number of nearest neighbors to search. The default is 10.

algorithm

Nearest neighbor search algorithm: one of "kd_tree", "cover_tree", or "brute".
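
For instance, the search backend can be selected explicitly. A minimal sketch (the data here are made up purely for illustration):

library(FNN)

set.seed(1)
X <- matrix(rnorm(2000), ncol = 2)            # 1000 points in 2-D
Y <- matrix(rnorm(2000, mean = 1), ncol = 2)  # a shifted sample
KL.divergence(X, Y, k = 10, algorithm = "cover_tree")  # same call, different backend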

Details

If p(x) and q(x) are two continuous probability density functions, then the Kullback-Leibler divergence of q from p is defined as E_p[log(p(x)/q(x))].
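
When both densities are known, this expectation can be checked by plain Monte Carlo sampling. A minimal sketch for two exponential densities (the names p and q are ad hoc, chosen to match the Examples below):

set.seed(1)
p <- function(x) dexp(x, rate = 0.2)   # density sampled from
q <- function(x) dexp(x, rate = 0.4)
x <- rexp(1e5, rate = 0.2)             # draws from p
mean(log(p(x) / q(x)))                 # about 1 - log(2) = 0.307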

The KL.* versions return divergences from C code to R, but the KLx.* versions do not.

Value

Returns the Kullback-Leibler divergence from X to Y.

Author(s)

Shengqiao Li. To report any bugs or suggestions, please email lishengqiao@yahoo.com.

References

S. Boltz, E. Debreuve and M. Barlaud (2007). “kNN-based high-dimensional Kullback-Leibler distance for tracking”. In Proceedings of the Eighth International Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS '07).

S. Boltz, E. Debreuve and M. Barlaud (2009). “High-dimensional statistical measure for region-of-interest tracking”. IEEE Transactions on Image Processing, 18(6), 1266–1283.

See Also

KL.dist for the symmetrized Kullback-Leibler distance.
Examples

set.seed(1000)
X <- rexp(10000, rate = 0.2)
Y <- rexp(10000, rate = 0.4)

KL.divergence(X, Y, k = 5)
# theoretical divergence = log(0.2/0.4) + (0.4/0.2) - 1 = 1 - log(2) = 0.307
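
For intuition, the k-NN divergence estimator described by Boltz et al. can also be sketched directly from FNN's distance helpers knn.dist and knnx.dist. This is an illustrative re-implementation (the helper kl_knn is hypothetical, not part of the package), not the package's C routine:

library(FNN)

kl_knn <- function(X, Y, k = 5) {
  # k-NN estimate of KL(p||q) from samples X ~ p and Y ~ q
  X <- as.matrix(X); Y <- as.matrix(Y)
  n <- nrow(X); m <- nrow(Y); d <- ncol(X)
  rho <- knn.dist(X, k = k)[, k]      # distance to the k-th neighbor within X
  nu  <- knnx.dist(Y, X, k = k)[, k]  # distance from each X row to its k-th neighbor in Y
  d * mean(log(nu / rho)) + log(m / (n - 1))
}

kl_knn(X, Y, k = 5)   # again close to 1 - log(2), using X and Y from above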

FNN: Fast Nearest Neighbor Search Algorithms and Applications

Version: 1.1.3
License: GPL (>= 2)
Authors: Alina Beygelzimer, Sham Kakadet and John Langford (cover tree library); Sunil Arya and David Mount (ANN library 1.1.2 for the kd-tree approach); Shengqiao Li
Initial release: 2019-02-15
