
crossentropy

Cross Entropy


Description

k-nearest-neighbor (KNN) cross-entropy estimators.

Usage

crossentropy(X, Y, k=10, algorithm=c("kd_tree", "cover_tree", "brute"))

Arguments

X

an input data matrix whose rows are treated as samples drawn from the density p(x).

Y

an input data matrix whose rows are treated as samples drawn from the density q(x).

k

the maximum number of nearest neighbors to search; estimates are returned for every neighborhood size from 1 to k. The default is 10.

algorithm

the nearest neighbor search algorithm, one of "kd_tree", "cover_tree" or "brute".

Details

If p(x) and q(x) are two continuous probability density functions, the cross-entropy of p and q is defined as H(p; q) = E_p[-log q(x)].
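The definition above can be checked directly by Monte Carlo in a case where q is known in closed form. The sketch below (base R only; the choice of Gaussian densities and sample size is illustrative, not part of this package's estimator) compares a sample average of -log q(x) under p against the exact cross-entropy of two unit-variance normals:

```r
# Monte Carlo check of H(p; q) = E_p[-log q(x)] in one dimension,
# with p = N(0, 1) and q = N(1, 1) (illustrative choices).
set.seed(2)
x <- rnorm(1e5)                               # draws from p
mc <- mean(-dnorm(x, mean = 1, log = TRUE))   # Monte Carlo estimate of H(p; q)

# Closed form for these Gaussians:
# H(p; q) = 0.5*log(2*pi) + 0.5*(Var_p[x] + (mu_p - mu_q)^2)
#         = 0.5*log(2*pi) + 0.5*(1 + 1)
exact <- 0.5 * log(2 * pi) + 1
abs(mc - exact)   # small; the estimate concentrates around the exact value
```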

Value

a vector of length k giving cross-entropy estimates based on 1 to k nearest neighbors, respectively.
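A minimal usage sketch, assuming the FNN package is installed. The Gaussian samples and dimensions here are illustrative; X plays the role of a sample from p and Y a sample from q:

```r
# Estimate H(p; q) from two samples using the KNN estimator in FNN.
library(FNN)

set.seed(1)
X <- matrix(rnorm(1000, mean = 0), ncol = 2)  # sample from p (standard normal)
Y <- matrix(rnorm(1000, mean = 1), ncol = 2)  # sample from q (shifted normal)

H <- crossentropy(X, Y, k = 10, algorithm = "kd_tree")
length(H)  # 10: one estimate per neighborhood size 1..k
```

The returned vector lets you inspect how the estimate varies with the neighborhood size before settling on a particular k.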

Author(s)

Shengqiao Li. To report any bugs or suggestions please email: lishengqiao@yahoo.com.

References

S. Boltz, E. Debreuve and M. Barlaud (2007). "kNN-based high-dimensional Kullback-Leibler distance for tracking". In Proceedings of the Eighth International Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS '07).


FNN: Fast Nearest Neighbor Search Algorithms and Applications

Version: 1.1.3
License: GPL (>= 2)
Authors: Alina Beygelzimer, Sham Kakadet and John Langford (cover tree library), Sunil Arya and David Mount (ANN library 1.1.2 for the kd-tree approach), Shengqiao Li
Initial release: 2019-02-15
