
mutual_information

Mutual Information


Description

KNN Mutual Information Estimators.

Usage

mutinfo(X, Y, k=10, direct=TRUE)

Arguments

X

an input data matrix.

Y

an input data matrix.

k

the maximum number of nearest neighbors to search. The default value is set to 10.

direct

logical. If TRUE (the default), compute the mutual information directly; if FALSE, compute it via entropy estimates.

Details

The direct computation is based on the first estimator of A. Kraskov, H. Stögbauer and P. Grassberger (2004), and the indirect computation is done via entropy estimates, i.e., I(X, Y) = H(X) + H(Y) - H(X, Y). The direct method has smaller bias and variance, but the indirect method is faster; see Evans (2008).

Value

For the direct method, a single mutual information estimate. For the indirect method, a vector of length k giving mutual information estimates computed using 1 to k nearest neighbors, respectively.
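A minimal sketch of both estimators, assuming the FNN package is installed. The inputs here are correlated Gaussian samples (so the true mutual information is positive); the sample size, seed, and noise level are illustrative choices, not part of the package documentation.

```r
library(FNN)

set.seed(1)
n <- 1000
x <- rnorm(n)
y <- x + rnorm(n, sd = 0.5)  # Y depends on X, so I(X, Y) > 0

# Direct estimator (Kraskov et al. 2004): returns a single estimate.
mi_direct <- mutinfo(x, y, k = 10, direct = TRUE)

# Indirect estimator via entropies: returns one estimate per 1:k neighbors.
mi_indirect <- mutinfo(x, y, k = 10, direct = FALSE)

length(mi_direct)    # 1
length(mi_indirect)  # 10
```

The indirect vector lets you inspect how the estimate varies with the number of neighbors before settling on a value of k.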

Author(s)

Shengqiao Li. To report any bugs or suggestions please email: lishengqiao@yahoo.com.

References

A. Kraskov, H. Stögbauer and P. Grassberger (2004). “Estimating mutual information”. Physical Review E, 69:066138, 1–16.

D. Evans (2008). “A computationally efficient estimator for mutual information”. Proc. R. Soc. A, 464, 1203–1215.


FNN

Fast Nearest Neighbor Search Algorithms and Applications

v1.1.3
GPL (>= 2)
Authors
Alina Beygelzimer, Sham Kakadet and John Langford (cover tree library), Sunil Arya and David Mount (ANN library 1.1.2 for the kd-tree approach), Shengqiao Li
Initial release
2019-02-15
