
MI

Shannon's Mutual Information I(X,Y)


Description

Compute Shannon's Mutual Information based on the identity I(X,Y) = H(X) + H(Y) - H(X,Y), given a joint-probability vector P(X,Y) and the marginal probability vectors P(X) and P(Y).

Usage

MI(x, y, xy, unit = "log2")

Arguments

x

a numeric probability vector P(X).

y

a numeric probability vector P(Y).

xy

a numeric joint-probability vector P(X,Y).

unit

a character string specifying the logarithm unit used for the underlying entropy computations. The default "log2" returns values in bits.
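Other philentropy functions also accept "log" (nats) and "log10" as units, and the same options are assumed to apply here. A minimal sketch of switching units (the options other than the "log2" default are an assumption):

P  <- 1:10 / sum(1:10)
Q  <- 20:29 / sum(20:29)
PQ <- 1:10 / sum(1:10)

MI(P, Q, PQ, unit = "log2")  # result in bits
MI(P, Q, PQ, unit = "log")   # assumed option: result in nats, i.e. the bit value times log(2)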

Details

This function is useful for quickly computing Shannon's Mutual Information from any given joint-probability vector and its corresponding marginal probability vectors.
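As a cross-check, the identity can be computed directly in base R. A minimal sketch, where shannon_entropy() is a hypothetical helper written for this illustration and not part of philentropy:

# Hypothetical helper: Shannon entropy in bits (not part of philentropy)
shannon_entropy <- function(p) -sum(p[p > 0] * log2(p[p > 0]))

x  <- 1:10 / sum(1:10)
y  <- 20:29 / sum(20:29)
xy <- 1:10 / sum(1:10)

# I(X,Y) = H(X) + H(Y) - H(X,Y)
shannon_entropy(x) + shannon_entropy(y) - shannon_entropy(xy)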

Value

Shannon's Mutual Information in bit (for the default unit = "log2").

Author(s)

Hajk-Georg Drost

References

Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379-423.

See Also

H, JE, CE

Examples

# Marginal distributions P(X) and P(Y), and a joint-probability vector P(X,Y)
MI(x = 1:10 / sum(1:10), y = 20:29 / sum(20:29), xy = 1:10 / sum(1:10))
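Since H() (Shannon entropy) and JE() (joint entropy) are exported by philentropy (see 'See Also'), the result can also be reproduced via the defining identity. A sketch assuming both take a probability vector and a unit argument:

library(philentropy)

x  <- 1:10 / sum(1:10)
y  <- 20:29 / sum(20:29)
xy <- 1:10 / sum(1:10)

# should match MI(x = x, y = y, xy = xy, unit = "log2")
H(x, unit = "log2") + H(y, unit = "log2") - JE(xy, unit = "log2")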

philentropy: Similarity and Distance Quantification Between Probability Functions
Version 0.4.0, licensed under GPL-2
Author: Hajk-Georg Drost [aut, cre] (https://orcid.org/0000-0002-1567-306X)
Initial release: 2020-01-09
