Shannon's Mutual Information I(X,Y)
Compute Shannon's Mutual Information based on the identity I(X,Y) = H(X) + H(Y) - H(X,Y), given a joint-probability vector P(X,Y) and marginal probability vectors P(X) and P(Y).
MI(x, y, xy, unit = "log2")
x     a numeric probability vector P(X).
y     a numeric probability vector P(Y).
xy    a numeric joint-probability vector P(X,Y).
unit  a character string specifying the logarithm unit used in the underlying entropy computations: "log" (natural logarithm), "log2" (default), or "log10".
This function can be used to quickly compute Shannon's Mutual Information for any given joint-probability vector and marginal probability vectors.
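To make the identity concrete, here is a minimal from-scratch sketch in base R. The helper names shannon_entropy() and shannon_mi() are hypothetical, not part of this package's API, and the inputs are assumed to be valid probability vectors summing to 1.

# Sketch of the identity I(X,Y) = H(X) + H(Y) - H(X,Y).
# shannon_entropy() and shannon_mi() are hypothetical helpers,
# not the package API.
shannon_entropy <- function(p, unit = "log2") {
  log_fun <- switch(unit, log = log, log2 = log2, log10 = log10)
  p <- p[p > 0]  # drop zero entries: 0 * log(0) is taken to be 0
  -sum(p * log_fun(p))
}

shannon_mi <- function(x, y, xy, unit = "log2") {
  shannon_entropy(x, unit) + shannon_entropy(y, unit) - shannon_entropy(xy, unit)
}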
Shannon's Mutual Information, in the unit given by unit (bits for the default "log2").
Hajk-Georg Drost
Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379-423.
MI(x = 1:10/sum(1:10), y = 20:29/sum(20:29), xy = 1:10/sum(1:10))
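A further hedged sketch of usage: when X and Y are independent, the joint distribution P(X,Y) is the outer product of the marginals, and I(X,Y) should then be numerically zero. This assumes MI() accepts a joint-probability vector of length length(x) * length(y).

# Under independence P(X,Y) = P(X)P(Y), so I(X,Y) should be ~0.
px  <- 1:10 / sum(1:10)
py  <- 20:29 / sum(20:29)
pxy <- as.vector(outer(px, py))  # joint under independence; sums to 1
MI(x = px, y = py, xy = pxy)     # expected to be close to 0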