Shannon's Joint-Entropy H(X,Y)
This function computes Shannon's Joint-Entropy H(X,Y) = - ∑ ∑ P(X,Y) * log2(P(X,Y)) from a given joint-probability vector P(X,Y).
JE(x, unit = "log2")
x |
a numeric joint-probability vector P(X,Y) for which Shannon's Joint-Entropy H(X,Y) shall be computed. |
unit |
a character string specifying the logarithm unit that shall be used to compute the entropy. |
a numeric value representing Shannon's Joint-Entropy; with the default unit = "log2" the result is given in bit.
Hajk-Georg Drost
Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379-423.
JE(1:100/sum(1:100))
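The computation behind the example above can be sketched in base R as follows. This is a minimal illustration of the formula, not the package implementation; the helper name `joint_entropy` and the treatment of zero probabilities are assumptions.

```r
# Minimal sketch of Shannon's joint entropy for a flattened
# joint-probability vector p (hypothetical helper, not the package code)
joint_entropy <- function(p, unit = "log2") {
  log_fun <- switch(unit, log2 = log2, log10 = log10, log = log)
  p <- p[p > 0]  # by convention, 0 * log(0) contributes 0
  -sum(p * log_fun(p))
}

# same input as the example: P(X,Y) proportional to 1, 2, ..., 100
p <- 1:100 / sum(1:100)
joint_entropy(p)  # joint entropy in bit
```

Since the distribution is non-uniform over 100 outcomes, the result lies strictly between 0 and log2(100) ≈ 6.64 bit.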