
JE

Shannon's Joint-Entropy H(X,Y)


Description

This function computes Shannon's Joint-Entropy H(X,Y) = - ∑ ∑ P(X,Y) * log2(P(X,Y)) based on a given joint-probability vector P(X,Y).
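
The formula above treats the joint distribution as a single probability vector and applies the usual Shannon sum over its entries. A minimal sketch of that computation in plain R (the function name joint_entropy and the input checks are illustrative, not part of the package):

```r
# Illustrative re-implementation of H(X,Y) = -sum(P * log2(P))
# for a joint-probability vector P(X,Y) supplied as a numeric vector.
joint_entropy <- function(p) {
  stopifnot(is.numeric(p), all(p >= 0), abs(sum(p) - 1) < 1e-8)
  p <- p[p > 0]            # by convention, 0 * log2(0) contributes 0
  -sum(p * log2(p))
}

# Uniform joint distribution over 4 (x, y) cells:
joint_entropy(rep(0.25, 4))  # -> 2 bits
```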

Usage

JE(x, unit = "log2")

Arguments

x

a numeric joint-probability vector P(X,Y) for which Shannon's Joint-Entropy H(X,Y) shall be computed.

unit

a character string specifying the logarithm unit that shall be used for the entropy computation.
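
Changing the logarithm unit only rescales the result by a constant factor; for instance, entropy in nats equals entropy in bits times log(2). A quick check of that relationship in plain R, assuming the unit values mirror the base-R log-function names ("log2", "log"), which is an assumption here:

```r
p <- rep(0.25, 4)                # uniform joint distribution
bits <- -sum(p * log2(p))        # base-2 logarithm: 2 bits
nats <- -sum(p * log(p))         # natural logarithm: same entropy in nats
isTRUE(all.equal(nats, bits * log(2)))  # -> TRUE
```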

Value

a numeric value representing Shannon's Joint-Entropy H(X,Y), in bits when unit = "log2".

Author(s)

Hajk-Georg Drost

References

Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379-423.

Examples

JE(1:100/sum(1:100))

philentropy: Similarity and Distance Quantification Between Probability Functions

v0.4.0 | GPL-2
Authors: Hajk-Georg Drost [aut, cre] (<https://orcid.org/0000-0002-1567-306X>)
Initial release: 2020-01-09
