
H

Shannon's Entropy H(X)


Description

Compute Shannon's Entropy H(X) = - ∑ P(X) * log2(P(X)) based on a given probability vector P(X).
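For example, a fair coin with P(X) = (0.5, 0.5) yields H(X) = -(0.5 * log2(0.5) + 0.5 * log2(0.5)) = 1 bit.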

Usage

H(x, unit = "log2")

Arguments

x

a numeric probability vector P(X) for which Shannon's Entropy H(X) shall be computed.

unit

a character string specifying the logarithm unit used in the entropy computation; the default, "log2", returns the entropy in bits.

Details

This function is useful for quickly computing Shannon's Entropy for any given probability vector.

Value

a numeric value representing Shannon's Entropy, returned in bits with the default unit = "log2".

Author(s)

Hajk-Georg Drost

References

Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379-423.

See Also

JE, CE, KL, JSD, gJSD

Examples

# Shannon's Entropy (in bits) of the probability vector P(X) = (1:10)/sum(1:10)
H(1:10 / sum(1:10))
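As a cross-check, here is a minimal sketch that recomputes the same value directly from the definition given above; the unit = "log" call is an assumption inferred from the unit argument and may differ from the package's actual unit names:

P <- 1:10 / sum(1:10)

# manual computation from the definition H(X) = -sum(P(X) * log2(P(X)))
-sum(P * log2(P))

# should match the package function with the default unit = "log2"
H(P)

# entropy in nats; assumes "log" (natural logarithm) is an accepted unit name
H(P, unit = "log")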

philentropy: Similarity and Distance Quantification Between Probability Functions

Version: 0.4.0 (initial release, 2020-01-09)
License: GPL-2
Authors: Hajk-Georg Drost [aut, cre] (<https://orcid.org/0000-0002-1567-306X>)
