
entropy.plugin

Plug-In Entropy Estimator


Description

entropy.plugin computes the Shannon entropy H of a discrete random variable with the specified frequencies (probability mass function).

Usage

entropy.plugin(freqs, unit=c("log", "log2", "log10"))

Arguments

freqs

frequencies (probability mass function).

unit

the unit in which entropy is measured. The default unit="log" gives natural units (nats); for entropy in bits set unit="log2", and for base-10 units set unit="log10".

Details

The Shannon entropy of a discrete random variable is defined as H = -∑_k p(k) log p(k), where p is its probability mass function.
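This definition can be reproduced in a few lines of base R; the sketch below (plugin_entropy is a hypothetical helper name, not part of the package) drops zero-probability terms, since p·log(p) → 0 as p → 0:

```r
# Minimal plug-in entropy sketch, assuming freqs already sums to one.
plugin_entropy <- function(freqs, base = exp(1)) {
  p <- freqs[freqs > 0]            # drop zeros: 0 * log(0) is taken as 0
  -sum(p * log(p, base = base))    # H = -sum_k p(k) log p(k)
}

plugin_entropy(c(0.5, 0.25, 0.25), base = 2)   # 1.5 bits
```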

Value

entropy.plugin returns the Shannon entropy.

Author(s)

Korbinian Strimmer (http://www.strimmerlab.org).

Examples

# load entropy library 
library("entropy")

# a probability mass function (note the zero entry, which contributes nothing)
freqs = c(0.2, 0.1, 0.15, 0.05, 0, 0.3, 0.2)  

# and corresponding entropy
entropy.plugin(freqs)
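Entropy in different units differs only by a constant factor, so the nats result can be converted to bits by dividing by log(2); this base-R check (no package required) mirrors the example above:

```r
freqs <- c(0.2, 0.1, 0.15, 0.05, 0, 0.3, 0.2)
p <- freqs[freqs > 0]              # 0 * log(0) contributes nothing
H_nats <- -sum(p * log(p))         # default unit = "log" (natural log)
H_bits <- H_nats / log(2)          # same value as unit = "log2"
round(c(nats = H_nats, bits = H_bits), 4)   # approx. 1.6696 nats, 2.4087 bits
```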

entropy — Estimation of Entropy, Mutual Information and Related Quantities

Version: 1.3.0
License: GPL (>= 3)
Authors: Jean Hausser and Korbinian Strimmer
Initial release: 2021-04-25
