
miData

Empirical Estimation of the Mutual Information from a Table of Counts


Description

This function empirically estimates the Mutual Information from a table of counts using the observed frequencies.

Usage

miData(freqs.table, method = c("mi.raw", "mi.raw.pc"))

Arguments

freqs.table

a table of counts.

method

a character string determining whether the Mutual Information is returned raw ("mi.raw") or normalized as a percentage ("mi.raw.pc").

Details

The Mutual Information is estimated from the observed frequencies through a plugin estimator based on entropy.

The plugin estimator is I(X, Y) = H(X) + H(Y) - H(X, Y), where H() is the entropy computed with entropyData.
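The plugin identity above can be sketched in plain R (an illustrative re-implementation from first principles, not the package's own entropyData-based code):

```r
## Sketch of the plugin estimator for a table of joint counts.
## Entropies are computed in nats from the observed frequencies.
mi.plugin <- function(tab) {
  p.xy <- tab / sum(tab)                                # joint frequencies
  h <- function(p) { p <- p[p > 0]; -sum(p * log(p)) }  # empirical entropy
  ## I(X, Y) = H(X) + H(Y) - H(X, Y)
  h(rowSums(p.xy)) + h(colSums(p.xy)) - h(p.xy)
}

## Counts with independent margins give MI = 0 (up to floating point);
## a diagonal table (X identical to Y) gives MI = H(X).
mi.plugin(outer(c(30, 70), c(40, 60)))  # ~ 0
mi.plugin(diag(c(50, 50)))              # log(2)
```

Note that the estimator is applied to a table of counts; for continuous data the variables must be discretized first, as in the example below.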

Value

Mutual information estimate.

Author(s)

Gilles Kratzer

References

Cover, Thomas M., and Joy A. Thomas. (2012). Elements of Information Theory. John Wiley & Sons.

See Also

entropyData, discretization

Examples

## Generate two independent Gaussian variables
Y <- rnorm(n = 100, mean = 0, sd = 2)
X <- rnorm(n = 100, mean = 5, sd = 2)

dist <- list(Y="gaussian", X="gaussian")

## Discretize with the Freedman-Diaconis rule, then estimate the raw MI
miData(discretization(data.df = cbind(X, Y), data.dists = dist,
                discretization.method = "fd", nb.states = FALSE),
       method = "mi.raw")

abn: Modelling Multivariate Data with Additive Bayesian Networks

Version: 2.5-0
License: GPL (>= 2)
Authors: Gilles Kratzer [aut, cre] (<https://orcid.org/0000-0002-5929-8935>), Fraser Iain Lewis [aut] (<https://orcid.org/0000-0003-4580-2712>), Reinhard Furrer [ctb] (<https://orcid.org/0000-0002-6319-2332>), Marta Pittavino [ctb] (<https://orcid.org/0000-0002-1232-1034>)
Initial release: 2021-04-21
