
kld

Kullback-Leibler Divergence


Description

Estimates the Kullback-Leibler Divergence (KLD), which measures how one probability distribution diverges from a reference distribution (equivalent means are assumed). Both arguments must be positive definite inverse covariance (precision) matrices for an accurate measurement. The result is a relative metric.
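For intuition, below is a minimal sketch of the equal-means multivariate Gaussian KLD computed directly from two precision matrices. The function name kld_sketch is hypothetical, base is assumed to parameterize the reference distribution, and this is not necessarily the package's exact implementation:

# Minimal sketch, assuming equal means and positive definite precision
# (inverse covariance) matrices; `base` parameterizes the reference
# distribution P and `test` the approximating distribution Q
kld_sketch <- function(base, test) {
  k <- nrow(base)
  # tr(Sigma_Q^{-1} Sigma_P) expressed with precision matrices
  trace_term <- sum(diag(test %*% solve(base)))
  # log(det(Sigma_Q) / det(Sigma_P)) = log det(base) - log det(test)
  log_det_term <- as.numeric(determinant(base, logarithm = TRUE)$modulus -
                               determinant(test, logarithm = TRUE)$modulus)
  0.5 * (trace_term - k + log_det_term)
}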

Usage

kld(base, test)

Arguments

base

Full or base model

test

Reduced or test model

Value

A value greater than 0. Smaller values suggest that the probability distribution of the reduced model is near that of the full model

Author(s)

Alexander Christensen <alexpaulchristensen@gmail.com>

References

Kullback, S., & Leibler, R. A. (1951). On information and sufficiency. The Annals of Mathematical Statistics, 22, 79-86. doi: 10.1214/aoms/1177729694

Examples

# Attach the package for the neoOpen data and the LoGo function
library(NetworkToolbox)

# Base model: sample inverse covariance (precision) matrix
A1 <- solve(cov(neoOpen))

## Not run: 
# Test model: sparse precision matrix estimated with LoGo
A2 <- LoGo(neoOpen)

kld_value <- kld(A1, A2)

## End(Not run)
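Because the KLD is a relative metric, it is most informative when comparing several candidate reduced models against the same base. The diagonal (independence) model below is purely illustrative:

## Not run: 
# Hypothetical comparison: smaller values indicate a distribution
# closer to the base model
A3 <- diag(diag(A1))  # illustrative independence (diagonal precision) model
kld(A1, A2)           # LoGo network model
kld(A1, A3)           # independence model

## End(Not run)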

NetworkToolbox

Methods and Measures for Brain, Cognitive, and Psychometric Network Analysis

Version: 1.4.1
License: GPL (>= 3.0)
Authors: Alexander Christensen [aut, cre] (<https://orcid.org/0000-0002-9798-7037>), Guido Previde Massara [ctb] (<https://orcid.org/0000-0003-0502-2789>)
Initial release: 2020-12-07
