Kullback.Leibler {truecluster}    R Documentation

Multivariate Information-theoretic Measures

Description

These functions calculate multivariate information-theoretic measures: here, the Kullback-Leibler divergence.

Usage

Kullback.Leibler(p, basis = 2)

Arguments

p matrix; each column contains the probabilities of one distribution over the same alphabet (rows)
basis base of the logarithm used (default = 2)

Details

D_KL(P||Q) = sum(P*log(P/Q))
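The formula above can be sketched directly in base R. This is a minimal illustration of the definition, not the package's implementation; the helper name kl_div is hypothetical.

```r
# Sketch of D_KL(P||Q) = sum(P * log(P/Q)) for two probability vectors
# P and Q over the same alphabet; kl_div is a hypothetical helper.
kl_div <- function(P, Q, basis = 2) {
  ok <- P > 0  # terms with P == 0 contribute 0 by convention
  sum(P[ok] * log(P[ok] / Q[ok], base = basis))
}
```

Applying such a helper to every ordered pair of columns of p would yield the square divergence matrix that Kullback.Leibler returns.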

Value

A square divergence matrix.

Totals always returned: a list with components H joint entropy, Ha row entropy, Hab row conditional entropy given columns, Hb column entropy, Hba column conditional entropy given rows, Im mutual information
Margins returned unless grain="Total": pa row probabilities, ha row entropies, hab column-wise conditional entropies, pb column probabilities, hb column entropies, hba row-wise conditional entropies
Cells returned if grain="Cells": p joint probabilities, pab column-wise conditional probabilities, pba row-wise conditional probabilities, h joint entropies, hab column-wise conditional entropies, hba row-wise conditional entropies

Author(s)

Jens Oehlschlägel

References

MacKay, David J.C. (2003). Information Theory, Inference, and Learning Algorithms (chapter 8). Cambridge University Press.

See Also

shannon.information, dist.entropy, log

Examples

 x <- seq(-3, 3, 0.1)
 cp <- pnorm(x)
 p <- cp[-1] - cp[-length(cp)]  # bin probabilities of a standard normal on the grid
 cq <- punif(x, -3, 3)
 q <- cq[-1] - cq[-length(cq)]  # bin probabilities of a uniform on [-3, 3]
 Kullback.Leibler(cbind(p, q))  # square divergence matrix between p and q

[Package truecluster version 0.3 Index]