## May 18, 2011

### Rényi and Tsallis entropies and divergences for exponential families

It is well known that the Kullback-Leibler divergence between two densities of the same exponential family can equivalently be computed as a Bregman divergence on the natural parameters, with the log-normalizer as generator (the identity is recalled below).
So what happens if we consider generalizations of the Kullback-Leibler divergence? The KL divergence is based on Shannon entropy, so let us look at two generalizations of Shannon entropy: the Rényi entropy (which preserves additivity) and the Tsallis entropy (which is non-extensive). It turns out that we obtain simple closed-form expressions for the relative Rényi/Tsallis entropies (i.e., the Rényi/Tsallis divergences) of densities belonging to the same exponential family. Moreover, when we consider the Rényi/Tsallis entropies themselves, we end up with a simple formula that is in closed form for families with a standard carrier measure. We illustrate these results by computing the Rényi/Tsallis entropies and divergences of multivariate normals; a small numerical sketch of the divergence case is given below.
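To give the flavor of the formulas (here F denotes the log-normalizer and θ the natural parameters; the Rényi expression is the standard α-skew Jensen form obtained by integrating the exponential family density, stated here for orientation rather than quoted from the technical note):

```latex
% KL divergence within an exponential family reduces to a Bregman divergence
% on the natural parameters, with the log-normalizer F as generator:
\mathrm{KL}(p_{\theta_1} : p_{\theta_2})
  = B_F(\theta_2 : \theta_1)
  = F(\theta_2) - F(\theta_1) - \langle \theta_2 - \theta_1,\, \nabla F(\theta_1) \rangle .

% The Rényi divergence of order \alpha similarly reduces to a skew Jensen
% divergence of the log-normalizer (valid when \alpha\theta_1 + (1-\alpha)\theta_2
% stays inside the natural parameter space):
D_\alpha(p_{\theta_1} : p_{\theta_2})
  = \frac{1}{1-\alpha}\, J_{F,\alpha}(\theta_1 : \theta_2),
\qquad
J_{F,\alpha}(\theta_1 : \theta_2)
  = \alpha F(\theta_1) + (1-\alpha) F(\theta_2) - F\big(\alpha\theta_1 + (1-\alpha)\theta_2\big).
```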
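And here is a minimal numerical sketch of the divergence formula in the multivariate normal case (my own illustration, not code from the note; the function names and the (mu, Sigma)-to-natural-parameter mapping are choices made for this snippet):

```python
import numpy as np

def gaussian_to_natural(mu, Sigma):
    """Natural parameters of N(mu, Sigma) seen as an exponential family
    p(x) = exp(<theta, x> + <Theta, x x^T> - F(theta, Theta)):
    theta = Sigma^{-1} mu,  Theta = -1/2 Sigma^{-1}."""
    P = np.linalg.inv(Sigma)
    return P @ mu, -0.5 * P

def log_normalizer(theta, Theta):
    """Log-normalizer F of the multivariate normal in natural coordinates:
    F = -1/4 theta^T Theta^{-1} theta + d/2 log(pi) - 1/2 log det(-Theta)."""
    d = theta.shape[0]
    return (-0.25 * theta @ np.linalg.solve(Theta, theta)
            + 0.5 * d * np.log(np.pi)
            - 0.5 * np.log(np.linalg.det(-Theta)))

def renyi_divergence_gaussians(mu1, Sigma1, mu2, Sigma2, alpha):
    """Renyi divergence D_alpha(p1 : p2) between two multivariate normals,
    computed as the skew Jensen divergence of the log-normalizer:
    D_alpha = [alpha F(t1) + (1-alpha) F(t2) - F(alpha t1 + (1-alpha) t2)] / (1 - alpha)."""
    t1, T1 = gaussian_to_natural(mu1, Sigma1)
    t2, T2 = gaussian_to_natural(mu2, Sigma2)
    t_mix = alpha * t1 + (1.0 - alpha) * t2
    T_mix = alpha * T1 + (1.0 - alpha) * T2  # must remain negative definite
    J = (alpha * log_normalizer(t1, T1)
         + (1.0 - alpha) * log_normalizer(t2, T2)
         - log_normalizer(t_mix, T_mix))
    return J / (1.0 - alpha)

if __name__ == "__main__":
    # Sanity check in 1D: D_alpha(N(0,1) : N(m,1)) = alpha * m^2 / 2.
    m, alpha = 3.0, 0.7
    d = renyi_divergence_gaussians(np.array([0.0]), np.eye(1),
                                   np.array([m]), np.eye(1), alpha)
    print(d, alpha * m**2 / 2)  # both should be ~3.15
```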

The technical details can be found here.

Frank.