

## Oct 18, 2007

### Kullback-Leibler and Itakura-Saito

Post @ 10:27:54 | KL, IS

Today, I would like to present an old (1991) paper that approximates, in the limit case, the symmetric KL by the symmetric IS, better known in sound processing as the COSH distance. Nowadays, everything has been generalized using the family of Bregman-Csiszár divergences. Nevertheless, this "historical" paper points out some interesting properties when it comes to dealing with multivariate Gaussian distributions.

The paper in question is "A Computationally Compact Divergence Measure for Speech Processing" (http://www.citeulike.org/user/ciga/article/1782055). It mentions the now famous KL formula for zero-mean multivariate Gaussians:

$$\mathrm{KL}(f_1 \| f_2) = \frac{1}{2}\left[\ln\frac{\det C_2}{\det C_1} + \operatorname{tr}\!\left(C_2^{-1} C_1\right) - M\right],$$

where M denotes the dimension and C_1, C_2 the variance-covariance symmetric positive definite matrices. Note that this is an oriented (directed) divergence, not symmetric.
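In practice the formula is a few lines of linear algebra. Here is a minimal NumPy sketch of it (the helper name gaussian_kl is my own, not the paper's), assuming zero-mean Gaussians:

```python
import numpy as np

def gaussian_kl(C1, C2):
    """Oriented KL(f1||f2) between zero-mean M-dimensional Gaussians
    with SPD covariance matrices C1 and C2 (hypothetical helper)."""
    M = C1.shape[0]
    _, logdet1 = np.linalg.slogdet(C1)
    _, logdet2 = np.linalg.slogdet(C2)
    # tr(C2^{-1} C1) via a linear solve, avoiding an explicit inverse
    trace_term = np.trace(np.linalg.solve(C2, C1))
    return 0.5 * (logdet2 - logdet1 + trace_term - M)

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
C1 = A @ A.T + 4 * np.eye(4)   # two random SPD covariance matrices
C2 = B @ B.T + 4 * np.eye(4)
print(gaussian_kl(C1, C2))                         # oriented: generally != KL(f2||f1)
print(gaussian_kl(C1, C2) + gaussian_kl(C2, C1))   # symmetrized KL
```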
The paper shows that, for autoregressive models, the formula takes the same form, with the autocorrelation matrices substituted for the variance-covariance matrices.
Now the most important property revealed in this work is the limit case as M tends to infinity. Recall the Itakura-Saito divergence between two power spectra f1, f2:

$$\mathrm{IS}(f_1 \| f_2) = \frac{1}{2\pi}\int_{-\pi}^{\pi}\left[\frac{f_1(\omega)}{f_2(\omega)} - \ln\frac{f_1(\omega)}{f_2(\omega)} - 1\right]\mathrm{d}\omega.$$

The paper shows that, normalized per dimension, KL(f1||f2) -> 1/2 IS(f1||f2) as M -> infinity. Thus, in the limit, the oriented KL divergence may be computed by the simpler Itakura-Saito divergence formula in the case of Gaussian distributions (autoregressive processes). Similarly, the symmetric KL is half of the symmetric Itakura-Saito divergence in that case: sKL = 0.5 sIS.
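As a quick numerical illustration of this limit (my own sketch, not taken from the paper), one can build Toeplitz autocorrelation matrices for two AR(1) processes with arbitrary parameters, compute the per-dimension Gaussian KL for growing M, and compare it with half the Itakura-Saito divergence between the corresponding power spectra:

```python
import numpy as np
from scipy.linalg import toeplitz

def ar1_spectrum(a, sigma2, w):
    # Power spectrum of an AR(1) process: f(w) = sigma2 / |1 - a e^{-iw}|^2
    return sigma2 / np.abs(1.0 - a * np.exp(-1j * w)) ** 2

def ar1_covariance(a, sigma2, M):
    # Toeplitz autocovariance matrix: r(k) = sigma2 * a^|k| / (1 - a^2)
    r = sigma2 * a ** np.arange(M) / (1.0 - a ** 2)
    return toeplitz(r)

def gaussian_kl(C1, C2):
    # Same zero-mean Gaussian KL as in the sketch above
    M = C1.shape[0]
    _, ld1 = np.linalg.slogdet(C1)
    _, ld2 = np.linalg.slogdet(C2)
    return 0.5 * (ld2 - ld1 + np.trace(np.linalg.solve(C2, C1)) - M)

# Itakura-Saito divergence between the two spectra, by quadrature on [-pi, pi)
w = np.linspace(-np.pi, np.pi, 4096, endpoint=False)
ratio = ar1_spectrum(0.5, 1.0, w) / ar1_spectrum(-0.3, 2.0, w)
IS = np.mean(ratio - np.log(ratio) - 1.0)   # grid mean == (1/2pi) * integral

for M in (8, 64, 512):
    C1 = ar1_covariance(0.5, 1.0, M)
    C2 = ar1_covariance(-0.3, 2.0, M)
    print(M, gaussian_kl(C1, C2) / M, 0.5 * IS)  # per-dimension KL -> IS/2
```

The first printed column approaches the second as M grows, which is exactly the limit stated above.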

Section 5 then considers the computation of the centroid, but that is another story, related to my recent research, for another post!