Tags : f-divergence


Jun 09, 2010

An asymptotic property of centroids with respect to f-divergences

Post @ 4:39:49 | f-divergence

f-centroids can be defined as the unique minimizers of the average Csiszár/Ali-Silvey f-divergence. All f-centroids are homogeneous (scale-free, i.e., homogeneous of degree 1).

f-centroids include the power means (also reachable as Bregman centroids), extreme means, Lehmer means, Gini means, etc.

An interesting property is that by shifting the origin and letting the shift tend to infinity, all f-means tend to the arithmetic mean.

The proof requires f to be C^3, that is, three times continuously differentiable.
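As a small numerical sketch (my own toy example, not taken from the paper): take the scalar f-divergence I_f(p:q) = q f(p/q) with the Pearson chi-square generator f(x) = (x - 1)^2, and define the f-centroid of a set of positive numbers as the minimizer of the total divergence to them. This particular generator yields the quadratic mean, one of the power means mentioned above, and shifting the points far from the origin drives the centroid toward the arithmetic mean:

```python
def f(x):
    # Pearson chi-square generator f(x) = (x - 1)^2 (illustrative choice; convex, f(1) = 0)
    return (x - 1.0) ** 2

def f_div(p, q):
    # scalar Csiszar f-divergence I_f(p : q) = q * f(p / q), for p, q > 0
    return q * f(p / q)

def f_centroid(xs, lo, hi, iters=200):
    # minimize c -> sum_i I_f(x_i : c) over [lo, hi] by ternary search
    # (the objective is convex in c for this generator)
    cost = lambda c: sum(f_div(x, c) for x in xs)
    for _ in range(iters):
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if cost(m1) < cost(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

xs = [1.0, 4.0]
c = f_centroid(xs, 0.5, 10.0)   # close to the quadratic mean sqrt((1 + 16) / 2)
t = 1e6                         # shift the origin far away...
c_shifted = f_centroid([x + t for x in xs], t, t + 10.0) - t
# ...and the shifted f-mean approaches the arithmetic mean (1 + 4) / 2 = 2.5
```

The bracket passed to the ternary search is an assumption on where the centroid lies; any interval containing the minimizer works since the objective is convex there.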

Frank.

Jun 04, 2010

Joys and dangers of the Internet

Post @ 17:48:15 | f-divergence

We all use the Internet daily and are biased by its uncontrolled sources of information. I was looking for the full author names of the Ali and Silvey f-divergence paper (these divergences were also independently studied by Csiszár). In those days, it was not rare to mention only the last name and give only initials, so it is hard to know for sure what those capital letters stand for in

Ali, S. M.; Silvey, S. D. (1966). "A general class of coefficients of divergence of one distribution from another". Journal of the Royal Statistical Society, Series B 28 (1): 131–140.

After using several search tools (Google web search, Google Books, Google Scholar, etc.), tracing back previous papers, and eventually writing some emails to Aligarh Muslim University, I figured out the full names of the authors of the celebrated paper, and finally completed my bibtex entry as follows:

% Computational information geometry bibtex file
@Article{AliSilvey-1966,
author =       {Syed Mumtaz Ali and Samuel David Silvey},
title =        {A general class of coefficients of divergence of one distribution from another},
journal =      {Journal of the Royal Statistical Society, Series B},
year =         {1966},
volume =   {28},
pages =    {131--142}
}


S. M. Ali was a PhD student of S. D. Silvey.

On a different side of the Internet, I searched for some Kullback-Leibler divergence properties and found a weird French article mentioning

Kilolitre de divergence et mise à jour bayésienne


In fact, it was the automated translation for

KL divergence and Bayesian updating


The automatic translator guessed that KL stands for kiloliter... The translated article then becomes really funny to read. Have a look at http://www.worldlingo.com/ma/enwiki/fr/Kullback%E2%80%93Leibler_divergence (French) and http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence (English).

Frank.

Jan 07, 2010

Variational distance is the only metric f-divergence

Post @ 22:40:41 | f-divergence

A quite interesting property is that the variational divergence (an f-divergence) and its positive multiples are the only f-divergences that are metrics. This was recently shown in
Confliction of the Convexity and Metric Properties in f-Divergences (IEICE 2007).

If instead we assume f strictly positive and concave (and self-dual: f(x) = x f(1/x)), then the f-divergence formula yields metrics.
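A quick numerical check of the metric property (my own sketch, not from the paper): on random triples of distributions, the variational divergence, the f-divergence generated by f(x) = |x - 1|, never violates the triangle inequality, whereas another f-divergence such as the squared Hellinger divergence (generator f(x) = (sqrt(x) - 1)^2) does:

```python
import random

def variational(p, q):
    # variational divergence: f(x) = |x - 1| gives sum_i |p_i - q_i|
    return sum(abs(a - b) for a, b in zip(p, q))

def sq_hellinger(p, q):
    # squared Hellinger divergence: the f-divergence with f(x) = (sqrt(x) - 1)^2
    return sum((a ** 0.5 - b ** 0.5) ** 2 for a, b in zip(p, q))

def rand_dist(n, rng):
    # a random distribution on n atoms
    w = [rng.random() for _ in range(n)]
    s = sum(w)
    return [x / s for x in w]

rng = random.Random(0)
tv_bad = hel_bad = 0
for _ in range(1000):
    p, q, r = (rand_dist(3, rng) for _ in range(3))
    if variational(p, r) > variational(p, q) + variational(q, r) + 1e-12:
        tv_bad += 1   # never happens: the variational divergence is a metric
    if sq_hellinger(p, r) > sq_hellinger(p, q) + sq_hellinger(q, r) + 1e-12:
        hel_bad += 1  # does happen: squaring the Hellinger metric breaks the triangle inequality
```

Note that the (unsquared) Hellinger distance is a metric; it is only the squared form, which is the actual f-divergence, that fails the triangle inequality.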

Frank.

Feb 26, 2009

Information monotonicity

Post @ 18:10:05 | f-divergence

Csiszár f-divergences have the property of information monotonicity. Take a positive array p with n bins and partition its bins into m groups; the coarse-grained array P assigns to each group the sum of the values of the atoms falling in that group (and similarly for q and Q). Then the f-divergence of (p,q) is always greater than or equal to the f-divergence of the corresponding partitioned arrays. In other words:

f-div(p,q) >= f-div(P,Q)

That means that we can only lose information by aggregating atoms. That is a fairly reasonable behavior for a measure of information.
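A small sketch of the inequality above (the distributions and the partition are my own toy example), using the Kullback-Leibler divergence, the f-divergence generated by f(x) = x log x:

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence, the f-divergence with f(x) = x * log(x)
    return sum(a * math.log(a / b) for a, b in zip(p, q))

def coarsen(p, parts):
    # merge bins: parts is a list of index groups, e.g. [[0, 1], [2, 3]]
    return [sum(p[i] for i in g) for g in parts]

p = [0.1, 0.2, 0.3, 0.4]
q = [0.25, 0.25, 0.25, 0.25]
parts = [[0, 1], [2, 3]]
P, Q = coarsen(p, parts), coarsen(q, parts)
# information monotonicity: coarse-graining can only decrease the divergence
assert kl(p, q) >= kl(P, Q)
```

Here kl(p, q) is about 0.106 while the coarse-grained kl(P, Q) drops to about 0.082, as the monotonicity property predicts.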

More axiomatic characterizations are detailed in:

Csiszár, Imre. 2008. "Axiomatic Characterizations of Information Measures." Entropy 10, no. 3: 261-273.

Frank.