## Tag: Bhattacharyya


## Apr 29, 2010

### The Burbea-Rao and Bhattacharyya centroids

Post @ 18:55:16 | Bhattacharyya

We study the centroid with respect to the class of information-theoretic distortion measures called Burbea-Rao divergences. Burbea-Rao divergences generalize the Jensen-Shannon divergence by measuring the non-negative Jensen difference induced by a strictly convex and differentiable function expressing a measure of entropy. We first show that a symmetrization of Bregman divergences called Jensen-Bregman distances yields a natural definition of Burbea-Rao divergences. We then define skew Burbea-Rao divergences, and prove that skew Burbea-Rao divergences amount to computing Bregman divergences in asymptotic cases. We prove that Burbea-Rao centroids are always unique, and we design a generic iterative algorithm for efficiently estimating those centroids with guaranteed convergence. In statistics, the Bhattacharyya distance is widely used to measure the degree of overlap of probability distributions. This distance notion is all the more useful as it provides both upper and lower bounds on the Bayes misclassification error, and turns out to coincide at the infinitesimal level with the Fisher information. We show that Bhattacharyya distances on members of the same exponential family amount to computing a Burbea-Rao divergence. We thus get as a byproduct an efficient algorithm for computing the Bhattacharyya centroid of a set of parametric distributions belonging to the same exponential family, improving over former specialized methods that were mostly limited to univariate or "diagonal" multivariate Gaussians.
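
To make the generic iteration concrete, here is a minimal sketch (my own illustration, not the paper's reference code) of the fixed-point scheme for the Jensen-Shannon special case, where the entropy generator is $F(x) = \sum_j x_j \log x_j$, so $\nabla F(x) = \log x + 1$ and $(\nabla F)^{-1}(y) = \exp(y - 1)$; projecting back onto the simplex by renormalizing is an assumption of this sketch:

```python
import numpy as np

def jensen_shannon_centroid(points, weights=None, iters=100):
    """Fixed-point iteration c <- (grad F)^{-1}(sum_i w_i grad F((c + p_i)/2))
    for F(x) = sum_j x_j log x_j (Jensen-Shannon / Burbea-Rao case).

    points: (n, d) array whose rows are discrete distributions on the simplex.
    """
    points = np.asarray(points, dtype=float)
    n, _ = points.shape
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights, dtype=float)
    c = points.mean(axis=0)                       # start at the arithmetic mean
    for _ in range(iters):
        grads = np.log(0.5 * (c + points)) + 1.0  # grad F at the n midpoints
        c = np.exp(w @ grads - 1.0)               # apply (grad F)^{-1}
        c /= c.sum()                              # renormalize (sketch assumption)
    return c

# Usage with three arbitrary histograms (illustration only):
hists = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.3, 0.3, 0.4]])
print(jensen_shannon_centroid(hists))
```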

Here is the arXiv report.

## Mar 02, 2010

### Bhattacharyya matrices and Cramér-Rao lower bounds

Post @ 11:20:14 | Bhattacharyya

Yesterday, I looked at the historical papers of A. Bhattacharyya and found that he wrote a series of three papers (one per year):

- 1946: On some analogues of the amount of information and their use in statistical estimation: I
- 1947: On some analogues of the amount of information and their use in statistical estimation: II
- 1948: On some analogues of the amount of information and their use in statistical estimation: III


The celebrated Cramér-Rao inequality can be seen as a particular case of the Bhattacharyya inequality: both provide a lower bound on the variance of any unbiased estimator of a parametric function.

The Bhattacharyya bounds obtained from Bhattacharyya matrices become sharper as the order of the matrix increases.
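
For reference, here is one common modern statement of these bounds (my notation, not a transcription of the 1946-1948 papers): for an unbiased estimator $T$ of $g(\theta)$,

$$
\operatorname{Var}_\theta(T) \;\geq\; v_k^\top J_k(\theta)^{-1} v_k,
\qquad
v_k = \bigl(g'(\theta), g''(\theta), \ldots, g^{(k)}(\theta)\bigr)^\top,
$$

where the order-$k$ Bhattacharyya matrix has entries

$$
[J_k(\theta)]_{ij} = \mathbb{E}_\theta\!\left[
\frac{1}{p(X;\theta)}\frac{\partial^i p(X;\theta)}{\partial\theta^i}
\cdot
\frac{1}{p(X;\theta)}\frac{\partial^j p(X;\theta)}{\partial\theta^j}
\right],
\qquad 1 \le i, j \le k.
$$

For $k = 1$, $J_1(\theta)$ is the Fisher information $I(\theta)$ and the bound reduces to the Cramér-Rao inequality $\operatorname{Var}_\theta(T) \ge g'(\theta)^2 / I(\theta)$; increasing $k$ enlarges the matrix, so the bounds form a non-decreasing sequence.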

Information geometry cherishes lower bounds too. This is a great result that is somehow not well known; it deserves prime time nowadays.

Frank.

## Dec 19, 2009

### Bhattacharyya metric for multivariate Gaussians

Post @ 21:20:21 | Bhattacharyya

For two multivariate Gaussians $N(\mu_1, \Sigma_1)$ and $N(\mu_2, \Sigma_2)$, the Bhattacharyya distance is given in closed form by

$$
D_B = \frac{1}{8} (\mu_2 - \mu_1)^\top \Sigma^{-1} (\mu_2 - \mu_1)
+ \frac{1}{2} \ln \frac{\det \Sigma}{\sqrt{\det \Sigma_1 \det \Sigma_2}},
\qquad \Sigma = \frac{\Sigma_1 + \Sigma_2}{2}.
$$
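
A minimal numerical sketch of this closed form (NumPy; the Gaussian parameters below are arbitrary, for illustration only):

```python
import numpy as np

def bhattacharyya_gaussians(mu1, cov1, mu2, cov2):
    """Closed-form Bhattacharyya distance between two multivariate Gaussians."""
    cov = 0.5 * (cov1 + cov2)                        # averaged covariance matrix
    diff = mu2 - mu1
    # Mahalanobis-type term on the mean difference
    term_means = 0.125 * diff @ np.linalg.solve(cov, diff)
    # log-determinant term; slogdet is numerically safer than log(det(...))
    logdet = np.linalg.slogdet(cov)[1]
    logdet1 = np.linalg.slogdet(cov1)[1]
    logdet2 = np.linalg.slogdet(cov2)[1]
    term_covs = 0.5 * (logdet - 0.5 * (logdet1 + logdet2))
    return term_means + term_covs

# Usage (arbitrary parameters):
mu1, cov1 = np.array([0.0, 0.0]), np.eye(2)
mu2, cov2 = np.array([1.0, 2.0]), np.array([[2.0, 0.3], [0.3, 1.0]])
print(bhattacharyya_gaussians(mu1, cov1, mu2, cov2))
```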

The geometry is spherical on the square-root-renormalized density functions.
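
Spelling this out (a standard derivation, added for completeness): the square-root map sends each density to the unit sphere of $L^2$, and the Bhattacharyya coefficient is the cosine of the angle between the mapped densities,

$$
\rho(p, q) = \int \sqrt{p(x)\,q(x)}\,dx = \big\langle \sqrt{p}, \sqrt{q} \big\rangle_{L^2},
\qquad \big\|\sqrt{p}\big\|_{L^2} = 1,
$$

so that $\arccos \rho(p, q)$ is the great-circle distance on that sphere, while the Bhattacharyya distance itself is $D_B(p, q) = -\ln \rho(p, q)$.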

The Bhattacharyya and Mahalanobis distances were precursors of the Fisher-Rao Riemannian distance.

Frank.