## Tags : Convex analysis


## Sep 17, 2010

### F-divergences as Jensen gaps

Post @ 18:58:27 | Convex analysis

A short post to test MathJax in action.

Let us start with Jensen's inequality for a random variable $X$ and a convex function $F$:

$$F(E[X]) \leq E[F(X)].$$

Jensen's inequality allows one to define statistical divergences by measuring the gap:

$$\Delta_F(X)= E[F(X)]-F(E[X]) \geq 0$$
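This gap can be checked numerically. A minimal sketch (the distribution and sample size are illustrative) with $F(x)=x^2$, for which the Jensen gap is exactly the variance of $X$:

```python
import random

random.seed(42)

# Convex function F(x) = x^2; its Jensen gap E[F(X)] - F(E[X]) is Var(X).
F = lambda x: x * x

# Illustrative samples of a Gaussian random variable X.
samples = [random.gauss(1.0, 2.0) for _ in range(100_000)]

E_X = sum(samples) / len(samples)                 # E[X]
E_FX = sum(F(x) for x in samples) / len(samples)  # E[F(X)]

gap = E_FX - F(E_X)  # Jensen gap Delta_F(X)
print(gap >= 0)      # nonnegative, as Jensen's inequality guarantees
```

For this choice of $F$ the empirical gap equals the empirical variance, so it is nonnegative for any finite sample, not just in expectation.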

The expectation can also be taken with respect to another random variable $Y$:

$$\Delta_F(X:Y)= E_Y[F(X)]-F(E_Y[X]) \geq 0$$

Consider the F-divergence for a convex function $F$ satisfying $F(1)=0$, defined as:

$$I_F(p:q)=\int q(x) F\left(\frac{p(x)}{q(x)}\right) dx.$$
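For discrete distributions the integral becomes a sum, which makes the definition easy to sketch in code. A minimal sketch (the helper name and the distribution values are illustrative, not from the post), using the Pearson chi-square generator $F(x)=(x-1)^2$, which is convex with $F(1)=0$:

```python
def f_divergence(p, q, F):
    """I_F(p:q) = sum_x q(x) F(p(x)/q(x)) for discrete distributions p, q."""
    return sum(qi * F(pi / qi) for pi, qi in zip(p, q))

# Pearson chi-square generator: convex, F(1) = 0.
F_chi2 = lambda x: (x - 1.0) ** 2

p = [0.2, 0.5, 0.3]  # illustrative distributions on a common support
q = [0.4, 0.4, 0.2]

print(f_divergence(p, q, F_chi2) >= 0)  # nonnegative by Jensen
print(f_divergence(p, p, F_chi2) == 0)  # vanishes when p = q, since F(1) = 0
```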

We can write

$$I_F(p:q) = E_q\left[F\left(\frac{p}{q}\right)\right] \geq F\left(E_q\left[\frac{p}{q}\right]\right) = F(1) = 0,$$

since $E_q\left[\frac{p}{q}\right] = \int q(x)\,\frac{p(x)}{q(x)}\, dx = \int p(x)\, dx = 1$.

Choosing $F(x)=-\log x$, we obtain the Kullback-Leibler divergence. The inequality is then referred to as Gibbs' inequality.

Note that F-divergences can be interpreted as Jensen gaps. Since $F\left(E_q\left[\frac{p}{q}\right]\right) = F(1) = 0$, we have

$$I_F(p:q) = E_q\left[F\left(\frac{p}{q}\right)\right] - F\left(E_q\left[\frac{p}{q}\right]\right) = \Delta_F\left(\frac{p}{q}:q\right) \geq 0.$$
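The identity above can also be checked numerically. A minimal sketch (the distributions are illustrative) with the Kullback-Leibler generator $F(x)=-\log x$, verifying that the subtracted term $F(E_q[p/q]) = F(1) = 0$ vanishes, so the F-divergence and the Jensen gap coincide:

```python
import math

p = [0.2, 0.5, 0.3]  # illustrative distributions on a common support
q = [0.4, 0.4, 0.2]

F = lambda x: -math.log(x)  # Kullback-Leibler generator, convex, F(1) = 0

ratio = [pi / qi for pi, qi in zip(p, q)]  # the probability ratio p/q

E_q_F_ratio = sum(qi * F(r) for qi, r in zip(q, ratio))  # E_q[F(p/q)]
E_q_ratio = sum(qi * r for qi, r in zip(q, ratio))       # E_q[p/q], sums to 1

jensen_gap = E_q_F_ratio - F(E_q_ratio)  # Delta_F(p/q : q)

# The gap equals the F-divergence itself, since F(E_q[p/q]) = F(1) = 0.
print(abs(jensen_gap - E_q_F_ratio) < 1e-12)
```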

Ali-Silvey-Csiszár divergences can therefore be interpreted as Jensen gaps on the probability ratio.

Frank.