## Tags : optimization


## Apr 26, 2012

### Exponentially many extrema!

Post @ 17:08:15 | optimization

When a centroid with respect to a divergence D is defined as the minimizer of the average distortion, c* = arg min_c (1/n) Σ_i D(c, x_i), a natural question is whether this minimizer is unique.

For f-divergences, convexity in both arguments ensures that it is, and setting the gradient to zero does the job. But even when the distortion is not convex, we may still hope (and sometimes prove) that the centroid is unique. This was the case for the Burbea-Rao centroid (the Jensen centroid induced by the Jensen convexity gap of a convex function):

The Burbea-Rao and Bhattacharyya centroids, computed using a fixed-point equation and the in-betweenness property of quasi-arithmetic means.
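A minimal numerical sketch of that fixed-point scheme (my own illustration, not code from the paper): for a strictly convex generator F, setting the gradient of the average Jensen divergence to zero yields the update c ← (∇F)^{-1}( (1/n) Σ_i ∇F((c + x_i)/2) ). Here I use the scalar Shannon generator F(x) = x log x, whose gradient log x + 1 inverts to exp(y - 1); this particular choice of F is an assumption made for the sketch.

```python
import numpy as np

def jensen_divergence(F, p, q):
    """Jensen (Burbea-Rao) divergence induced by a convex generator F."""
    return 0.5 * (F(p) + F(q)) - F(0.5 * (p + q))

def burbea_rao_centroid(points, grad_F, grad_F_inv, iters=100):
    """Fixed-point iteration c <- (grad F)^{-1}( mean_i grad F((c + x_i)/2) )."""
    c = np.mean(points)  # initialize at the arithmetic mean
    for _ in range(iters):
        c = grad_F_inv(np.mean(grad_F(0.5 * (c + points))))
    return c

# Shannon generator F(x) = x log x on the positive reals (an assumption for
# this sketch; any strictly convex F with an invertible gradient works).
F = lambda x: x * np.log(x)
grad_F = lambda x: np.log(x) + 1.0
grad_F_inv = lambda y: np.exp(y - 1.0)

points = np.array([0.2, 0.5, 1.0, 2.0])
c = burbea_rao_centroid(points, grad_F, grad_F_inv)
```

By the in-betweenness property of quasi-arithmetic means mentioned above, every iterate stays inside [min_i x_i, max_i x_i], so the iteration cannot diverge.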

However, in general a centroid defined as a minimizer may not be unique when the average distortion has many local minima. Even simple functions may yield exponentially many local minima, as shown by Auer and his colleagues:

They proved that learning the weight vector of a single neuron with the logistic transfer function and the square loss can yield exponentially many local minima.

@inproceedings{ManyLocalMinima-1995,
  author    = {Peter Auer and Mark Herbster and Manfred K. Warmuth},
  title     = {Exponentially many local minima for single neurons},
  booktitle = {Advances in Neural Information Processing Systems (NIPS)},
  year      = {1995},
  pages     = {316--322}
}
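A toy one-dimensional illustration of the flavor of that result (my own construction, not the dataset from the paper): for a single logistic neuron σ(wx) under square loss, each example (x_i, y_i) with y_i in (0, 1) contributes a "valley" centered at w = logit(y_i)/x_i whose width scales like 1/|x_i|, so a few examples at very different input scales with conflicting targets already produce several separated local minima:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w, X, Y):
    """Square loss of a single logistic neuron with weight w."""
    return sum((sigmoid(w * x) - y) ** 2 for x, y in zip(X, Y))

# Two examples at very different input scales with conflicting targets
# (a hypothetical dataset chosen for this sketch, not Auer et al.'s).
X = [1.0, 100.0]
Y = [0.9, 0.1]

# Scan the loss over a fine grid of weights.
ws = np.linspace(-0.5, 4.0, 4501)
L = np.array([loss(w, X, Y) for w in ws])

# Count strict interior local minima on the grid: one narrow valley near
# w = logit(0.1)/100 ~ -0.022 and one wide valley near w = logit(0.9) ~ 2.2.
n_minima = int(np.sum((L[1:-1] < L[:-2]) & (L[1:-1] < L[2:])))
```

Stacking d such weights (one per input dimension) multiplies the valleys across coordinates, which is the intuition behind the exponential count in the paper.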