The Jensen-Bregman divergence is a distortion measure defined by the Jensen difference provided by a strictly convex function. Jensen-Bregman divergences extend the well-known Jensen-Shannon divergence by allowing one to choose an arbitrary convex generator instead of the standard Shannon entropy. This class of Jensen-Bregman divergences notably includes the squared Euclidean distance. Although Jensen-Bregman divergences are symmetric distances by construction, they are not metrics since they violate the triangle inequality. We study the geometric properties and combinatorial complexities of both the Voronoi diagrams and the centroidal Voronoi diagrams induced by such a class of information-theoretic divergences. We show that those Jensen-Bregman divergences appear naturally in two contexts: (1) when symmetrizing Bregman divergences, and (2) when computing the Bhattacharyya distances of statistical distributions. The Bhattacharyya distance measures the amount of overlap between distributions, and is popularly used in machine learning to provide both lower and upper bounds on the Bayes' misclassification error. Since the Bhattacharyya distance of popular distributions in statistics, called the exponential families (including the familiar Gaussian, Poisson, multinomial, Beta/Gamma families, etc.), can be computed equivalently as Jensen-Bregman divergences, the Jensen-Bregman Voronoi diagrams also allow one to study statistical Voronoi diagrams induced by an entropic function.
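For concreteness, here is a minimal sketch of the quantities mentioned above, written in our own notation (the symbol JB_F, the natural parameters \theta, and the log-normalizer F of an exponential family are notational choices of this sketch, not fixed by the abstract):

% Jensen-Bregman divergence induced by a strictly convex generator F
\[
  \mathrm{JB}_F(p, q) \;=\; \frac{F(p) + F(q)}{2} \;-\; F\!\left(\frac{p + q}{2}\right) \;\geq\; 0 .
\]
% Taking F as the negative Shannon entropy recovers the Jensen-Shannon
% divergence; taking F(x) = \|x\|^2 yields the squared Euclidean distance
% up to a constant factor:
\[
  \mathrm{JB}_{\|\cdot\|^2}(p, q) \;=\; \frac{\|p\|^2 + \|q\|^2}{2} - \left\|\frac{p + q}{2}\right\|^2 \;=\; \frac{1}{4}\,\|p - q\|^2 .
\]
% For an exponential family p_\theta(x) = \exp(\langle \theta, t(x)\rangle - F(\theta) + k(x))
% with log-normalizer F, the Bhattacharyya distance between two members is the
% Jensen-Bregman divergence of their natural parameters:
\[
  B(p_{\theta_1}, p_{\theta_2}) \;=\; -\ln \int \sqrt{p_{\theta_1}(x)\, p_{\theta_2}(x)}\, dx
  \;=\; \frac{F(\theta_1) + F(\theta_2)}{2} - F\!\left(\frac{\theta_1 + \theta_2}{2}\right)
  \;=\; \mathrm{JB}_F(\theta_1, \theta_2) .
\]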