Shannon entropy is said to be additive in the sense that the entropy of the joint distribution of two independent random variables is the sum of the individual entropies: H(X,Y) = H(X) + H(Y). This property does not hold for the quadratic entropy (based on the sum of squared probabilities). The Java program
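To make the contrast concrete, here is a small numerical check (a sketch, not the Java program referred to above): it builds the product distribution of two independent discrete distributions and compares both entropies against the sum of the marginals.

```java
public class EntropyAdditivity {
    // Shannon entropy in bits: H(p) = -sum_i p_i log2(p_i)
    static double shannon(double[] p) {
        double h = 0;
        for (double pi : p) if (pi > 0) h -= pi * Math.log(pi) / Math.log(2);
        return h;
    }
    // Quadratic entropy (Tsallis entropy of order 2): 1 - sum_i p_i^2
    static double quadratic(double[] p) {
        double s = 0;
        for (double pi : p) s += pi * pi;
        return 1 - s;
    }
    // Joint distribution of two independent distributions (product measure)
    static double[] product(double[] p, double[] q) {
        double[] joint = new double[p.length * q.length];
        int k = 0;
        for (double pi : p) for (double qj : q) joint[k++] = pi * qj;
        return joint;
    }
    public static void main(String[] args) {
        double[] p = {0.5, 0.5};
        double[] q = {0.25, 0.75};
        double[] pq = product(p, q);
        // Shannon: additive (difference is zero up to rounding)
        System.out.println(shannon(pq) - (shannon(p) + shannon(q)));
        // Quadratic: NOT additive (difference is clearly nonzero)
        System.out.println(quadratic(pq) - (quadratic(p) + quadratic(q)));
    }
}
```

For the quadratic entropy one instead gets the sub-additive rule H(X,Y) = H(X) + H(Y) - H(X)H(Y) for independent variables, which the printed nonzero gap reflects.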
Taxonomy of principal distances
How do we visualize the relationships in the jungle of (statistical) distances? I tried to give an at-a-glance overview with this poster.
Convex Hull Peeling
A long time ago, back in 1996, I investigated output-sensitive algorithms. I then designed an algorithm for iteratively peeling the convex hulls of a 2D point set. This yields the notion of the depth of a point set, and is a
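The peeling idea can be sketched in a few lines: compute the convex hull, remove its vertices, and repeat until no point is left. The sketch below uses Andrew's monotone-chain hull for simplicity; it is not the 1996 output-sensitive algorithm, just an illustration of the convex-layers decomposition.

```java
import java.util.*;

public class ConvexLayers {
    // Cross product of (o->a) x (o->b); positive means a left turn
    static double cross(double[] o, double[] a, double[] b) {
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0]);
    }
    // Andrew's monotone-chain convex hull; returns vertices in ccw order
    static List<double[]> hull(List<double[]> pts) {
        List<double[]> p = new ArrayList<>(pts);
        p.sort((u, v) -> u[0] != v[0] ? Double.compare(u[0], v[0])
                                      : Double.compare(u[1], v[1]));
        int n = p.size();
        if (n <= 2) return p;
        double[][] h = new double[2 * n][];
        int k = 0;
        for (double[] pt : p) {                    // build the lower hull
            while (k >= 2 && cross(h[k-2], h[k-1], pt) <= 0) k--;
            h[k++] = pt;
        }
        int lower = k + 1;
        for (int i = n - 2; i >= 0; i--) {         // build the upper hull
            while (k >= lower && cross(h[k-2], h[k-1], p.get(i)) <= 0) k--;
            h[k++] = p.get(i);
        }
        return new ArrayList<>(Arrays.asList(h).subList(0, k - 1));
    }
    // Peel hulls until no point remains; returns the list of convex layers
    static List<List<double[]>> layers(List<double[]> pts) {
        List<List<double[]>> out = new ArrayList<>();
        List<double[]> rest = new ArrayList<>(pts);
        while (!rest.isEmpty()) {
            List<double[]> layer = hull(rest);
            out.add(layer);
            rest.removeAll(layer);   // arrays compare by identity, so this
        }                            // removes exactly the hull vertices
        return out;
    }
    public static void main(String[] args) {
        // A square with one interior point: two layers (square, then center)
        List<double[]> pts = Arrays.asList(
            new double[]{0,0}, new double[]{4,0}, new double[]{4,4},
            new double[]{0,4}, new double[]{2,2});
        System.out.println(layers(pts).size()); // prints 2
    }
}
```

The number of peeling rounds needed to reach a given point is precisely its depth, so the total number of layers measures how "deeply nested" the point set is.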
The geometric median
The center of mass (= centroid) is defined as the point minimizing the sum of the squared Euclidean distances (= the variance). If one of the source points is an outlier corrupting your dataset, and if that outlier goes to infinity, then your
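A standard way to compute the geometric median (the point minimizing the sum of plain, unsquared Euclidean distances) is the Weiszfeld iteration; here is a minimal 2D sketch contrasting its robustness with the centroid's sensitivity to one distant outlier.

```java
public class GeometricMedian {
    // One Weiszfeld update: average the points weighted by 1/distance
    static double[] weiszfeldStep(double[][] pts, double[] y) {
        double wx = 0, wy = 0, wsum = 0;
        for (double[] p : pts) {
            double d = Math.hypot(p[0] - y[0], p[1] - y[1]);
            if (d < 1e-12) return p;   // iterate landed on a data point
            double w = 1.0 / d;
            wx += w * p[0]; wy += w * p[1]; wsum += w;
        }
        return new double[]{wx / wsum, wy / wsum};
    }
    static double[] geometricMedian(double[][] pts, int iters) {
        double[] y = {0, 0};           // initialize at the centroid
        for (double[] p : pts) { y[0] += p[0] / pts.length; y[1] += p[1] / pts.length; }
        for (int i = 0; i < iters; i++) y = weiszfeldStep(pts, y);
        return y;
    }
    public static void main(String[] args) {
        // Three nearby points plus one far outlier: the centroid is dragged
        // out to ~(250.75, 0), while the median stays near the cluster.
        double[][] pts = {{0,0}, {1,0}, {2,0}, {1000,0}};
        double[] m = geometricMedian(pts, 200);
        System.out.printf("geometric median ~ (%.3f, %.3f)%n", m[0], m[1]);
    }
}
```

In this 1D-like example the minimizer is any point of the median interval [1, 2], and the iteration settles there regardless of how far the outlier is moved: the breakdown point of the geometric median is 1/2, versus 0 for the centroid.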
Natural Exponential Families QVF
There are only six natural exponential family distributions whose variance is a quadratic function (QVF = quadratic variance function) of the mean parameter. For the multivariate case, it is a bit more complex but well defined and studied: the $2d+4$ simple quadratic natural