The Voronoi diagram of a finite set of objects is a fundamental geometric structure that subdivides the embedding space into regions, each region consisting of the points that are closer to a given object than to the others. Many variants of Voronoi diagrams can be defined depending on the class of objects, the distance functions and the embedding space. In this paper, we investigate a framework for defining and building Voronoi diagrams for a broad class of distance functions called Bregman divergences. Bregman divergences include not only the traditional (squared) Euclidean distance but also various divergence measures based on entropic functions. Accordingly, Bregman Voronoi diagrams make it possible to define information-theoretic Voronoi diagrams in statistical parametric spaces based on the relative entropy of distributions. We define several types of Bregman diagrams, establish correspondences between those diagrams (using the Legendre transformation), and show how to compute them efficiently. We also introduce extensions of these diagrams, e.g. k-order and k-bag Bregman Voronoi diagrams, and introduce Bregman triangulations of a set of points and their connection with Bregman Voronoi diagrams. We show that these triangulations capture many of the properties of the celebrated Delaunay triangulation. Finally, we give some applications of Bregman Voronoi diagrams which are of interest in the context of computational geometry and machine learning.
Categories and Subject Descriptors: I.3.5 [Computer Graphics]: Computational Geometry and Object Modeling - Geometric algorithms, languages, and systems; F.2.2 [Analysis of Algorithms and Problem Complexity]: Nonnumerical Algorithms and Problems - Geometrical problems and computations; G.2.1 [Discrete Mathematics]: Combinatorics.
The Voronoi diagram $\mathrm{vor}(S)$ of a set of $n$ points $S = \{p_1, \dots, p_n\}$ of the $d$-dimensional Euclidean space $\mathbb{R}^d$ is defined as the cell complex whose $d$-cells are the Voronoi regions $\{\mathrm{vor}(p_i)\}_{i \in \{1, \dots, n\}}$, where $\mathrm{vor}(p_i)$ is the set of points of $\mathbb{R}^d$ closer to $p_i$ than to any other point of $S$ with respect to a distance function $\delta$:
$$\mathrm{vor}(p_i) = \{x \in \mathbb{R}^d \mid \delta(x, p_i) \le \delta(x, p_j)\ \ \forall\, p_j \in S\}.$$
Points $\{p_i\}_i$ are called the Voronoi sites or Voronoi generators. Since their inception in disguise by Descartes in the 17th century [5], Voronoi diagrams have found a broad spectrum of applications in science. Computational geometers have focused at first on Euclidean Voronoi diagrams [5] by considering the case where $\delta(x, y)$ is the Euclidean distance $\|x - y\| = \sqrt{\sum_{i=1}^{d} (x_i - y_i)^2}$. Voronoi diagrams have later been defined and studied for other distance functions, most notably the $L_1$ distance $\|x - y\|_1 = \sum_{i=1}^{d} |x_i - y_i|$ (Manhattan distance) and the $L_\infty$ distance $\|x - y\|_\infty = \max_{i \in \{1, \dots, d\}} |x_i - y_i|$ [10,5]. Klein further presented an abstract framework for describing and computing the fundamental structures of abstract Voronoi diagrams [26,11].
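For concreteness, the definition above amounts to a nearest-site assignment in which any of the distance functions just mentioned can play the role of $\delta$. The following minimal sketch (ours, with illustrative helper names, not part of the original text) makes this explicit:

```python
import numpy as np

def voronoi_cell_index(x, sites, delta):
    """Index i such that x lies in vor(p_i), i.e. the site minimizing
    delta(x, p_i). Ties are broken by the smallest index."""
    return int(np.argmin([delta(x, p) for p in sites]))

# Three classical distance functions mentioned above.
euclidean = lambda x, y: np.sqrt(np.sum((x - y) ** 2))   # L2
manhattan = lambda x, y: np.sum(np.abs(x - y))           # L1
chebyshev = lambda x, y: np.max(np.abs(x - y))           # L-infinity

sites = [np.array([0.0, 0.0]), np.array([3.0, 1.0]), np.array([1.0, 4.0])]
x = np.array([2.0, 2.0])
for name, delta in [("L2", euclidean), ("L1", manhattan), ("Linf", chebyshev)]:
    print(name, "->", voronoi_cell_index(x, sites, delta))
```

The cell decomposition induced by each distance is obtained by applying this assignment to every point of the space; the cell shapes differ because the unit balls of the three norms differ.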
In artificial intelligence, machine learning techniques also rely on geometric concepts for building classifiers in supervised problems (e.g., linear separators, oblique decision trees, etc.) or for clustering data in unsupervised settings (e.g., k-means, support vector clustering [2], etc.). However, the data sets S under consideration and their underlying spaces X are usually not metric spaces. The notion of distance between two elements of X needs to be replaced by a pseudo-distance that is not necessarily symmetric and may not satisfy the triangle inequality. Such a pseudo-distance is also referred to as a distortion, a (dis)similarity or a divergence in the literature. For example, in a parametric statistical space X, a vector point represents a distribution and its coordinates store the parameters of the associated distribution. A notion of "distance" between two such points is then needed to represent the divergence between the corresponding distributions.
Very few works have tackled an in-depth study of Voronoi diagrams and their applications in such statistical spaces. This is all the more important as, even for ordinary Voronoi diagrams, the Euclidean locations of sites are usually observed in noisy environments (e.g., imprecise point measurements in computer vision experiments), and "noise" is often modeled by means of Normal distributions (so-called "Gaussian noise"). To the best of our knowledge, statistical Voronoi diagrams have only been considered in a 4-page short paper of Onishi and Imai [34], which relies on the Kullback-Leibler divergence of d-dimensional multivariate normal distributions to study the combinatorics of their Voronoi diagrams, and subsequently in a 2-page video paper of Sadakane et al. [40], which defines the divergence implied by a convex function and its conjugate, and presents the Voronoi diagram with flavors of information geometry [1] (see also [35] and the related short communications [25,24]). Our study of Bregman Voronoi diagrams generalizes and subsumes these preliminary studies using a simpler concept of divergence: Bregman divergences [12,6], which do not rely explicitly on convex conjugates. Bregman divergences encapsulate the squared Euclidean distance and many widely used divergences, e.g. the Kullback-Leibler divergence. It should be noticed, however, that other divergences have been defined and studied in the context of Riemannian geometry [1]. Sacrificing some generality, while not very restrictive in practice, allows a much simpler treatment, and our study of Bregman divergences is elementary and does not rely on Riemannian geometry.
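To make the last claim concrete, recall the standard definition of a Bregman divergence, $D_F(x, y) = F(x) - F(y) - \langle x - y, \nabla F(y) \rangle$ for a strictly convex and differentiable generator $F$. The short sketch below (ours, with hypothetical helper names) shows that the generator $F(x) = \|x\|^2$ recovers the squared Euclidean distance, while the negative Shannon entropy recovers the Kullback-Leibler divergence, and that the divergence is not symmetric in general:

```python
import numpy as np

def bregman_divergence(F, gradF, x, y):
    """D_F(x, y) = F(x) - F(y) - <x - y, gradF(y)> for a strictly convex,
    differentiable generator F. Not symmetric in general."""
    return F(x) - F(y) - np.dot(x - y, gradF(y))

# Generator F(x) = ||x||^2 yields the squared Euclidean distance.
F_sq  = lambda x: np.dot(x, x)
gF_sq = lambda x: 2.0 * x

# Generator F(x) = sum_i x_i log x_i (negative Shannon entropy, x > 0)
# yields the Kullback-Leibler divergence for normalized vectors.
F_ent  = lambda x: np.sum(x * np.log(x))
gF_ent = lambda x: np.log(x) + 1.0

x, y = np.array([0.2, 0.8]), np.array([0.5, 0.5])
print(bregman_divergence(F_sq, gF_sq, x, y))    # equals ||x - y||^2
print(bregman_divergence(F_ent, gF_ent, x, y))  # equals sum_i x_i log(x_i / y_i)
print(bregman_divergence(F_ent, gF_ent, y, x))  # differs: no symmetry
```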
In this paper, we give a thorough treatment of Bregman Voronoi diagrams which elegantly unifies the ordinary Euclidean Voronoi diagram and statistical Voronoi diagrams. Our contributions are summarized as follows:
• Since Bregman divergences are not symmetric, we define two types of Bregman Voronoi diagrams. One is an affine diagram with convex polyhedral cells, while the other one is curved (a small numerical sketch follows this list). The cells of these two diagrams are in one-to-one correspondence through the Legendre transformation. We also introduce a third type of diagram, the symmetrized Bregman Voronoi diagram.
• We present a simple way to compute the Bregman Voronoi diagram of a set of points by lifting the points into a higher-dimensional space using an extra dimension. This mapping also leads to combinatorial bounds on the size of these diagrams. We also define weighted Bregman Voronoi diagrams and show that the class of these diagrams is identical to the class of affine (or power) diagrams. Special cases of weighted Bregman Voronoi diagrams include the k-order and k-bag Bregman Voronoi diagrams mentioned above.
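As a hedged illustration of the two contributions above (our own sketch; the generator and helper names are assumptions chosen for the example), note that when cells are compared through $D_F(x, p_i)$, the $F(x)$ terms cancel, so the bisectors are hyperplanes and that diagram is affine; the sketch also shows a lifting of a point $x$ to $(x, F(x))$ on the graph of $F$, one dimension up:

```python
import numpy as np

# Illustrative generator: negative Shannon entropy (coordinates must be > 0).
F  = lambda x: np.sum(x * np.log(x))
gF = lambda x: np.log(x) + 1.0        # gradient of F

def D(x, y):
    """Bregman divergence D_F(x, y) = F(x) - F(y) - <x - y, gradF(y)>."""
    return F(x) - F(y) - np.dot(x - y, gF(y))

p, q = np.array([0.2, 0.8]), np.array([0.6, 0.4])

# Comparing cells through D(x, p) cancels the F(x) term, so
# D(x, p) - D(x, q) = <a, x> + b is affine in x (hyperplane bisector):
a = gF(q) - gF(p)
b = F(q) - F(p) + np.dot(p, gF(p)) - np.dot(q, gF(q))

x = np.array([0.3, 0.5])
print(np.isclose(D(x, p) - D(x, q), np.dot(a, x) + b))   # True: affine in x

# With x in the second argument the F(x) terms do not cancel,
# which is why the other diagram is curved in general.

# Lifting of a point onto the graph of F, using one extra dimension.
lift = lambda x: np.append(x, F(x))
print(lift(x))
```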