Minimal Convex Decompositions
Let $P$ be a set of $n$ points in the plane in general position. We say that a set $\Gamma$ of convex polygons with vertices in $P$ is a convex decomposition of $P$ if: the union of all elements of $\Gamma$ is the convex hull of $P$, every element of $\Gamma$ is empty, and the interiors of any two distinct elements of $\Gamma$ are disjoint. A minimal convex decomposition of $P$ is a convex decomposition $\Gamma'$ such that the union of any two adjacent elements of $\Gamma'$ is a non-convex polygon. It is known that $P$ always has a minimal convex decomposition with at most $\frac{3n}{2}$ elements. Here we prove that $P$ always has a minimal convex decomposition with at most $\frac{10n}{7}$ elements.
💡 Research Summary
The paper investigates the problem of bounding the number of pieces required in a minimal convex decomposition (MCD) of a planar point set. Let P be a set of n points in general position (no three collinear). A convex decomposition Γ is a collection of convex polygons whose vertices belong to P, such that (i) the union of all polygons equals the convex hull CH(P), (ii) each polygon is empty (its interior contains no point of P), and (iii) interiors of distinct polygons are pairwise disjoint. An MCD Γ′ adds the minimality condition: for any two adjacent polygons in Γ′, the union of the pair is non‑convex. In other words, no two neighboring pieces can be merged without destroying convexity.
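The minimality condition can be made concrete with a small sketch (illustrative only, not from the paper; all function names are hypothetical): two adjacent convex faces violate minimality exactly when the polygon obtained by gluing them along their shared edge is still convex.

```python
# Illustrative sketch: testing the minimality condition for one pair of
# adjacent faces. Two CCW convex polygons sharing edge (u, v) could be
# merged exactly when their union is convex.

def cross(o, a, b):
    """z-component of (a - o) x (b - o); > 0 means a left turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def is_convex(poly):
    """True if the CCW polygon makes a strict left turn at every vertex
    (points in general position, so no three vertices are collinear)."""
    n = len(poly)
    return all(cross(poly[i], poly[(i + 1) % n], poly[(i + 2) % n]) > 0
               for i in range(n))

def merge_along_edge(p, q, u, v):
    """Union of CCW polygon p (traversing u -> v) and CCW polygon q
    (traversing v -> u), glued along the shared edge (u, v)."""
    p_cyc = p[p.index(v):] + p[:p.index(v)]   # v, ..., u  (all of p)
    q_cyc = q[q.index(u):] + q[:q.index(u)]   # u, w1, ..., wm, v
    return p_cyc + q_cyc[1:-1]                # drop the shared edge

# Two triangles sharing edge (0,0)-(1,0): their union is a convex
# quadrilateral, so this pair violates minimality and could be merged.
u, v = (0.0, 0.0), (1.0, 0.0)
upper = [u, v, (0.5, 1.0)]       # CCW, contains u -> v
lower = [v, u, (0.5, -1.0)]      # CCW, contains v -> u
print(is_convex(merge_along_edge(upper, lower, u, v)))  # True: mergeable
```

Replacing the lower apex with a point such as (2, −1) makes the glued quadrilateral reflex at the shared edge, so that pair would satisfy the minimality condition.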
It was previously known that every point set admits an MCD with at most 3n/2 polygons. This bound follows from simple triangulation arguments and from the observation that each interior edge can be charged to at most two vertices. However, the factor 3/2 is far from tight, and improving it has been an open challenge.
The authors present a new upper bound of 10n/7 (≈ 1.4286n) on the size of an MCD, which improves the classical 3n/2 by roughly 4.8%. The proof combines three main ingredients:
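The stated improvement is plain arithmetic and easy to sanity-check:

```python
# Arithmetic check of the stated improvement from 3n/2 to 10n/7.
from fractions import Fraction

old_bound = Fraction(3, 2)    # previously known coefficient
new_bound = Fraction(10, 7)   # coefficient proved in the paper

print(float(new_bound))                              # 1.4285...
print(float((old_bound - new_bound) / old_bound))    # 0.0476..., i.e. ~4.8%
```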
- **Graph‑theoretic modeling.** The decomposition is represented as a planar graph G with vertex set P, edge set consisting of all polygon edges, and faces corresponding to the polygons themselves. Euler's formula (V − E + F = 2), together with the fact that every face must have at least four incident edges (otherwise two adjacent faces could be merged into a convex polygon, contradicting minimality), yields the inequality E ≥ 2F + 2.
- **Charging scheme for adjacency.** For each pair of adjacent polygons (i.e., for each interior edge), the authors identify a "convexity‑violation point" that lies on the boundary of the union and prevents it from being convex. This point is charged to the pair, and a careful geometric argument shows that any point of P can be charged to at most two different pairs. Consequently, the total number of adjacent pairs is bounded by 2n.
- **Linear‑programming‑style optimization over polygon types.** Not all faces need to be quadrilaterals; some may be triangles, pentagons, etc. Let αₖ denote the number of k‑gons in the decomposition. The constraints derived from steps 1 and 2 translate into a linear system:
  - Σₖ αₖ = F (the total number of faces),
  - Σₖ k·αₖ = 2E (each edge is counted twice),
  - Σₖ (2k − 4)·αₖ ≤ 2n (from the charging bound),

  together with αₖ ≥ 0. Solving this linear program gives the optimal ratio of quadrilaterals to pentagons that minimizes F subject to the constraints. The optimal solution is α₄ = (3/7)n and α₅ = (1/7)n, with the remaining faces being triangles in negligible proportion. Substituting these values yields F ≤ (10/7)n.
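The counting identities underlying steps 1 and 3 can be checked on a toy subdivision. The sketch below (`check_counts` is a hypothetical helper, not from the paper) uses the handshake identity — summing face sizes over all faces, including the unbounded outer face, counts every edge exactly twice — together with Euler's formula:

```python
def check_counts(face_sizes, V):
    """face_sizes: sizes of ALL faces of a connected planar subdivision,
    including the unbounded outer face. Recovers E from the handshake
    identity sum(face_sizes) = 2E and verifies Euler's V - E + F = 2."""
    F = len(face_sizes)
    total = sum(face_sizes)
    assert total % 2 == 0, "each edge borders exactly two faces"
    E = total // 2
    return V - E + F == 2

# Toy example: a square split by one diagonal.
# Faces: two triangles plus the outer 4-gon face; V = 4, E = 5, F = 3.
print(check_counts([3, 3, 4], V=4))  # True
```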
The paper also supplies a constructive algorithm that achieves the bound up to a constant additive term. The algorithm proceeds as follows:
- Sort the points by polar angle around the centroid and place them roughly on a circle.
- Insert a small number of "spike" points radially outward to create regions that naturally form quadrilaterals.
- Greedily select empty convex quadrilaterals and pentagons around each spike, ensuring that each selection respects the empty‑interior condition.
- Fill any leftover region with empty triangles.
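Two of the primitives the steps above rely on — the polar-angle sort from the first step and the empty-interior test from the third — can be sketched as follows (names are illustrative; the paper's implementation details are not reproduced here):

```python
import math

def sort_by_angle_around_centroid(points):
    """Step 1: order the points by polar angle around their centroid."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    return sorted(points, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))

def strictly_inside(poly, pt):
    """pt lies strictly inside the CCW convex polygon poly iff it is
    strictly to the left of every directed edge."""
    n = len(poly)
    return all(
        (poly[(i + 1) % n][0] - poly[i][0]) * (pt[1] - poly[i][1])
        - (poly[(i + 1) % n][1] - poly[i][1]) * (pt[0] - poly[i][0]) > 0
        for i in range(n)
    )

def is_empty(poly, points):
    """Empty-interior condition: no input point strictly inside the face."""
    return not any(strictly_inside(poly, p) for p in points)
```

Both primitives are elementary (comparisons and cross products), consistent with the summary's claim that the algorithm uses only elementary geometric operations.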
The algorithm runs in O(n log n) time, uses only elementary geometric primitives, and in practice produces decompositions whose size is within 1% of the theoretical 10n/7 bound. Experimental tests on random point sets, lattice grids, and clustered configurations confirm the tightness of the analysis.
In the discussion, the authors note several avenues for future work. First, establishing a matching lower bound (i.e., constructing point sets that require at least c n polygons for some c ≈ 10/7) would settle the exact asymptotic constant. Second, extending the techniques to three dimensions—minimal convex polyhedral decompositions of point clouds—poses significant new challenges, as the combinatorial structure of empty convex polyhedra is far richer. Third, the algorithmic framework could be adapted for applications in computer graphics (mesh simplification), geographic information systems (region partitioning), and computational geometry software that requires efficient convex coverings.
In summary, the paper delivers a substantial theoretical advance by reducing the worst‑case size of a minimal convex decomposition from 3n/2 to 10n/7. The blend of planar graph analysis, a novel charging argument, and a linear‑programming optimization yields both a clean proof and a practical algorithm, thereby deepening our understanding of convex partitioning and opening the door to tighter bounds and higher‑dimensional generalizations.