Limits and Impossibility of Core Stability in Non-Centroid Clustering under the Max-Loss Objective

Reading time: 6 minutes

📝 Original Info

  • Title: Limits and Impossibility of Core Stability in Non-Centroid Clustering under the Max-Loss Objective
  • ArXiv ID: 2511.19107
  • Date: 2025-11-25
  • Authors: Eva Michelle (lead author); co-authors: the other researchers listed in the paper (note: Caragiannis, Micha, and Shah are cited as prior work)

📝 Abstract

We study core stability in non-centroid clustering under the max-loss objective, where each agent's loss is the maximum distance to other members of their cluster. We prove that for all $k\geq 3$ there exist metric instances with $n\ge 9$ agents, with $n$ divisible by $k$, for which no clustering lies in the $\alpha$-core for any $\alpha<2^{\frac{1}{5}}\sim 1.148$. The bound is tight for our construction. Using a computer-aided proof, we also identify a two-dimensional Euclidean point set whose associated lower bound is slightly smaller than that of our general construction. This is, to our knowledge, the first impossibility result showing that the core can be empty in non-centroid clustering under the max-loss objective.

💡 Deep Analysis

[Figure 1: the counterexample construction; see the caption in the full text below.]

📄 Full Content

Clustering is a fundamental task in data analysis, optimization, and beyond: we are given n points (or agents) with the goal of partitioning these points into k groups, the so-called clusters. Most works in clustering focus on the centroid model. In this model, each cluster has a representative center. For a given point, the loss of this point is its distance to the nearest center, or equivalently to the center assigned to its partition. Recent work in clustering has introduced and studied many variations of proportional fairness [Chen, Fain, Lyu, and Munagala, 2019, Micha and Shah, 2020, Aziz, Lee, Chu, and Vollen, 2024, Kellerhals and Peters, 2024], where a clustering is considered proportional if there is no coalition of at least n/k agents that can agree on a new representative which strictly reduces the loss of each coalition member.

The above-mentioned work solely focuses on centroid clustering, which always selects a representative for each of the clusters. However, there are also settings where clusters do not have natural representatives, for example in settings of team formation [Hajduková, 2006, Woeginger, 2013] or clustered federated learning [Sattler, Müller, and Samek, 2021]. In these non-centroid (representative-free) settings, an agent’s loss depends only on distances to other members of its cluster. Caragiannis, Micha, and Shah [2024] extended proportional fairness to this setting: a (non-centroid) clustering is in the α-core if no coalition of at least n/k agents can form a new cluster in which each member’s loss decreases by a factor of more than α.

Among other possible definitions of loss, Caragiannis et al. [2024] consider the max-loss objective, where each agent’s loss in a cluster is the maximum distance to another agent in the cluster. For this objective, they show that there always exists a clustering which is in the 2-core. Moreover, they state the following open problem.

Question 1 (Caragiannis et al. [2024]). For the max-loss objective, does there always exist a clustering in the 1-core?

The same question was repeated by Cookson, Shah, and Yu [2024, page 8] (who studied how to combine both proportional centroid and non-centroid clustering).

As our main contribution, we answer the question in the negative.

Theorem 1. For every k ≥ 3 there exists a metric space with n ≥ 9 agents, where n is divisible by k, for which no k-clustering is in the α-core under the max-loss objective, for any $\alpha < 2^{1/5} < 1.149$.
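As a quick numeric sanity check on the constant (illustrative, not from the paper):

```python
# The lower-bound constant from Theorem 1 is 2^(1/5).
bound = 2 ** (1 / 5)
print(f"{bound:.4f}")  # prints 1.1487, so indeed 1.148 < 2^(1/5) < 1.149
```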

The metric space we use to prove Theorem 1 is not Euclidean, whereas most metric spaces considered in the clustering setting are. Using a computer-aided proof, we were able to find a set of points in two-dimensional Euclidean space which yield a slightly smaller lower bound. These points were obtained by optimizing a system of inequalities and are based on the metric space used for proving Theorem 1.

Theorem 2. For k = 3 there is a set of n = 9 points in two-dimensional Euclidean space for which no k-clustering is in the α-core under the max-loss objective, for any α < 1.138.

The points for the counterexample, along with a Python script for verifying the claim, can be found in Appendix A.

Notation. Let (N, d) be a (pseudo)metric space with a distance function $d : N \times N \to \mathbb{R}_{\geq 0}$ on the set N of n agents. For an agent $i$ and a coalition $S \subseteq N$ containing $i$, the max-loss of $i$ in $S$ is $\mathrm{loss}_i(S) := \max_{j \in S} d(i, j)$. A k-clustering $C = (C_1, \dots, C_k)$ partitions N into k clusters; $C(i)$ denotes the cluster containing agent $i$.

For a clustering C, we write $\mathrm{loss}_i(C) := \mathrm{loss}_i(C(i))$.

We say that a coalition $S \subseteq N$ of agents α-blocks a clustering C if $\alpha \cdot \mathrm{loss}_i(S) < \mathrm{loss}_i(C)$ for all $i \in S$. A k-clustering C is in the α-core for α ≥ 1 if there is no α-blocking coalition of size at least n/k. We refer to the 1-core simply as the core.
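These definitions can be checked by brute force on small instances. The sketch below (hypothetical helper names, not the authors' Appendix A script) computes the max-loss and searches for an α-blocking coalition of size at least n/k:

```python
from itertools import combinations

def max_loss(i, S, d):
    """Max-loss of agent i in coalition S: largest distance to a member of S."""
    return max(d[i][j] for j in S)

def blocking_coalition(clustering, d, n, k, alpha=1.0):
    """Return an alpha-blocking coalition of size >= n/k, or None if the
    clustering is in the alpha-core.

    clustering: dict mapping each agent to its cluster (a tuple of agents).
    d: symmetric n x n distance matrix (list of lists).
    """
    min_size = -(-n // k)  # ceil(n/k)
    for size in range(min_size, n + 1):
        for S in combinations(range(n), size):
            # Every member's loss must strictly improve by a factor > alpha.
            if all(alpha * max_loss(i, S, d) < max_loss(i, clustering[i], d)
                   for i in S):
                return S
    return None
```

On a toy line metric with four points at positions 0, 1, 10, 11, the clustering pairing the two close pairs admits no blocking coalition, while the clustering that splits them is blocked by the coalition of the first two points.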

Fix k = 3 and let n ≥ 9 be divisible by 3. We create three sets of agents, $G_1$, $G_2$, and $G_3$, which we will henceforth call groups, with $|G_1| + |G_2| + |G_3| = n - 4$ and $d(i, j) = 0$ for each $i, j \in G_t$ and each $t \in \{1, 2, 3\}$. The remaining four agents will be called a, b, c, and w. We have the following distances for all agent-group pairs (as illustrated in Figure 1).

Moreover, we have d(a, b) = 1, and agent w is very far away from all other agents, that is, d(w, i) = M for all i ≠ w and a sufficiently large M ≫ 1. The remaining undefined distances can be chosen as the lengths of shortest paths on the weighted graph induced by the specified edges.
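This shortest-path completion is a standard Floyd–Warshall computation. A minimal sketch, using illustrative edge weights rather than the paper's actual distances:

```python
def shortest_path_metric(n, edges):
    """Complete a partial distance specification to a shortest-path
    (pseudo)metric via Floyd-Warshall.

    edges: dict mapping (i, j) pairs to edge lengths; unspecified pairs
    become shortest-path distances on the induced weighted graph.
    """
    INF = float("inf")
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for (i, j), w in edges.items():
        d[i][j] = min(d[i][j], w)  # enforce symmetry
        d[j][i] = min(d[j][i], w)
    for m in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][m] + d[m][j] < d[i][j]:
                    d[i][j] = d[i][m] + d[m][j]
    return d
```

By construction the result satisfies the triangle inequality, so specifying only the edges shown in Figure 1 is enough to define a valid (pseudo)metric.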

In our proof, we focus on the following five coalitions of size n/k:

Figure 1: Counterexample for non-centroid max-loss: edge labels specify their length; the distance between any two points is the length of the shortest path.

For the sets $S_1$, $S_2$, $S_4$, and $S_5$, and each agent i within that set, we have

, and $\mathrm{loss}_i(S_5) = 2^{3/5}$.

For $S_3$ and each $i \in S_3 \setminus \{a\}$ we have $\mathrm{loss}_i(S_3) = 2$, while the loss of agent a is $\mathrm{loss}_a(S_3) = 1$.

Let $C = (C_1, C_2, C_3)$ be any 3-clustering of the gadget. One cluster must contain w, and any such cluster has a loss of M for all its members, unless w is placed alone as a singleton. Observe that the cluster containing w contains at most n/k members; otherwise the cluster’s agents without w form a blocking coalition for an α-core, with α ≥ M


Reference

This content is AI-processed based on open access ArXiv data.
