Real Preferences Under Arbitrary Norms
Whether the goal is to analyze voting behavior, locate facilities, or recommend products, the problem of translating between (ordinal) rankings and (numerical) utilities arises naturally in many contexts. This task is commonly approached by representing both the individuals doing the ranking (voters) and the items to be ranked (alternatives) in a shared metric space, where ordinal preferences are translated into relationships between pairwise distances. Prior work has established that any collection of rankings with $n$ voters and $m$ alternatives (a preference profile) can be embedded into $d$-dimensional real space for $d \geq \min\{n, m-1\}$ under the Euclidean norm and the Manhattan norm. We show that this holds for all $p$-norms and establish that any pair of rankings can be embedded into $\mathbb{R}^2$ under arbitrary norms, significantly expanding the reach of spatial preference models.
💡 Research Summary
The paper investigates the fundamental problem of converting ordinal rankings into numerical utilities by embedding voters and alternatives into a normed vector space such that the order of distances reflects the order of preferences. This “rank‑preserving embedding” has been studied for Euclidean (ℓ₂) and Manhattan (ℓ₁) distances, where it is known that any profile with n voters and m alternatives can be embedded in ℝ^d whenever d ≥ min{n, m − 1}. However, nothing was known for general p‑norms or for arbitrary norms that are not of the ℓ_p form.
The authors first formalize rank‑preserving embeddings for any norm and then focus on p‑norms (1 ≤ p ≤ ∞). They introduce the “alternative‑rank (AR) embedding”: each voter i is placed at c·e_i (where e_i is the i‑th standard basis vector) and each alternative j at the vector (−rk_i(j))_i, where rk_i(j) is the rank of alternative j for voter i. By expanding the p‑norm distance ‖v_i − a_j‖_p^p, they show that the term involving the scaling constant c dominates the rank‑dependent terms when c is sufficiently large. Because the gap (c+2)^p − (c+1)^p is strictly increasing in c for p > 1, a suitable c always exists, guaranteeing that the distance ordering matches the preference ordering for any p‑norm with 1 < p < ∞. The case p = ∞ is handled by noting that, for large enough c, the ℓ∞ distance reduces to the dominant coordinate c + rk_i(j), which is strictly increasing in the rank. For p = 1 the same argument fails, since the gap (c+2) − (c+1) = 1 is constant in c; the authors therefore revert to the known “max‑rank embedding” from Chen et al., which makes the ℓ₁ distance a linear function of the rank.
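The AR construction is easy to check numerically. The sketch below (plain Python; the profile, the 0-indexed preference-list encoding, and the constant c = 100 are illustrative choices, not taken from the paper) builds the embedding and verifies rank preservation for several p-norms, and also exhibits the ℓ₁ failure: under ℓ₁ the AR distance collapses to c plus the alternative's total rank sum, the same for every voter.

```python
def ar_embedding(profile, c):
    """AR embedding: voter i at c*e_i, alternative j at (-rk_i(j))_i,
    where rk_i(j) is j's 1-based rank in voter i's preference list."""
    n, m = len(profile), len(profile[0])
    rank = [[row.index(j) + 1 for j in range(m)] for row in profile]
    voters = [[c if k == i else 0.0 for k in range(n)] for i in range(n)]
    alts = [[-float(rank[i][j]) for i in range(n)] for j in range(m)]
    return voters, alts

def p_dist(x, y, p):
    return sum(abs(a - b) ** p for a, b in zip(x, y)) ** (1.0 / p)

def rank_preserving(profile, voters, alts, p):
    """True iff every voter is strictly closer to every alternative
    they rank higher than to each alternative ranked below it."""
    for i, order in enumerate(profile):
        d = [p_dist(voters[i], alts[j], p) for j in range(len(alts))]
        if any(d[order[r]] >= d[order[r + 1]] for r in range(len(order) - 1)):
            return False
    return True

# profile[i] lists voter i's alternatives from most to least preferred
profile = [[0, 1, 2, 3], [3, 2, 1, 0], [1, 3, 0, 2]]
voters, alts = ar_embedding(profile, c=100.0)
print(all(rank_preserving(profile, voters, alts, p) for p in (1.5, 2, 3, 10)))  # True
print(rank_preserving(profile, voters, alts, 1))  # False: l1 sees only rank sums
```

With c = 100 the dominant-coordinate gap per rank step (roughly p·c^{p−1}) exceeds the at most (n−1)(m^p − 1) variation from the other coordinates for each p tested, which is exactly the dominance argument from the summary above.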
Next, they address the dimension bound that depends on the number of alternatives m. Using a geometric lemma about separating unit vectors by a hyperplane in ℓ_p space (1 < p < ∞), they construct an embedding into ℝ^{m‑1} that respects the rank ordering. The lemma essentially generalizes the orthogonal separation used in the Euclidean case to arbitrary p‑norms. By placing alternatives on distinct directions within an (m‑1)‑dimensional subspace and voters on scaled basis vectors, they again ensure that the distance ordering coincides with the preference ordering, establishing that d ≥ m − 1 suffices for all p‑norms.
A particularly striking result is that when there are only two voters, any norm (even a non‑ℓ_p norm, such as a weighted sum of norms) admits a rank‑preserving embedding in ℝ². The construction places the two voters at (c, 0) and (0, c) and positions each alternative on the intersection curve of two norm balls centered at the voters. Because norm balls are centrally symmetric, the intersection curve is continuous and can be traversed in any order, allowing the alternatives to be placed so as to reflect any pair of rankings. This geometric argument avoids heavy algebra and works for arbitrary norms.
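The two-voter picture can be illustrated numerically. The sketch below is an illustration in the spirit of this construction, not the paper's exact recipe: it uses an example non-ℓ_p norm (ℓ₁ plus twice ℓ₂; a positive sum of norms is again a norm), places the voters at (c, 0) and (0, c), and finds each alternative on the intersection of two norm spheres by bisection. The radii R0 + rank, and the values c = 4, R0 = 10, are assumptions chosen to make the spheres intersect.

```python
import math

def norm(x):
    # example non-l_p norm: l1 + 2*l2 (a positive sum of norms is a norm)
    return abs(x[0]) + abs(x[1]) + 2.0 * math.hypot(x[0], x[1])

def on_sphere(center, r, theta):
    # point at norm-distance r from center, in Euclidean direction theta
    u = (math.cos(theta), math.sin(theta))
    s = r / norm(u)
    return (center[0] + s * u[0], center[1] + s * u[1])

def intersect(v1, v2, r1, r2):
    """Point at norm-distance r1 from v1 and r2 from v2, found by bisecting
    the angle on the sphere around v1. Assumes |r1 - r2| <= D <= r1 + r2
    with D = norm(v2 - v1), so f changes sign and a root exists by continuity."""
    toward = math.atan2(v2[1] - v1[1], v2[0] - v1[0])
    def f(th):
        p = on_sphere(v1, r1, th)
        return norm((p[0] - v2[0], p[1] - v2[1])) - r2
    lo, hi = toward, toward + math.pi  # f(lo) = D - r1 - r2 <= 0 <= f(hi) = D + r1 - r2
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if f(mid) <= 0.0:
            lo = mid
        else:
            hi = mid
    return on_sphere(v1, r1, 0.5 * (lo + hi))

c, R0 = 4.0, 10.0
v1, v2 = (c, 0.0), (0.0, c)
rank1, rank2 = [1, 2, 3], [3, 2, 1]  # two opposite rankings of 3 alternatives
alts = [intersect(v1, v2, R0 + rank1[j], R0 + rank2[j]) for j in range(3)]
# each voter's distances now increase with that voter's own ranking:
d1 = [norm((a[0] - v1[0], a[1] - v1[1])) for a in alts]  # ~[11, 12, 13]
d2 = [norm((a[0] - v2[0], a[1] - v2[1])) for a in alts]  # ~[13, 12, 11]
```

Because each alternative sits at norm-distance R0 + rk_i(j) from voter i, the distance ordering reproduces both rankings simultaneously, which is the essence of the two-voter result.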
The main theorem (Theorem 1) therefore states: for any profile with n voters and m alternatives, and for any norm (including all ℓ_p with 1 ≤ p ≤ ∞), a rank‑preserving embedding exists in ℝ^d whenever d ≥ min{n, m − 1}. The authors also analyze the scaling constant c needed for the AR embedding, showing that as p approaches 1 from above, c grows exponentially in 1/(p − 1), explaining why the ℓ₁ case requires a different construction.
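The exponential growth of c admits a quick back-of-the-envelope explanation (a sketch consistent with the dominance argument above, not the paper's exact bound). For 1 < p, moving an alternative down one position in voter i's ranking changes the dominant coordinate of the p-th power of the AR distance by, using the mean value theorem,
\[
(c + r + 1)^p - (c + r)^p \;\ge\; p\,(c+1)^{p-1},
\]
while the remaining coordinates vary by at most $(n-1)(m^p - 1)$. It therefore suffices that
\[
p\,(c+1)^{p-1} \;>\; (n-1)\,(m^p - 1),
\qquad\text{i.e.}\qquad
c \;\gtrsim\; \Bigl(\tfrac{(n-1)(m^p-1)}{p}\Bigr)^{1/(p-1)},
\]
which grows like $\exp\bigl(\Theta(1)/(p-1)\bigr)$ as $p \to 1^{+}$, matching the reported exponential blow-up and the need for a separate ℓ₁ construction.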
Finally, the paper outlines open problems. While the two‑voter case is settled for all norms in two dimensions, the authors conjecture (Conjecture 1) that the same dimension bound holds for arbitrary norms in higher dimensions, but a proof is currently missing. They suggest that extending the geometric separation techniques or developing new algebraic tools may be required.
Overall, the work significantly broadens the theoretical foundation of spatial preference models. By proving that the classic dimension bound holds for the full family of p‑norms and, remarkably, for any norm when only two voters are present, it opens the door to using non‑Euclidean distance measures in voting theory, facility location, recommender systems, and AI alignment. The constructive nature of the proofs also provides explicit recipes for building such embeddings, which could be valuable for practical algorithm design.