Combinatorial Optimization using Comparison Oracles


In linear combinatorial optimization, we aim to find $S^* = \arg\min_{S \in \mathcal{F}} \langle w,\mathbf{1}_S \rangle$ for a family $\mathcal{F} \subseteq 2^U$ over a ground set $U$ of $n$ elements. Traditionally, $w$ is known or accessible via a value oracle. Motivated by practical applications involving pairwise preferences, we study the weaker and more robust comparison oracle, which for any $S, T \in \mathcal{F}$ reveals only whether $w(S) <, =, \text{ or } > w(T)$. We investigate the query complexity and computational efficiency of optimizing in this model. We present three main contributions. (1) Query Complexity: We establish that the query complexity for any set system $\mathcal{F} \subseteq 2^U$ is $\tilde{O}(n^2)$. This demonstrates a fundamental separation between information and computational complexity, as the runtime may still be exponential for NP-hard problems. (2) Algorithmic Frameworks: We develop two general tools. First, a Dual Ellipsoid framework establishes an efficient reduction from optimization to certification: to optimize efficiently, it suffices to efficiently certify a candidate’s optimality using only comparisons. Second, Global Subspace Learning (GSL) sorts all feasible sets using $O(nB \log(nB))$ queries for integer weights bounded by $B$. We efficiently implement GSL for linear matroids, yielding improved query complexities for problems like $k$-SUM, SUBSET-SUM, and $A+B$ sorting. (3) Combinatorial Applications: We give the first polynomial-time, low-query algorithms for classic problems, including minimum cuts, minimum-weight spanning trees (and matroid bases), bipartite matching (and matroid intersection), and shortest $s$-$t$ paths. Our work provides the first general query complexity bounds and efficient algorithmic results for this fundamental model.


💡 Research Summary

This paper studies linear combinatorial optimization when the algorithm has access only to a comparison oracle: given two feasible sets S and T, the oracle returns whether w(S) < w(T), w(S) = w(T), or w(S) > w(T), where w is an unknown weight vector on the ground set U of size n. The goal is to find the optimal feasible set S* = arg min_{S∈𝔽} w(S). The authors address two fundamental questions: (i) how many comparison queries suffice to identify S* (information or query complexity), and (ii) whether this can be done in polynomial time (computational complexity).
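As a concrete baseline for the model, the sketch below (Python; `make_comparison_oracle` and `argmin_by_comparisons` are illustrative names, not from the paper) finds S* by brute force with |𝔽| − 1 comparisons. The paper's point is that far fewer, roughly Õ(n²), queries suffice even when |𝔽| is exponential in n.

```python
import itertools

def make_comparison_oracle(w):
    """Comparison oracle for a hidden weight vector w: given feasible
    sets S, T (frozensets of element indices), reveal only the sign of
    w(S) - w(T), never the weights themselves."""
    def oracle(S, T):
        wS, wT = sum(w[e] for e in S), sum(w[e] for e in T)
        return (wS > wT) - (wS < wT)   # -1, 0, or +1
    return oracle

def argmin_by_comparisons(family, oracle):
    """Brute-force baseline: |family| - 1 queries find the minimum."""
    best = family[0]
    for S in family[1:]:
        if oracle(S, best) < 0:
            best = S
    return best

# Toy instance: all 2-element subsets of a 4-element ground set.
w = [3, 1, 4, 1]                       # hidden from the algorithm
family = [frozenset(p) for p in itertools.combinations(range(4), 2)]
oracle = make_comparison_oracle(w)
print(sorted(argmin_by_comparisons(family, oracle)))  # → [1, 3]
```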

1. Query Complexity.
The first main result shows that for any family 𝔽 ⊆ 2^U, Õ(n²) comparison queries suffice to recover the optimal set, regardless of the size or structure of 𝔽. The proof adapts the “inference dimension” technique of KLMZ17 to linear optimization, introducing a conic analogue called the conic dimension of a point set. For the Boolean hypercube {0,1}ⁿ, the conic dimension is O(n log n), which yields the Õ(n²) bound. More generally, for any point set P ⊆ ℝᵈ with conic dimension k, the optimal point can be found with O(k log k log|P|) comparisons (Theorem 1.2). This establishes a clear separation: the optimal solution of an NP‑hard problem can be identified with near‑quadratically many queries, yet computing it may still require exponential time under standard complexity assumptions.
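A minimal numpy illustration of the conic flavor of the argument (the sets and weights below are invented for illustration): each answered comparison constrains the hidden w to a halfspace ⟨w, 1_S − 1_T⟩ < 0, and any nonnegative (conic) combination of answered difference vectors yields a comparison outcome that is inferred for free.

```python
import numpy as np

def indicator(S, n):
    """0/1 indicator vector of a subset S of {0, ..., n-1}."""
    v = np.zeros(n)
    v[list(S)] = 1.0
    return v

n = 5
S, T, R = {0, 1}, {1, 2}, {2, 3}
dST = indicator(S, n) - indicator(T, n)  # answer w(S) < w(T): <w, dST> < 0
dTR = indicator(T, n) - indicator(R, n)  # answer w(T) < w(R): <w, dTR> < 0

# The conic combination dST + dTR equals the difference vector of the
# *unqueried* pair (S, R), so <w, dSR> < 0 holds for every w consistent
# with the two answers: w(S) < w(R) is known without a third query.
dSR = indicator(S, n) - indicator(R, n)
assert np.allclose(dST + dTR, dSR)
```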

2. Optimization via Certification (Dual Ellipsoid).
The second contribution is a reduction from optimization to a certification problem. The authors define a Conic‑Certification Oracle (CCO) that, given a candidate solution x, either certifies that x is optimal with respect to the unknown w or provides a separating hyperplane (a violated inequality). Using a variant of the ellipsoid method—called the Dual Ellipsoid algorithm—they show that if a CCO can be implemented efficiently using only comparisons, then the optimal solution can be found with only O(d³·⌈log|P|⌉) calls to the CCO and polynomial additional work (Theorem 1.3). For Boolean linear optimization, they construct a deterministic CCO, yielding a deterministic algorithm that uses only poly(n) comparison queries.
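The shape of this reduction can be sketched as follows. This is a toy stand-in, not the actual Dual Ellipsoid method: the hypothetical `certify_or_improve` below plays the CCO's role by a linear scan over the family, whereas the paper's algorithm needs only O(d³·⌈log|P|⌉) CCO calls and never enumerates 𝔽.

```python
import itertools

def make_oracle(w):
    """Comparison oracle hiding the weight vector w."""
    def oracle(S, T):
        a, b = sum(w[e] for e in S), sum(w[e] for e in T)
        return (a > b) - (a < b)
    return oracle

def certify_or_improve(x, family, oracle):
    """Hypothetical stand-in for a Conic-Certification Oracle: either
    certify the candidate x optimal, or return a strictly better set
    (the analogue of a violated inequality)."""
    for T in family:
        if oracle(T, x) < 0:           # w(T) < w(x): x is not optimal
            return False, T
    return True, x

def optimize_via_certification(family, oracle):
    """Optimization reduced to repeated certification calls."""
    x = family[0]
    improved = True
    while improved:
        ok, x = certify_or_improve(x, family, oracle)
        improved = not ok
    return x

w = [2, 7, 1, 8]                       # hidden weights (toy instance)
family = [frozenset(p) for p in itertools.combinations(range(4), 2)]
best = optimize_via_certification(family, make_oracle(w))
print(sorted(best))  # → [0, 2]
```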

3. Global Subspace Learning (GSL) for Bounded Integer Weights.
When the weight vector w is integer‑valued with |w(e)| ≤ B, the authors present a more powerful algorithm called Global Subspace Learning. The key observation is that if two feasible sets S and T have equal weight, then the difference vector 1_S − 1_T lies in the orthogonal complement of w. By repeatedly comparing pairs of sets, the algorithm collects such difference vectors, builds their span A, and uses the orthogonal projection onto A to infer the weight class of many other sets without further queries. This yields an O(n B log(n B)) query algorithm that fully sorts all feasible sets by weight (Theorem 1.4). When 𝔽 consists of bases of a rank‑k linear matroid represented by a k × n matrix, the same technique leads to a polynomial‑time algorithm that sorts the bases and finds the minimum‑weight basis using O(n B log (n B)) queries (Theorem 1.5). The GSL framework improves upon previous O(n²) bounds for problems such as k‑SUM, SUBSET‑SUM, and sorting of sums A + B when B = o(n).
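The core GSL inference can be sketched in numpy on an invented toy instance: equal-weight answers contribute difference vectors orthogonal to w; once a fresh pair's difference vector lies in the learned span A, its comparison outcome is known with no further query.

```python
import numpy as np

def indicator(S, n):
    v = np.zeros(n)
    v[list(S)] = 1.0
    return v

def in_span(A, d, tol=1e-9):
    """Is d in the column span of A? Checked via the least-squares residual."""
    if A.shape[1] == 0:
        return np.allclose(d, 0, atol=tol)
    x, *_ = np.linalg.lstsq(A, d, rcond=None)
    return np.allclose(A @ x, d, atol=tol)

n = 4
w = np.array([1.0, 2.0, 2.0, 1.0])           # hidden; shown only to verify

# Two queried pairs were answered "equal weight"; their difference
# vectors are orthogonal to w and seed the learned subspace A.
d1 = indicator({0}, n) - indicator({3}, n)   # w({0}) = w({3})
d2 = indicator({1}, n) - indicator({2}, n)   # w({1}) = w({2})
assert w @ d1 == 0 and w @ d2 == 0
A = np.column_stack([d1, d2])

# A fresh pair whose difference lies in A: equality is inferred for free.
d3 = indicator({0, 1}, n) - indicator({2, 3}, n)
assert in_span(A, d3)    # d3 = d1 + d2, hence w({0,1}) = w({2,3})
```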

4. Concrete Algorithms for Classical Combinatorial Problems.
Leveraging the two frameworks, the paper provides low‑query, polynomial‑time algorithms for several classic problems:

  • Minimum Cut in Unweighted Graphs. A randomized algorithm computes the exact minimum cut with Õ(|V|) cut‑comparison queries and O(|V|²) time (Theorem 1.6). The same set of queries also enables reconstruction of the entire edge set (Theorem 1.7), except for a few degenerate small graphs where reconstruction is information‑theoretically impossible.

  • Weighted Minimum Cut with Few Distinct Degrees. If edge weights lie in

