Width-parameterized SAT: Time-Space Tradeoffs
Width parameterizations of SAT, such as tree-width and path-width, enable the study of computationally more tractable and practical SAT instances. We give two simple algorithms: one runs simultaneously in time-space $(O^*(2^{2tw(\phi)}), O^*(2^{tw(\phi)}))$, and the other in time-space $(O^*(3^{tw(\phi)\log{|\phi|}}), |\phi|^{O(1)})$, where $tw(\phi)$ is the tree-width of a formula $\phi$ with $|\phi|$ many clauses and variables. This partially answers the question of Alekhnovich and Razborov, who also gave algorithms exponential in both time and space, and asked whether the space can be made smaller. We conjecture that every algorithm for this problem that runs in time $2^{tw(\phi)\,\mathbf{o(\log{|\phi|})}}$ necessarily blows up the space to exponential in $tw(\phi)$. We introduce a novel way to combine the two simple algorithms that allows us to trade \emph{constant} factors in the exponents between running time and space. Our technique gives rise to a family of algorithms controlled by two parameters. By fixing one parameter we obtain, for every $0<\epsilon<1$, an algorithm that runs in time-space $(O^*(3^{1.441(1-\epsilon)tw(\phi)\log{|\phi|}}), O^*(2^{2\epsilon tw(\phi)}))$. We systematically study the limitations of this technique, and show that these algorithmic results are the best achievable using it. We also study further the computational complexity of width parameterizations of SAT. We prove non-sparsification lower bounds for formulas of path-width $\omega(\log|\phi|)$, and a separation between the complexity of path-width and tree-width parameterized SAT modulo plausible complexity assumptions.
💡 Research Summary
The paper investigates the computational trade‑offs between time and space for SAT when the instances are parameterized by graph‑theoretic width measures, namely tree‑width (tw) and path‑width (pw). The motivation stems from the observation that many real‑world SAT instances have small width, making width‑parameterized algorithms practically relevant. Alekhnovich and Razborov (2002) previously gave algorithms that run in $2^{O(tw)}$ time and $2^{O(tw)}$ space, and asked whether the space requirement could be reduced to polynomial while preserving a sub‑exponential time bound.
The authors first present two “simple” baseline algorithms. The first is a classic dynamic‑programming (DP) approach that, given a tree decomposition of width tw, enumerates all assignments for each bag and combines results bottom‑up. This algorithm runs in $O^*(2^{2tw})$ time and $O^*(2^{tw})$ space, matching the Alekhnovich–Razborov bound but with an explicit, easy‑to‑implement description. The second algorithm is recursive: at each recursion step a bag is chosen, all truth assignments to the variables in that bag are enumerated, and the formula is simplified accordingly. The recursion proceeds on the resulting independent sub‑instances. This yields a space‑efficient algorithm that uses only polynomial space, at the cost of a time bound of $O^*(3^{tw \log|\phi|})$. The logarithmic factor originates from the need to enumerate assignments for each bag and from the size of the formula $|\phi|$ (number of clauses plus variables).
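The first (dynamic‑programming) algorithm can be sketched concretely. The snippet below is a minimal illustration over a *path* decomposition rather than a general tree decomposition, which keeps the bookkeeping to a single left‑to‑right sweep; the input representation (bags as sets of variable names, each clause attached to a bag that contains all its variables) is our own choice for the example, not the paper's. At each step the algorithm enumerates all assignments to the current bag and keeps only those whose restriction to the separator (the variables shared with the previous bag) extends some surviving state — storing $O^*(2^{tw})$ states while spending $O^*(2^{2tw})$ time overall.

```python
from itertools import product

def sat_pathwidth_dp(bags, clauses_per_bag):
    """Decide SAT by DP over a path decomposition (illustrative sketch).

    bags: list of frozensets of variable names, consecutive bags overlapping.
    clauses_per_bag[i]: clauses (lists of (var, polarity) literals) whose
    variables all lie inside bags[i].
    Returns True iff the CNF formula is satisfiable.
    """
    def local_models(bag, clauses):
        # Enumerate all 2^{|bag|} assignments to the bag; yield those
        # satisfying every clause attached to this bag.
        vars_ = sorted(bag)
        for bits in product([False, True], repeat=len(vars_)):
            a = dict(zip(vars_, bits))
            if all(any(a[v] == pol for v, pol in c) for c in clauses):
                yield a

    # states: assignments to the current bag consistent with all bags so far,
    # stored as frozensets of (variable, value) pairs.
    states = {frozenset(a.items())
              for a in local_models(bags[0], clauses_per_bag[0])}
    for i in range(1, len(bags)):
        shared = bags[i - 1] & bags[i]
        # Project surviving states onto the separator variables.
        proj = {frozenset((v, b) for v, b in s if v in shared) for s in states}
        states = {
            frozenset(a.items())
            for a in local_models(bags[i], clauses_per_bag[i])
            if frozenset((v, a[v]) for v in shared) in proj
        }
        if not states:
            return False
    return bool(states)
```

The space usage is dominated by `states`, one entry per surviving bag assignment; a tree decomposition would combine child tables at join nodes instead of sweeping, but the per‑bag enumeration is the same.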
To move beyond the two extremes, the authors introduce a novel “infinite family of proof systems” that combine the DP and recursive strategies in a parameterized way. Two free parameters control the trade‑off: an integer k > 2 that determines the complexity of the inference rules (larger k means more aggressive branching, reducing time but increasing the amount of intermediate information stored), and a real number ε ∈ (0,1) that governs the discretization of the assignment space (larger ε allows more space to be used, reducing the time exponent). By varying these parameters, they obtain a continuum of algorithms with running time and space given by
$$\Bigl(O^*\bigl(3^{1.441(1-\epsilon)\,tw(\phi)\log|\phi|}\bigr),\; O^*\bigl(2^{2\epsilon\, tw(\phi)}\bigr)\Bigr), \qquad 0<\epsilon<1.$$