Multi-query quantum sums

Notice: This research summary and analysis were automatically generated using AI technology; for full accuracy, please refer to the original arXiv source.

PARITY is the problem of determining the parity of a string $f$ of $n$ bits, given access to an oracle that responds to a query $x\in\{0,1,\ldots,n-1\}$ with the $x^{\rm th}$ bit of the string, $f(x)$. Classically, $n$ queries are required to succeed with probability greater than $1/2$ (assuming equal prior probabilities for all length-$n$ bitstrings), but only $\lceil n/2\rceil$ quantum queries suffice to determine the parity with probability 1. We consider a generalization to strings $f$ of $n$ elements of $\mathbb{Z}_k$ and the problem of determining $\sum_x f(x)$. By constructing an explicit algorithm, we show that $n-r$ (for $n\ge r\in\mathbb{N}$) entangled quantum queries suffice to compute the sum correctly with worst-case probability $\min\{\lfloor n/r\rfloor/k,\,1\}$. This quantum algorithm uses the $n-r$ queries sequentially and adaptively, like Grover's algorithm, but in a different way that is not amplitude amplification.


💡 Research Summary

The paper studies the problem of computing the modular sum of the values of a function f defined on a finite domain ℤₙ with codomain ℤₖ, i.e., P_f = ∑_{x∈ℤₙ} f(x) (mod k). The classical PARITY problem (the special case k = 2) requires n oracle queries to achieve any advantage over random guessing, whereas a quantum algorithm can solve it exactly with ⌈n/2⌉ queries. Extending this to arbitrary k, the authors first recall the “Uselessness Theorem”: if 2q classical queries are useless, then q quantum queries are also useless. Consequently, ⌊(n−1)/2⌋ quantum queries provide no information for the SUM problem.
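The classical lower bound is easy to check directly for small n: under a uniform prior, any set of fewer than n queried bits leaves the unqueried bits uniform, so the posterior probability that the parity is even stays at exactly 1/2. A minimal brute-force sanity check (the function name is ours, not from the paper):

```python
from itertools import product
from fractions import Fraction

def parity_posterior(n, queried):
    # Uniform prior over all 2^n bitstrings; condition on the queried bits
    # (fixed to 0 here, w.l.o.g.) and return P(parity even | answers).
    consistent = [s for s in product((0, 1), repeat=n)
                  if all(s[i] == 0 for i in queried)]
    even = sum(1 for s in consistent if sum(s) % 2 == 0)
    return Fraction(even, len(consistent))

print(parity_posterior(5, [0, 1, 2, 3]))     # 4 of 5 bits known -> 1/2
print(parity_posterior(5, [0, 1, 2, 3, 4]))  # all bits known -> 1 (parity determined)
```

With any proper subset of the bits queried, the posterior is 1/2, so no guessing strategy beats chance; only the full n classical queries determine the parity.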

The authors illustrate the new techniques on the simplest non‑binary cases: two trits (n = 2, k = 3) and three trits (n = 3, k = 3). For two trits they construct a single‑query algorithm that succeeds with probability 2/3, using an entangled initial state and a specially designed unitary K that does not depend on f. For three trits they give a two‑query algorithm that succeeds with probability 1, employing additional unitaries J₁ and Jᵣ to manipulate the phase information so that after the second oracle call the state encodes the exact sum.
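Both worked examples already match the general bound min{⌊n/r⌋/k, 1} of Theorem 5 with r = 1 (i.e., n − 1 queries). A quick consistency check (the function name is ours):

```python
from math import floor

def worst_case_success(n, r, k):
    # Worst-case success probability claimed for n - r entangled queries.
    return min(floor(n / r) / k, 1.0)

print(worst_case_success(2, 1, 3))  # two trits, one query -> 2/3
print(worst_case_success(3, 1, 3))  # three trits, two queries -> 1.0
```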

Two lemmas underpin the general construction. Lemma 3 shows that a state of the form
|Aₛ⟩ = (1/√s) ∑_{ℓ=1}^{s} ω^{−ℓA}|ω^{s−ℓ}⟩
has measurement probability s/k of yielding the correct value A, and that the measurement outcome lies within ⌊k/(2s)⌋ of A with probability at least 4/π². Lemma 4 demonstrates that by partitioning the domain into r blocks, each of size ⌊n/r⌋ (for simplicity one may take r to divide n), the phase‑encoded state can be rewritten so that each component depends on only n − r function values, i.e., on all but one element per block.
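Lemma 3's success probability can be verified numerically: a k-dimensional state whose first s amplitudes carry the phases ω^{ℓA}, measured in the Fourier basis, yields A with probability exactly s/k, and lands within ⌊k/(2s)⌋ of A with probability at least 4/π². A sketch in NumPy, using the standard discrete Fourier transform convention (which may differ from the paper's basis labeling):

```python
import numpy as np

def outcome_probs(k, s, A):
    # Phase-encoded state (1/sqrt(s)) * sum_{l=0}^{s-1} w^{l*A} |l>,
    # with w = e^{2*pi*i/k}, measured in the Fourier basis.
    w = np.exp(2j * np.pi / k)
    state = np.zeros(k, dtype=complex)
    state[:s] = w ** (A * np.arange(s)) / np.sqrt(s)
    # Inverse-DFT matrix F[i, j] = w^{-i*j} / sqrt(k).
    F = np.array([[w ** (-i * j) for j in range(k)] for i in range(k)]) / np.sqrt(k)
    return np.abs(F @ state) ** 2

k, s, A = 7, 3, 4
p = outcome_probs(k, s, A)
print(abs(p[A] - s / k) < 1e-12)              # correct value with probability s/k
window = k // (2 * s)
near = sum(p[(A + d) % k] for d in range(-window, window + 1))
print(near >= 4 / np.pi ** 2)                 # within floor(k/(2s)) of A
```

Both checks print `True`; for s = k the superposition covers every phase and the measurement recovers A with certainty, matching min{s/k, 1}.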

Combining these lemmas yields the main algorithm (Theorem 5). For any integers n ≥ r ≥ 1, the algorithm uses exactly n − r entangled quantum queries. The procedure is:

  1. Prepare an entangled superposition over the r block indices and a set of Fourier‑basis states.
  2. Apply a sequence of operations K·(X⊗I)·O_f·…·Jᵣ·(X⊗I)·O_f·… repeated n − r times, where O_f is the oracle, X shifts the position register, and K, Jᵣ are fixed unitaries independent of f.
  3. After the last query, the second register is exactly in the form of Lemma 3 with s = ⌊n/r⌋. Measuring this register yields the correct sum with worst‑case probability min{⌊n/r⌋/k, 1}. Moreover, even when the outcome is wrong, it lies within ⌊kr/(2n)⌋ of the true sum with probability at least 4/π².
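The resulting tradeoff between query count and success probability is easy to tabulate. A small sketch (the function name is ours) enumerating all choices of r for fixed n and k:

```python
from math import floor

def tradeoff(n, k):
    # For each block count r: (queries used n - r,
    # worst-case success probability min(floor(n/r)/k, 1)).
    return [(n - r, min(floor(n / r) / k, 1.0)) for r in range(1, n + 1)]

for queries, p in tradeoff(6, 3):
    print(queries, round(p, 3))
```

For n = 6, k = 3 this prints success probability 1.0 down to r = 2 (4 queries), then 2/3 at r = 3, and 1/3, the random-guess baseline, once r exceeds n/2.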

When ⌊n/r⌋ ≥ k (i.e., r ≤ n/k), the algorithm achieves probability 1, because each of the r blocks then has size at least k, so Lemma 3 applies with s ≥ k. For larger r, the success probability degrades, but never below 1/k, matching the trivial random-guess bound.

The paper also compares this approach to a generalized van Dam algorithm (Theorem 6), which identifies the entire function f with high probability using q queries, where q ≈ n(k−1)/k + O(√n). While van Dam’s method can also compute the sum, it does so by reconstructing the whole function, thus requiring more queries for the same success probability. The authors argue that their algorithm is near‑optimal; they have a formal optimality proof only for the extreme case r = n − 1 (a single query), but empirical evidence suggests optimality across the full range.
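The query-count comparison can be made concrete: to reach success probability 1, the sum algorithm takes r = ⌊n/k⌋ (the largest r with ⌊n/r⌋ ≥ k) and so uses n − ⌊n/k⌋ ≈ n(k−1)/k queries, while the van Dam approach needs roughly n(k−1)/k + O(√n); the saving is the O(√n) term. A rough sketch (the constant hidden in the O(√n) estimate is not given in the summary and is set to 1 purely for illustration):

```python
from math import ceil, sqrt

def sum_queries(n, k):
    # Queries for success probability 1: n - r with r = floor(n/k),
    # the largest r satisfying floor(n/r) >= k (requires n >= k).
    return n - n // k

def van_dam_queries(n, k):
    # Estimate q ~ n(k-1)/k + O(sqrt(n)); hidden constant assumed
    # to be 1 here, purely for illustration.
    return ceil(n * (k - 1) / k + sqrt(n))

for n in (9, 30, 300):
    print(n, sum_queries(n, 3), van_dam_queries(n, 3))
```

Under this (assumed) constant, the gap grows like √n, e.g., 20 versus 26 queries at n = 30 for k = 3.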

In conclusion, the work introduces a novel quantum query framework that reduces the number of required queries for modular sum problems, provides explicit success‑probability guarantees, and demonstrates that adaptive, coherent query sequences can outperform amplitude‑amplification‑based techniques for this class of problems. The results broaden our understanding of quantum query complexity beyond decision problems and open avenues for further exploration of multi‑query quantum algorithms for other algebraic tasks.

