Tensor surgery and tensor rank
We introduce a method for transforming low-order tensors into higher-order tensors and apply it to tensors defined by graphs and hypergraphs. The transformation proceeds according to a surgery-like procedure that splits vertices, creates and absorbs virtual edges and inserts new vertices and edges. We show that tensor surgery is capable of preserving the low rank structure of an initial tensor decomposition and thus allows one to prove nontrivial upper bounds on tensor rank, border rank and asymptotic rank of the final tensors. We illustrate our method with a number of examples. Tensor surgery on the triangle graph, which corresponds to the matrix multiplication tensor, leads to nontrivial rank upper bounds for all odd cycle graphs, which correspond to the tensors of iterated matrix multiplication. In the asymptotic setting we obtain upper bounds in terms of the matrix multiplication exponent $\omega$ and the rectangular matrix multiplication parameter $\alpha$. These bounds are optimal if $\omega$ equals two. We also give examples that illustrate that tensor surgery on general graphs might involve the absorption of virtual hyperedges and we provide an example of tensor surgery on a hypergraph. Besides its relevance in algebraic complexity theory, our work has applications in quantum information theory and communication complexity.
💡 Research Summary
The paper introduces a novel technique called “tensor surgery” for transforming low‑order tensors into higher‑order tensors while preserving the low‑rank structure of an existing decomposition. The authors focus on tensors that arise from graphs and hypergraphs, denoted Tₙ(G), where each vertex corresponds to a tensor leg and each edge (or hyperedge) corresponds to a summed index shared between the incident legs. The central idea of tensor surgery is to take a tensor t whose rank (or border rank, or asymptotic rank) is known, split one of its legs into several legs, and then take a tensor product with another tensor s that “inserts” new vertices and edges. By carefully tracking how the rank changes under this operation, one can combine the known decomposition of t with the rank increase caused by the surgery to obtain a decomposition of the target tensor t′.
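To make the graph-tensor construction concrete, here is a small NumPy sketch (our own illustrative encoding, not the paper's notation): each edge carries an n-valued summed index, and each vertex leg is indexed by the tuple of its incident edge values, flattened into a single axis.

```python
import itertools
import numpy as np

def graph_tensor(num_vertices, edges, n=2):
    """Build T_n(G): one leg per vertex, one summed n-valued index per edge.

    Each vertex leg is indexed by the tuple of its incident edge values,
    flattened to an axis of size n^deg(v).  Illustrative sketch only.
    """
    # edges incident to each vertex, in a fixed order
    inc = [[e for e, edge in enumerate(edges) if v in edge]
           for v in range(num_vertices)]
    shape = [n ** len(inc[v]) for v in range(num_vertices)]
    T = np.zeros(shape, dtype=int)
    # sum over one value per edge; each assignment contributes a single 1
    for vals in itertools.product(range(n), repeat=len(edges)):
        idx = []
        for v in range(num_vertices):
            flat = 0
            for e in inc[v]:
                flat = flat * n + vals[e]
            idx.append(flat)
        T[tuple(idx)] += 1
    return T

# The triangle C3 gives (a leg-relabelled copy of) the 2x2 matrix
# multiplication tensor <2,2,2>:
T_triangle = graph_tensor(3, [(0, 1), (1, 2), (2, 0)], n=2)
print(T_triangle.shape)        # (4, 4, 4)
print(int(T_triangle.sum()))   # 8 nonzero entries, one per (i, j, k)
```

Each of the 2³ assignments of edge values hits a distinct position, so the triangle tensor has exactly eight 1-entries, matching the trivial rank bound 2³ = 8 that Strassen's decomposition improves to 7.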
The authors first illustrate the method on the five‑cycle tensor T₂(C₅). Starting from the well‑known Strassen decomposition of the triangle tensor T₂(C₃) (rank 7), they define a linear map φ that splits a leg into three legs and inserts two new edges. Applying φ to each term of Strassen’s decomposition yields a decomposition of T₂(C₅) of size 31, improving on the trivial size 32. The map φ has rank 4 on most basis elements, but on the special element b₀₀ + b₁₁ it reproduces the triangle tensor and contributes rank 7; this worst‑case term is precisely what the “virtual edge” in the surgery picture represents.
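The rank-7 starting point can be checked directly. The sketch below (standard material, not taken from the paper) writes T₂(C₃) as the trace tensor Σ e_{ij} ⊗ e_{jk} ⊗ e_{ki} and verifies that Strassen's seven products reproduce it exactly.

```python
import numpy as np

# T2(C3) as the trace tensor: T[(i,j),(j,k),(k,i)] = 1, each 2x2 leg
# flattened to dimension 4.
T = np.zeros((2, 2, 2, 2, 2, 2), dtype=int)
for i in range(2):
    for j in range(2):
        for k in range(2):
            T[i, j, j, k, k, i] = 1
T = T.reshape(4, 4, 4)

# Strassen's seven products m_r = <U_r, A><V_r, B>, with C = sum_r m_r W_r.
U = [np.array(m) for m in (
    [[1,0],[0,1]], [[0,0],[1,1]], [[1,0],[0,0]], [[0,0],[0,1]],
    [[1,1],[0,0]], [[-1,0],[1,0]], [[0,1],[0,-1]])]
V = [np.array(m) for m in (
    [[1,0],[0,1]], [[1,0],[0,0]], [[0,1],[0,-1]], [[-1,0],[1,0]],
    [[0,0],[0,1]], [[1,1],[0,0]], [[0,0],[1,1]])]
W = [np.array(m) for m in (
    [[1,0],[0,1]], [[0,0],[1,-1]], [[0,1],[0,1]], [[1,0],[1,0]],
    [[-1,1],[0,0]], [[0,0],[0,1]], [[1,0],[0,0]])]

# The third leg of T is indexed by (k,i), so the C-coefficient matrix
# enters transposed.
S = sum(np.einsum('a,b,c->abc', u.ravel(), v.ravel(), w.T.ravel())
        for u, v, w in zip(U, V, W))
assert np.array_equal(S, T)   # witnesses R(T2(C3)) <= 7
```

Surgery then pushes each of these seven rank-one terms through the splitting map φ, which is where the 6·4 + 7 = 31 count for T₂(C₅) comes from.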
Generalising this construction, the paper proves that for any odd integer k, the tensor of the k‑cycle satisfies
R(T₂(C_k)) ≤ 2^k − 1.
Previously this bound was known only for k ≤ 5. Moreover, the authors establish two families of asymptotic‑rank (exponent) bounds. First, for odd k and ℓ they show
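The closed form 2^k − 1 matches Strassen's 7 at k = 3 and the size-31 decomposition at k = 5. A quick arithmetic check (our illustrative reading of the two-vertex surgery step as the recursion r ↦ 4r + 3, since 31 = 4·7 + 3; this is not the paper's derivation):

```python
# Check that iterating r -> 4r + 3 from Strassen's 7 reproduces the
# odd-cycle bound 2^k - 1, and compare against the trivial bound 2^k.
def surgery_bound(k):
    assert k % 2 == 1 and k >= 3
    r = 7                      # Strassen: R(T2(C3)) <= 7
    for _ in range((k - 3) // 2):
        r = 4 * r + 3          # each step grows the cycle by two vertices
    return r

for k in (3, 5, 7, 9):
    assert surgery_bound(k) == 2**k - 1
    print(k, surgery_bound(k), 2**k)   # surgery bound vs trivial bound
```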
ω_{k+ℓ‑1} ≤ ω_k + ω_ℓ,
relating the exponent of a larger odd cycle to the sum of exponents of smaller cycles. Second, using the dual exponent α of rectangular matrix multiplication, they obtain
ω_k ≤ k − α·(1 + (1−α)/(k−1+α)) = k − αk/(k−1+α) ≤ k − α.
Since the best known bounds give 0.30298 < α ≤ 1, this yields a uniform constant gap between ω_k and k for all odd k. The bounds are tight if ω = 2 (equivalently α = 1), in which case ω_k = k − 1.
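A numeric sanity check of this bound, reading it in the equivalent simplified form ω_k ≤ k − kα/(k−1+α) (illustration only; the constant 0.30298 is the lower bound on α quoted above):

```python
# Evaluate the bound w_k <= k - k*alpha/(k - 1 + alpha) <= k - alpha.
def omega_bound(k, alpha):
    return k - k * alpha / (k - 1 + alpha)

for k in (3, 5, 7, 9):
    b = omega_bound(k, 0.30298)        # using the quoted bound on alpha
    assert b <= k - 0.30298            # implies the weaker uniform gap
    # if omega = 2 then alpha = 1, and the bound collapses to k - 1:
    assert abs(omega_bound(k, 1.0) - (k - 1)) < 1e-12
    print(k, round(b, 4))
```

For k = 3 this gives roughly 2.605, weaker than the best known bound on ω itself; the interest of the family is its uniform behaviour over all odd k.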
The paper also discusses tensor surgery on general graphs, where the operation may require the absorption of “virtual hyperedges.” An explicit example on a hypergraph shows how splitting a leg into many legs and inserting a hyperedge can be handled within the same framework. The authors note that the method works particularly well for sparse graphs; in a companion work they treat dense graphs by a different technique.
Beyond algebraic complexity, the authors connect their results to quantum information theory and communication complexity. Interpreting Tₙ(G) as an (unnormalised) quantum state—vertices as parties, edges as shared EPR pairs, hyperedges as GHZ states—the tensor rank equals the smallest r such that the state can be obtained from a level-r GHZ state under stochastic local operations and classical communication (SLOCC). The asymptotic exponent ω(T₂(G)) measures the optimal rate of converting many copies of the GHZ state into many copies of T₂(G). In communication complexity, the support rank of T₂(C₃) characterises nondeterministic quantum broadcast complexity, and the authors' upper bounds translate into upper bounds for certain graph-equality problems.
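The two building blocks of this quantum picture are easy to write down explicitly; the sketch below (our illustration) builds the multi-party GHZ tensor that a hyperedge contributes and checks that every one-versus-rest flattening has full rank n, which pins its tensor rank to exactly n.

```python
import numpy as np

# An edge {u, v} contributes the EPR tensor sum_i |i>|i> (the identity
# matrix); a hyperedge {u, v, w} contributes the GHZ tensor sum_i |i>|i>|i>.
def ghz(n, parties=3):
    T = np.zeros((n,) * parties, dtype=int)
    for i in range(n):
        T[(i,) * parties] = 1
    return T

G = ghz(2, parties=3)           # the 3-party level-2 GHZ "unit tensor"
# Every flattening (one party vs the rest) has rank n, so R(GHZ_n) = n:
for leg in range(3):
    M = np.moveaxis(G, leg, 0).reshape(2, -1)
    assert np.linalg.matrix_rank(M) == 2
```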
Overall, tensor surgery provides a systematic way to lift low‑rank decompositions to more complex tensors, yielding non‑trivial rank and exponent upper bounds for a broad class of graph and hypergraph tensors. The technique complements existing flattening and Young‑flattening methods, and opens new avenues for applying algebraic‑complexity tools to quantum information and communication‑complexity problems.