A New General-Purpose Method to Multiply 3x3 Matrices Using Only 23 Multiplications


One of the most famous conjectures in computer algebra is that matrix multiplication might be feasible in not much more than quadratic time. The best known exponent is 2.376, due to Coppersmith and Winograd. Many attempts to solve this problem in the literature work by solving fixed-size problems and then applying the solution recursively. This leads to purely combinatorial optimisation problems of fixed size, which are unlikely to be solvable in polynomial time. In 1976 Laderman published a method to multiply two 3x3 matrices using only 23 multiplications. His result is non-commutative and can therefore be applied recursively to smaller sub-matrices. In 35 years nobody has done better, and it remains an open problem whether this can be done with 22 multiplications. We proceed by solving the so-called Brent equations [7]. We have implemented a method to convert this very hard problem to a SAT problem, and we have attempted to solve it with our portfolio of some 500 SAT solvers. With this new method we were able to produce new solutions to Laderman's problem. We present a new fully general non-commutative solution with 23 multiplications and show that this solution is new and is NOT an equivalent variant of Laderman's original solution. This result demonstrates that the space of solutions to Laderman's problem is larger than expected, and it therefore becomes more plausible that a solution with 22 multiplications exists. If one exists, we might find it soon just by running our algorithms longer, or thanks to further improvements in SAT solver algorithms.


💡 Research Summary

The paper tackles the long‑standing open problem of whether two 3 × 3 matrices can be multiplied using fewer than 23 scalar multiplications in a non‑commutative setting. Since Laderman’s 1976 construction, which uses exactly 23 multiplications, no improvement has been reported, and the existence of a 22‑multiplication algorithm remains unknown. The authors approach the problem by formulating the multiplication as a system of bilinear equations – the so‑called Brent equations – which consist of 729 cubic constraints over the coefficients of the 23 intermediate products.

Their methodology proceeds in several stages. First, each intermediate product i (1 ≤ i ≤ 23) is represented by three 3 × 3 coefficient matrices A(i), B(i) and C(i), encoding the two linear combinations of input entries that are multiplied and the way the results are assembled into the output. The Brent equations enforce that, for every combination of input and output indices, the sum over i of the products of the corresponding entries of A(i), B(i) and C(i) equals the Kronecker delta dictated by the exact product a·b, one cubic constraint for each of the 729 index combinations. To make the problem tractable, the authors reduce the equations modulo 2, turning the algebraic system into a Boolean satisfiability (SAT) problem. They develop a custom converter that translates the cubic constraints into conjunctive normal form (CNF) suitable for modern SAT solvers.
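To make the constraint system concrete, the following sketch checks the Brent equations for a small bilinear scheme. It uses Strassen's classic 2 × 2, 7-multiplication algorithm as a stand-in (the paper's actual instance is n = 3 with 23 products and 729 equations); the data layout and function name are illustrative, not taken from the paper.

```python
import itertools

# Strassen's 2x2 scheme: product l computes
#   (sum_ij A[l][i][j] * a[i][j]) * (sum_kp B[l][k][p] * b[k][p]),
# and output entry c[m][q] = sum_l C[l][m][q] * (product l).
A = [[[1, 0], [0, 1]], [[0, 0], [1, 1]], [[1, 0], [0, 0]], [[0, 0], [0, 1]],
     [[1, 1], [0, 0]], [[-1, 0], [1, 0]], [[0, 1], [0, -1]]]
B = [[[1, 0], [0, 1]], [[1, 0], [0, 0]], [[0, 1], [0, -1]], [[-1, 0], [1, 0]],
     [[0, 0], [0, 1]], [[1, 1], [0, 0]], [[0, 0], [1, 1]]]
C = [[[1, 0], [0, 1]], [[0, 0], [1, -1]], [[0, 1], [0, 1]], [[1, 0], [1, 0]],
     [[-1, 1], [0, 0]], [[0, 0], [0, 1]], [[1, 0], [0, 0]]]

def brent_satisfied(A, B, C, n):
    """Check all n**6 Brent equations: the coefficient of a[i][j]*b[k][p]
    in output entry c[m][q] must be 1 exactly when j == k, i == m, p == q,
    and 0 otherwise."""
    for i, j, k, p, m, q in itertools.product(range(n), repeat=6):
        lhs = sum(Al[i][j] * Bl[k][p] * Cl[m][q]
                  for Al, Bl, Cl in zip(A, B, C))
        if lhs != int(j == k and i == m and p == q):
            return False
    return True

print(brent_satisfied(A, B, C, 2))  # prints True
```

For the 3 × 3 case the same loop runs over 729 index combinations and 23 products, which is exactly the system the authors hand to their SAT pipeline after reduction modulo 2.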

Armed with this translation, they launch a massive computational campaign: a portfolio of roughly 500 state‑of‑the‑art SAT solvers is run in parallel on a modest cluster. Within a few days a satisfying assignment is found, which corresponds to a concrete set of 23 bilinear products. The authors then “lift” the modulo‑2 solution to modulo 4 (and, by empirical observation, to the integers) using a standard lifting technique, thereby obtaining a solution that works over any ring, not just over the field GF(2).
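The CNF translation of one mod-2 Brent constraint can be sketched with standard Tseitin encoding: each cubic term gets an auxiliary AND variable, and the XOR of those terms is chained through auxiliary parity variables. The clause shapes below are the textbook ones, but the specific variables and terms are illustrative, not drawn from the paper's converter.

```python
import itertools

def tseitin_and(out, ins, clauses):
    """CNF clauses for out <-> AND(ins); literals are signed DIMACS-style ints."""
    for v in ins:
        clauses.append([-out, v])
    clauses.append([out] + [-v for v in ins])

def tseitin_xor(out, a, b, clauses):
    """CNF clauses for out <-> (a XOR b)."""
    clauses.extend([[-out, a, b], [-out, -a, -b], [out, -a, b], [out, a, -b]])

def encode_parity_of_products(terms, parity, nvars):
    """Encode 'XOR of AND-terms == parity', i.e. one Brent equation mod 2."""
    clauses, aux, t_vars = [], nvars, []
    for term in terms:
        aux += 1
        tseitin_and(aux, list(term), clauses)
        t_vars.append(aux)
    acc = t_vars[0]
    for t in t_vars[1:]:
        aux += 1
        tseitin_xor(aux, acc, t, clauses)
        acc = aux
    clauses.append([acc] if parity else [-acc])
    return clauses, aux

def satisfiable(clauses, nvars, fixed):
    """Brute-force SAT check with some variables pre-assigned (tiny instances only)."""
    free = [v for v in range(1, nvars + 1) if v not in fixed]
    for bits in itertools.product([False, True], repeat=len(free)):
        assign = dict(fixed)
        assign.update(zip(free, bits))
        if all(any(assign[abs(l)] == (l > 0) for l in c) for c in clauses):
            return True
    return False

# One illustrative cubic parity constraint over six Boolean coefficients:
# (x1&x2&x3) XOR (x1&x4&x5) XOR (x2&x4&x6) = 1
terms = [(1, 2, 3), (1, 4, 5), (2, 4, 6)]
clauses, total = encode_parity_of_products(terms, 1, 6)

# The CNF is satisfiable exactly when the original parity constraint holds.
for bits in itertools.product([False, True], repeat=6):
    assign = dict(zip(range(1, 7), bits))
    want = (bits[0] & bits[1] & bits[2]) ^ (bits[0] & bits[3] & bits[4]) \
           ^ (bits[1] & bits[3] & bits[5])
    assert satisfiable(clauses, total, assign) == bool(want)
print("encoding verified")
```

A real instance has 729 such constraints sharing 3 · 23 · 9 coefficient variables, which is why an industrial SAT portfolio, rather than brute force, is needed.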

Two explicit algorithms are presented. The first reproduces Laderman’s original scheme, listed as products P01–P23 with the same linear combinations of a‑ and b‑entries as in the 1976 paper; verification with Maple confirms correctness. The second, novel algorithm also consists of 23 products (P01–P23) but with entirely different linear combinations. Again, symbolic expansion in Maple shows that each of the nine output entries matches the exact matrix product.
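The Maple verification amounts to symbolically expanding each output entry with non-commuting variables. An analogous check can be run in Python with SymPy; the sketch below verifies Strassen's 2 × 2 scheme rather than the paper's 23-product 3 × 3 scheme (whose full coefficient tables appear in the paper), but the structure of the check is the same.

```python
import sympy as sp

# Non-commutative symbols, since the schemes must hold without assuming a*b == b*a.
a11, a12, a21, a22 = sp.symbols('a11 a12 a21 a22', commutative=False)
b11, b12, b21, b22 = sp.symbols('b11 b12 b21 b22', commutative=False)

# The seven bilinear products of Strassen's scheme.
M1 = (a11 + a22) * (b11 + b22)
M2 = (a21 + a22) * b11
M3 = a11 * (b12 - b22)
M4 = a22 * (b21 - b11)
M5 = (a11 + a12) * b22
M6 = (a21 - a11) * (b11 + b12)
M7 = (a12 - a22) * (b21 + b22)

# Each output entry must expand to the corresponding entry of the exact product.
checks = {
    'c11': (M1 + M4 - M5 + M7) - (a11*b11 + a12*b21),
    'c12': (M3 + M5)           - (a11*b12 + a12*b22),
    'c21': (M2 + M4)           - (a21*b11 + a22*b21),
    'c22': (M1 - M2 + M3 + M6) - (a21*b12 + a22*b22),
}
for name, expr in checks.items():
    assert sp.expand(expr) == 0, name
print("all entries verified")  # prints "all entries verified"
```

Substituting the paper's P01 to P23 coefficient tables into the same template would reproduce the authors' Maple check.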

A substantial part of the paper is devoted to establishing that the new algorithm is not merely a re‑parameterisation of known solutions. The authors invoke the transformation group described in earlier work (permutations of the product indices, cyclic shifts of the (A,B,C) triples, transposition with reversal, scalar rescaling with a_i b_i c_i = 1, and “sandwiching” by invertible matrices U, V, W). They prove that all these operations preserve the multiset of ranks of the A(i), B(i), and C(i) matrices (the “3 × r rank distribution”). Laderman’s construction contains six rank‑3 matrices; the families identified by Johnson and McLoughlin contain at most one. The new solution contains exactly two rank‑3 matrices (both on the left‑hand side). Since this rank distribution cannot be altered by any allowed transformation, the new algorithm is inequivalent to Laderman’s and to any previously published 23‑multiplication scheme.
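The rank-distribution argument can be made concrete in a few lines of linear algebra. The sketch below uses Strassen's seven 2 × 2 left-hand coefficient matrices as an illustrative stand-in for the paper's 23 matrices per side; it computes the multiset of ranks and confirms that sandwiching every matrix between fixed invertible matrices U and V leaves it unchanged, because multiplication by invertible matrices preserves rank.

```python
import numpy as np

# Strassen's seven left-hand coefficient matrices A(i) (illustrative stand-in
# for the paper's 3x3 tables).
A = [np.array(M) for M in (
    [[1, 0], [0, 1]], [[0, 0], [1, 1]], [[1, 0], [0, 0]], [[0, 0], [0, 1]],
    [[1, 1], [0, 0]], [[-1, 0], [1, 0]], [[0, 1], [0, -1]],
)]

def rank_multiset(mats):
    """The sorted list of ranks: the invariant used to separate solutions."""
    return sorted(int(np.linalg.matrix_rank(M)) for M in mats)

print(rank_multiset(A))  # prints [1, 1, 1, 1, 1, 1, 2]

# Sandwiching by invertible U, V (one of the symmetries of the solution space)
# cannot change any individual rank, so the multiset is an invariant.
U = np.array([[1, 1], [0, 1]])   # invertible: det = 1
V = np.array([[2, 1], [1, 1]])   # invertible: det = 1
sandwiched = [U @ M @ V for M in A]
print(rank_multiset(sandwiched) == rank_multiset(A))  # prints True
```

This is why the count of rank-3 matrices (six for Laderman, two for the new solution) suffices to prove inequivalence: no allowed transformation can change it.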

The authors conclude that the solution space for 3 × 3 matrix multiplication is richer than previously believed. This increased diversity makes the existence of a 22‑multiplication algorithm more plausible; the same SAT‑based pipeline, perhaps with longer runtimes or more powerful solvers, could eventually discover such a breakthrough. They also suggest that the SAT‑encoding approach may be applicable to other fixed‑size bilinear problems, opening a new avenue for algebraic algorithm discovery.

Overall, the paper contributes (1) a concrete, verified 23‑multiplication algorithm that is provably distinct from all known ones, (2) a scalable SAT‑based framework for solving Brent‑type equations, and (3) empirical evidence that the quest for a 22‑multiplication 3 × 3 algorithm is within reach, provided computational resources and solver technology continue to improve.

