HATSolver: Learning Groebner Bases with Hierarchical Attention Transformers


📝 Original Info

  • Title: HATSolver: Learning Groebner Bases with Hierarchical Attention Transformers
  • ArXiv ID: 2512.14722
  • Date: 2025-12-09
  • Authors: Mohamed Malhou, Ludovic Perret, Kristin Lauter

📝 Abstract

At NeurIPS, Kera et al. (2024) introduced the use of transformers for computing Gröbner bases, a central object in computer algebra with numerous practical applications. In this paper, we improve this approach by applying Hierarchical Attention Transformers (HATs) to solve systems of multivariate polynomial equations via Gröbner basis computation. The HAT architecture incorporates a tree-structured inductive bias that enables the modeling of hierarchical relationships present in the data, and it thus achieves significant computational savings compared to conventional flat attention models. We generalize to arbitrary depths and include a detailed computational cost analysis. Combined with curriculum learning, our method solves instances that are much larger than those in Kera et al. (2024).
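To make the computational-savings claim concrete, below is a minimal sketch of a two-level hierarchical attention layer: full attention is restricted to fixed-size token blocks, and a second attention pass runs over one summary vector per block. All names, the block size, the mean-pooling summarization, and the residual combination are illustrative assumptions; this is not the paper's actual HAT architecture, which generalizes to arbitrary tree depths.

```python
# Sketch of two-level hierarchical attention (assumed design, not the paper's).
# Local attention within blocks costs O(n * k); global attention over block
# summaries costs O((n/k)^2), versus O(n^2) for flat attention over n tokens.
import torch
import torch.nn as nn

class TwoLevelHierarchicalAttention(nn.Module):
    def __init__(self, d_model: int, n_heads: int, block_size: int):
        super().__init__()
        self.block_size = block_size
        # Level 1: attention among tokens inside each block.
        self.local_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Level 2: attention among per-block summary vectors.
        self.global_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); here seq_len must divide by block_size.
        b, n, d = x.shape
        k = self.block_size
        blocks = x.reshape(b * (n // k), k, d)
        # Full attention restricted to each block of k tokens.
        local, _ = self.local_attn(blocks, blocks, blocks)
        # Summarize each block by mean pooling (one choice among many).
        summaries = local.mean(dim=1).reshape(b, n // k, d)
        # Attention over the much shorter sequence of block summaries.
        global_out, _ = self.global_attn(summaries, summaries, summaries)
        # Broadcast each block's global context back to its tokens.
        return local.reshape(b, n, d) + global_out.repeat_interleave(k, dim=1)

# Example: 4 blocks of 8 tokens instead of one flat 32-token attention.
layer = TwoLevelHierarchicalAttention(d_model=64, n_heads=4, block_size=8)
tokens = torch.randn(2, 32, 64)
print(layer(tokens).shape)  # torch.Size([2, 32, 64])
```

Stacking such levels, or recursing on the summaries, yields deeper trees; the savings grow as the block structure keeps each attention matrix small.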

📄 Full Content

...(The full text is omitted here due to its length. Please see the complete article on the site.)
