Abstract
Solving a polynomial system, or computing an associated Gröbner basis, is a fundamental task in computational algebra. However, it is notorious for its worst-case time complexity, which is doubly exponential in the number of variables. This paper is the first to address the learning of Gröbner basis computation with Transformers. Training requires many pairs of a polynomial system and its associated Gröbner basis, which raises two novel algebraic problems: the random generation of Gröbner bases and their transformation into non-Gröbner generating sets, which we term the backward Gröbner problem. We resolve these problems for 0-dimensional radical ideals, a class of ideals that appears in various applications. The experiments show that our dataset generation method is at least three orders of magnitude faster than a naive approach, overcoming a crucial challenge in learning to compute Gröbner bases, and that Gröbner basis computation is learnable for a particular class of ideals.
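The backward Gröbner problem can be illustrated with a small sketch. The example below uses SymPy purely for illustration (the paper's own dataset generator is not reproduced here), and the specific basis and row operations are hand-picked assumptions: starting from a Gröbner basis, multiplying the generators by a unimodular polynomial matrix (determinant 1) yields a generating set of the same ideal that is, in general, no longer a Gröbner basis.

```python
# Minimal sketch of the backward Gröbner idea, assuming SymPy for illustration
# (not the paper's actual dataset pipeline).
from sympy import symbols, groebner, expand

x, y = symbols('x y')

# A Gröbner basis G (grevlex) of a 0-dimensional ideal, hand-picked for the demo.
G = [y**2 - 2, x - y]

# Backward step: apply elementary row operations with polynomial coefficients.
# Their product is a unimodular matrix (determinant 1), so F generates the
# same ideal as G but is, in general, no longer a Gröbner basis.
f0 = expand(G[0] + x * G[1])  # add x * (second generator) to the first
f1 = expand(G[1] + y * f0)    # add y * (new first generator) to the second
F = [f0, f1]

# Sanity check: F and G have the same reduced Gröbner basis, hence the same ideal.
gb_F = groebner(F, x, y, order='grevlex')
gb_G = groebner(G, x, y, order='grevlex')
assert gb_F.exprs == gb_G.exprs
print(F)     # a non-Gröbner generating set of the ideal
print(gb_F)  # matches the reduced Gröbner basis of G
```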
| Original language | English |
|---|---|
| Pages (from-to) | 51-56 |
| Number of pages | 6 |
| Journal | CEUR Workshop Proceedings |
| Volume | 3754 |
| Publication status | Published - 2024 |
| Event | 10th International Symposium on Symbolic Computation in Software Science - Work in Progress Workshop, SCSS 2024 WiP - Tokyo, Japan; 28 Aug 2024 – 30 Aug 2024 |
Keywords
- Gröbner Bases
- Machine Learning
- Transformer