Gröbner basis computation via learning

Hiroshi Kera, Yuki Ishihara, Tristan Vaccon, Kazuhiro Yokoyama

Research output: Contribution to journal › Conference article › peer-review

Abstract

Solving a polynomial system, or computing an associated Gröbner basis, is a fundamental task in computational algebra. However, it is also notorious for its doubly exponential worst-case time complexity in the number of variables. This paper is the first to address the learning of Gröbner basis computation with Transformers. Training requires many pairs of a polynomial system and its associated Gröbner basis, which raises two novel algebraic problems: the random generation of Gröbner bases and their transformation into non-Gröbner generating sets, termed the backward Gröbner problem. We resolve these problems for zero-dimensional radical ideals, a class of ideals that appears in various applications. The experiments show that our dataset generation method is at least three orders of magnitude faster than a naive approach, overcoming a crucial challenge in learning to compute Gröbner bases, and that Gröbner basis computation is learnable for a particular class of ideals.
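To make the two dataset-generation directions concrete, the following minimal sketch (a toy illustration, not the authors' method) uses SymPy's groebner to rewrite a known lexicographic Gröbner basis of a zero-dimensional radical ideal into a non-Gröbner generating set of the same ideal via unimodular row operations, then recovers the basis by recomputation. All polynomials and multipliers below are hypothetical choices.

```python
# Toy sketch of one (non-Gröbner system, Gröbner basis) training pair,
# assuming SymPy is available. Not the paper's generation pipeline.
from sympy import symbols, groebner, expand

x, y = symbols("x y")

# Reduced Gröbner basis w.r.t. lex order (x > y) of a zero-dimensional
# radical ideal: its variety is the six points (±1, 0), (±1, ±1).
g1 = x**2 - 1
g2 = y**3 - y

# "Backward" step: elementary row operations with polynomial multipliers
# keep the ideal unchanged (the transformation matrix is unimodular) but
# destroy the Gröbner property. q1, q2 are arbitrary illustrative choices.
q1 = x*y + 1
q2 = y - 2
f1 = expand(g1 + q1 * g2)  # f1 = g1 + q1*g2
f2 = expand(g2 + q2 * f1)  # f2 = g2 + q2*f1

# "Forward" step: recomputing the basis of (f1, f2) recovers (g1, g2).
print(groebner([f1, f2], x, y, order="lex"))
# -> GroebnerBasis([x**2 - 1, y**3 - y], ...)
```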

Original language: English
Pages (from-to): 51-56
Number of pages: 6
Journal: CEUR Workshop Proceedings
Volume: 3754
Publication status: Published - 2024
Event: 10th International Symposium on Symbolic Computation in Software Science - Work in Progress Workshop, SCSS 2024 WiP - Tokyo, Japan
Duration: 28 Aug 2024 - 30 Aug 2024

Keywords

  • Gröbner Bases
  • Machine Learning
  • Transformer

