Voboril, F., Peruvemba Ramaswamy, V., & Szeider, S. (2025). Generating Streamlining Constraints with Large Language Models. Journal of Artificial Intelligence Research, 84, Article 16. https://doi.org/10.1613/jair.1.18965
Organizational Units:
E192-01 Research Unit Algorithms and Complexity; E056-13 Division LogiCS; E056-23 Division Innovative Combinations and Applications of AI and ML (iCAIML)
-
Journal:
Journal of Artificial Intelligence Research
-
ISSN:
1076-9757
-
Date (published):
29-Oct-2025
-
Number of Pages:
20
-
Publisher:
AI Access Foundation
-
Peer reviewed:
Yes
-
Keywords:
constraint programming; constraint satisfaction; satisfiability; problem solving
Abstract:
Streamlining constraints (or streamliners, for short) narrow the search space, enhancing the speed and feasibility of solving complex constraint satisfaction problems. Traditionally, streamliners were crafted manually or generated by systematically combining atomic constraints, with high-effort offline testing. Our approach uses the generative capabilities of Large Language Models (LLMs) to propose effective streamliners for problems specified in the MiniZinc constraint programming language, validating candidates with quick empirical tests and feeding the results back to the LLM. Evaluated across seven diverse constraint satisfaction problems, our method achieves substantial runtime reductions. To determine whether these gains depend on LLM memorization, we compare the results against obfuscated and disguised variants of the problems. We also analyze whether longer offline runs improve the quality of streamliners and whether the LLM can propose good combinations of streamliners.