arxiv:2604.11101

Generating Hadamard matrices with transformers

Published on May 11

Abstract

Transformer neural networks combined with local search in the PatternBoost framework construct Hadamard matrices more effectively than local search from random initialisation, particularly for extremely sparse combinatorial search problems and larger matrix orders.

AI-generated summary

We present a new method for constructing Hadamard matrices that combines transformer neural networks with local search in the PatternBoost framework. Our approach is designed for extremely sparse combinatorial search problems and is particularly effective for Hadamard matrices of Goethals–Seidel type, where Fourier methods permit fast scoring and optimisation. For orders between 100 and 200, it produces large numbers of inequivalent Hadamard matrices, and for larger orders, it succeeds where local search from random initialisation fails. The largest example found by our method has order 252. In addition to these new constructions, our experiments reveal that the transformer can discover and exploit useful hidden symmetry in the search space.
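As background for the objects the abstract refers to (this is standard theory, not the paper's actual pipeline or code): a ±1 matrix H of order n is Hadamard when H Hᵀ = n·I, and Goethals–Seidel-style constructions search for four ±1 sequences whose periodic autocorrelations sum to 4n·δ. By the Wiener–Khinchin relation, that condition is equivalent to the sequences' power spectra summing to the constant 4n, which is why an FFT gives the fast scoring the abstract mentions. A minimal NumPy sketch of both checks:

```python
import numpy as np

def is_hadamard(H):
    """A +-1 matrix H of order n is Hadamard iff H @ H.T == n * I."""
    n = H.shape[0]
    return np.array_equal(H @ H.T, n * np.eye(n, dtype=int))

def sylvester(k):
    """Classical Sylvester construction: a Hadamard matrix of order 2**k."""
    H = np.array([[1]], dtype=int)
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

def gs_fourier_score(seqs):
    """Fourier-domain score for four +-1 sequences of length n: the
    Goethals-Seidel-type autocorrelation condition holds iff the four
    power spectra sum to 4n at every frequency. Returns the maximum
    deviation, so 0 means the condition is satisfied."""
    n = len(seqs[0])
    psd = sum(np.abs(np.fft.fft(s)) ** 2 for s in seqs)
    return float(np.max(np.abs(psd - 4 * n)))
```

For example, `sylvester(3)` passes `is_hadamard`, and the length-3 quadruple (1,1,1), (1,-1,-1), (1,-1,-1), (1,-1,-1) — a small Williamson-type solution — scores 0, while four copies of (1,1) do not. The paper's method plugs a learned proposal distribution into a search over such sequences; the scoring identity above is the only part sketched here.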


