Meet chess-transformers: a playful twist on the classic chess engine. Instead of evaluating millions of positions via search, it reads the move list as text, learns how humans actually play, and predicts the next move in a single transformer pass. It's like training a GPT on chess moves: no brute-force tree search, just pure pattern learning from human games.
OK, after having ported the new Maia2 models (https://groups.google.com/g/picochess/c/_zGh_KyDwds), which offer only a decent playing experience because of their restricted opening and endgame knowledge and which are also based on the newer transformer architecture, I finally got the chess-transformers project running on PicoChess!
This engine is based on the brilliant chess-transformers project by sgrvinod. I’ve just wrapped it in a friendly UCI shell so we can all have some fun with it.
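At its core, a UCI shell is just a loop that reads commands on stdin and answers on stdout. Here is a minimal sketch of the idea (my own illustration, not Molli's actual transformers_uci.py, which handles many more options):

```python
import sys

def uci_loop(best_move_fn):
    """Minimal UCI skeleton. best_move_fn(moves: list[str]) -> str
    returns the engine's reply for the given move history.
    Sketch only; a real wrapper must handle far more of the protocol."""
    moves = []
    for line in sys.stdin:
        cmd = line.strip()
        if cmd == "uci":
            print("id name chess-transformers-sketch")
            print("uciok", flush=True)
        elif cmd == "isready":
            print("readyok", flush=True)
        elif cmd.startswith("position"):
            # e.g. "position startpos moves e2e4 e7e5"
            parts = cmd.split("moves")
            moves = parts[1].split() if len(parts) > 1 else []
        elif cmd.startswith("go"):
            print(f"bestmove {best_move_fn(moves)}", flush=True)
        elif cmd == "quit":
            break
```

The transformer model simply plugs in as `best_move_fn`: it receives the move history as text and predicts the next "word".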
So what is the difference to the "old" lc0/Maia1 models (in the meantime, newer lc0 nets are also transformer based)?
Chess Transformers - The Linguists
Projects like Maia2 or chess-transformers treat chess not as a game of positions, but as a language, where games are sentences and moves are words.
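To make the analogy concrete, here is a tiny sketch (my own illustration, not the project's actual tokenizer) of treating each move as a vocabulary "word":

```python
# Illustration only: a game is a "sentence", each move a "word".
# This is NOT the chess-transformers tokenizer, just the idea behind it.

def tokenize_game(movetext):
    """Split a space-separated move list into move 'words'."""
    return movetext.split()

def build_vocab(games):
    """Map every distinct move string to an integer id,
    exactly like building a word vocabulary for a language model."""
    vocab = {}
    for game in games:
        for move in tokenize_game(game):
            vocab.setdefault(move, len(vocab))
    return vocab

games = ["e4 e5 Nf3 Nc6 Bb5", "d4 d5 c4 e6 Nc3"]
vocab = build_vocab(games)
# A transformer trained on such sequences predicts the id of the
# next move given the ids seen so far -- no search tree involved.
```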
Installation
As with Maia2, and typical for a pure PyTorch implementation, several steps are needed, and even some modifications of the original GitHub files were necessary in order to run the transformer models with PicoChess 3.3.
pip3 install einops tqdm gdown markdown tabulate ipython colorama
pip3 install torch --index-url https://download.pytorch.org/whl/cpu
sudo apt install libhdf5-dev liblzo2-dev libblosc-dev
pip uninstall -y numpy tables numexpr blosc2 setuptools wheel Cython
pip install --upgrade pip
pip install --upgrade setuptools wheel Cython
pip install numpy==1.26.4
pip install --no-cache-dir --extra-index-url https://www.piwheels.org/simple tables==3.9.2
pip install regex
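After running the commands above, you can sanity-check that everything landed in the right environment. A small helper like this (my own snippet, not part of the project) reports any package that fails to import; note that ipython installs under the import name `IPython`:

```python
from importlib.util import find_spec

# Import names for the packages installed in the steps above.
# ipython's import name (IPython) differs from its pip name.
REQUIRED = ["einops", "tqdm", "gdown", "markdown", "tabulate",
            "IPython", "colorama", "torch", "numpy", "tables", "regex"]

def missing_packages(names):
    """Return the subset of package names that cannot be imported."""
    return [name for name in names if find_spec(name) is None]

# print(missing_packages(REQUIRED))  # should print [] on a good install
```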
Copy the transformers_uci folder (containing my UCI wrapper transformers_uci.py) to the engines/script_engines directory, and copy the bash and UCI files to the arch64/script folder.
Add this favorites.ini entry:
[script/transformers]
name = chess-transformers by MIT UCI version by Molli
small = trans
medium = transf
large = Transform.
web = Chess-Transformers
elo = 1500
ponder/brain = n
What’s under the hood?
The project ships with a few transformer architectures. Since version v0.2.0 (Dec 17, 2023), CT-EFT-20 has been the star: compact, smart, and surprisingly effective.
Important!
When you first load the engine, it needs to "transform" from its dormant state (the raw model file) into its ready-to-fight "robot mode."
My UCI wrapper includes a Warmup UCI option (enabled by default) that handles this initial compilation gracefully right after the model is loaded, before the first move. This means it can take up to a minute until the engine is ready (and if it is selected as your last engine, the PicoChess boot-up will take correspondingly longer). After this initialization, the engine answers immediately and indeed plays an interesting game...
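Conceptually, the warmup just runs one throwaway inference so that all lazy initialization (weight loading, graph building, cache allocation) happens before the clock matters. A minimal sketch, with a hypothetical `predict` callable standing in for the real model call:

```python
import time

START_FEN = "rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1"

def warm_up(predict, position=START_FEN):
    """Run one dummy prediction and discard the result, so the slow
    first-call initialization happens now rather than on move one.
    'predict' is a hypothetical interface; the real wrapper differs."""
    t0 = time.monotonic()
    predict(position)  # result deliberately ignored
    return time.monotonic() - t0  # seconds spent warming up
```

The wrapper can log the returned duration, which is where the "up to a minute" figure comes from on a Raspberry Pi-class CPU.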
Enjoy
Dirk
P.S.
1) Forgot to mention: before you run the pip installation commands, you must activate the corresponding virtual environment, e.g. pyenv activate picochess-3.9.2
2) Don’t feel like dealing with a complex installation?
Just wait a little bit.
My human-style Leela Chess Zero enhancements are coming soon: drop-in nets, a few engine settings, and that's it. No headaches. 😄