anadim/subleq-transformer

A transformer that executes a one-instruction Turing-complete computer, two ways: hand-coded weights (no training) and weights learned from data.

Python · 17 stars · Found Mar 03, 2026

AI Summary

This project builds transformers that simulate SUBLEQ, a one-instruction Turing-complete computer, either by hand-coding the weights or by training on data, and demonstrates perfect single-step execution plus emergent multi-step capabilities such as computing Fibonacci numbers and square roots.
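For context, SUBLEQ ("subtract and branch if less than or equal to zero") has a single instruction, given as three memory addresses. A minimal reference interpreter, sketched here for illustration rather than taken from the repo, captures the full semantics:

```python
def subleq(mem, pc=0, max_steps=100_000):
    """Execute a SUBLEQ program in place.

    Each instruction occupies three cells (a, b, c):
        mem[b] -= mem[a]
        jump to c if the result is <= 0, else fall through to pc + 3.
    A negative jump target halts the machine.
    """
    for _ in range(max_steps):
        if pc < 0:
            break
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
    return mem
```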

How It Works

1. 🔍 Discover a fun AI experiment: You stumble upon a project showing how an AI can act like a tiny computer using just one simple rule.

2. 🖥️ Try the online demo: Click the link to open a ready-to-run example in your web browser, no setup needed.

3. 🎥 Watch magic happen: See the AI execute a multiplication like 7 times 9 step by step, matching a real computer perfectly.

4. Pick your adventure:
   - 🚀 Run demos: Load programs for Fibonacci, division, or square roots and watch the AI compute them flawlessly (a sample program is sketched after this list).
   - 🎓 Train it: Feed it single execution steps and watch it learn to chain them into complex calculations.

5. See amazing results: The AI handles math it was never directly taught, like figuring out square roots.

6. 😍 AI becomes a computer: This smart helper turns into a full tiny computer right before your eyes.
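As a concrete taste of what such demo programs look like, here is a tiny hand-written SUBLEQ program (a standard addition macro, not code from the repo) run through the interpreter sketched above:

```python
# ADD macro: compute 7 + 9 with three SUBLEQ instructions.
# Memory layout: cells 0-8 hold the program, cells 9-11 hold data.
A, B, Z = 9, 10, 11          # addresses of the operands and a zero cell
mem = [A, Z, 3,              # mem[Z] -= mem[A] -> Z = -7 (<= 0, jump to 3)
       Z, B, 6,              # mem[B] -= mem[Z] -> B = 9 - (-7) = 16 (> 0, fall through)
       Z, Z, -1,             # mem[Z] -= mem[Z] -> 0 (<= 0, jump to -1: halt)
       7, 9, 0]              # data: mem[A] = 7, mem[B] = 9, mem[Z] = 0
subleq(mem)
print(mem[B])                # 16
```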

AI-Generated Review

What is subleq-transformer?

This Python repo builds a transformer that executes SUBLEQ, a one-instruction Turing-complete computer, using PyTorch. It delivers two approaches: hand-coded weights for exact analytical execution with no training, and a model trained on single-step data that generalizes to multi-step programs like multiplication or Fibonacci. Users get CLI demos, an interactive REPL to step through runs, and full training/eval scripts to experiment with.
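The repo's training pipeline isn't reproduced here, but "trained on single-step data" plausibly means supervised pairs mapping a machine state to its successor. A hypothetical sketch of generating such pairs, where the (mem, pc) state encoding is an assumption rather than the repo's actual tokenization:

```python
import random

def single_step(mem, pc):
    """One SUBLEQ step: return the successor state (mem, pc).
    The (mem, pc) encoding is an illustrative assumption; the repo's
    actual representation of machine states may differ."""
    mem = list(mem)
    a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
    mem[b] -= mem[a]
    pc = c if mem[b] <= 0 else pc + 3
    return mem, pc

def make_pair(size=16):
    """Sample a random machine state and its exact successor as one
    supervised training example."""
    mem = [random.randint(-9, 9) for _ in range(size)]
    pc = random.randrange(size - 2)       # room for a 3-cell instruction
    mem[pc] = random.randrange(size)      # operand address a
    mem[pc + 1] = random.randrange(size)  # operand address b
    mem[pc + 2] = random.randrange(size)  # jump target c
    return (mem, pc), single_step(mem, pc)
```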

Why is it gaining traction?

It stands out by proving a standard transformer can implement a full computer, either hand-coded for exact logic or learned from data with emergent multi-step generalization, without custom architectural hacks. Developers get a runnable example contrasting learning against direct weight construction, with 100% single-step accuracy and demos showing real programs execute flawlessly. The PyTorch setup makes it easy to tweak for your own experiments or tutorials.
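The 100% single-step accuracy figure suggests an exact-match evaluation against ground truth. A hedged sketch, reusing make_pair from above and assuming a hypothetical model_step wrapper around the trained transformer:

```python
def single_step_accuracy(model_step, n_cases=1_000):
    """Exact-match accuracy against the reference interpreter.
    `model_step` is a hypothetical wrapper that maps (mem, pc) to a
    predicted (mem, pc); not part of the actual repo API."""
    hits = 0
    for _ in range(n_cases):
        state, truth = make_pair()
        hits += model_step(*state) == truth
    return hits / n_cases
```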

Who should use this?

ML researchers dissecting transformer internals, esolang fans building one-instruction machines, and educators demoing Turing completeness via neural nets. It's a good fit for devs exploring how transformers execute code, like prototyping SUBLEQ interpreters or other neural-computer experiments.

Verdict

Cool proof-of-concept for transformers as computers: try the REPL and demos first. Its low credibility score (1.0%, at 17 stars) flags it as experimental with basic docs; fork it for personal projects, but don't bet production code on it yet.
