anadim/smallest-addition-transformer-claude-code

6,080-param transformer achieving 100% accuracy on 10-digit addition. Trained from scratch in 10 minutes.

Found Feb 22, 2026 at 15 stars.
AI Summary

A compact AI model and code demonstrating perfect accuracy in adding two 10-digit numbers, illustrating grokking: the phenomenon where accuracy jumps abruptly mid-training.

How It Works

1
🔍 Discover the tiny math AI

You come across a fascinating project showcasing a tiny AI, just 6,080 parameters, that perfectly adds huge 10-digit numbers.

2
📥 Grab the ready brain

Download the pre-trained AI brain that's already mastered addition, ready to use right away.

3
🧮 Test big additions

Feed it pairs of massive numbers like 9999999999 + 9999999999 and see it spit out the exact correct sum every time.

4
🚀 Train your own AI

Start from scratch and teach a fresh AI brain to add, watching it learn in just minutes on your computer.

5
📈 Witness the magic moment

Suddenly, after some practice, the AI flips from confused to perfect, solving every problem flawlessly.

6

🎉 Own a perfect adder

Celebrate having your very own tiny AI that adds any two 10-digit numbers without a single mistake.
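The train-your-own path in the steps above hinges on framing addition as a next-character prediction task. Here is a minimal sketch of what one training example might look like; the exact string format, and the trick of reversing the answer's digits, are illustrative assumptions, not necessarily the repo's actual scheme:

```python
import random

def make_example(max_digits=10):
    """Build one addition problem as a character sequence.

    The "a+b=" prompt format and the reversed-answer target are
    assumptions: emitting the sum least-significant digit first is a
    common trick that lets the carry flow in the same direction the
    autoregressive model generates.
    """
    a = random.randint(0, 10**max_digits - 1)
    b = random.randint(0, 10**max_digits - 1)
    prompt = f"{a}+{b}="
    target = str(a + b)[::-1]  # reversed sum digits
    return prompt, target

random.seed(0)
prompt, target = make_example()
# Sanity check: un-reversing the target recovers the true sum.
a_str, b_str = prompt[:-1].split("+")
assert int(a_str) + int(b_str) == int(target[::-1])
```

Generating examples on the fly like this gives an effectively unlimited training stream, so the held-out set of 10,000 problems stays untouched for evaluation.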

AI-Generated Review

What is smallest-addition-transformer-claude-code?

This Python project packs a 6,080-parameter transformer that achieves 100% accuracy on 10-digit addition, trained from scratch in about 10 minutes on a laptop GPU. It generates sums fully autoregressively (two 10-digit operands yield at most an 11-digit result), and ships a pre-trained checkpoint ready for evaluation or single-inference CLI calls like adding 9999999999 + 9999999999. Developers get a complete pipeline to train, test, and inspect results on 10,000 held-out problems.
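Fully autoregressive generation means the model emits the answer one token at a time, each conditioned on everything before it. The loop below is a schematic greedy decoder; the `model` callable is a stand-in stub that already knows the answer, used only to exercise the loop, not the repo's actual API:

```python
def greedy_decode(model, prompt_tokens, eos, max_new=12):
    """Greedy autoregressive decoding: repeatedly take the model's
    most likely next token, append it to the sequence, and stop at
    EOS or a length cap."""
    seq = list(prompt_tokens)
    out = []
    for _ in range(max_new):
        tok = model(seq)  # stand-in: returns the argmax next token
        if tok == eos:
            break
        out.append(tok)
        seq.append(tok)
    return out

def toy_model_factory(a, b):
    """Fake 'model' that replays the correct digits one by one,
    purely to demonstrate the decode loop (not a trained net)."""
    answer = list(str(a + b)) + ["<eos>"]
    state = {"i": 0}
    def model(seq):
        tok = answer[state["i"]]
        state["i"] += 1
        return tok
    return model

model = toy_model_factory(9999999999, 9999999999)
digits = greedy_decode(model, list("9999999999+9999999999="), "<eos>")
assert "".join(digits) == "19999999998"
```

Swapping the stub for a real forward pass (embed the sequence, run the transformer, argmax the final logits) is all the single-inference CLI would need on top of this loop.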

Why is it gaining traction?

It stands out as the smallest known transformer to achieve perfect 10-digit addition accuracy, spotlighting grokking, where held-out performance jumps from near zero to 100% mid-training. The quick PyTorch setup (no dependencies beyond torch) lets developers replicate the experiments fast, with plots showing the phase transition and architecture sweeps. The low parameter count appeals to anyone chasing minimal models without sacrificing results.
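The grokking transition is typically spotted by tracking held-out accuracy per epoch and finding the point where it crosses a high threshold. A tiny illustrative helper; the threshold value and the synthetic accuracy curve are made up for demonstration:

```python
def grokking_epoch(val_acc, threshold=0.99):
    """Return the first epoch index where held-out accuracy crosses
    the threshold, i.e. the 'grokking' phase transition; None if it
    never happens."""
    for epoch, acc in enumerate(val_acc):
        if acc >= threshold:
            return epoch
    return None

# Synthetic curve shaped like a grokking run: a long plateau near
# chance, then a sudden jump to perfect accuracy.
curve = [0.01] * 40 + [0.05, 0.30, 0.97, 1.00, 1.00]
assert grokking_epoch(curve) == 43
```

Plotting such a curve on a log-scaled epoch axis is the standard way to make the plateau-then-jump shape visible, which is presumably what the repo's phase-transition plots show.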

Who should use this?

ML researchers dissecting grokking or transformer scaling limits on arithmetic benchmarks. Hobbyists building toy models to understand autoregressive training on fixed-length sequences. Educators demonstrating from-scratch transformer training on addition in workshops or courses.

Verdict

Solid niche experiment with excellent docs, CLI tools, and a working checkpoint—train it yourself or eval instantly. 1.0% credibility score and 14 stars flag low maturity for anything beyond proofs-of-concept; use for inspiration, not deployment.
