unarbos / distil

Distil SN97 — Competitive Model Distillation on Bittensor

19 stars · 2 forks · 100% credibility
Found Apr 07, 2026 at 19 stars
AI Analysis · Python
AI Summary

A competitive Bittensor subnet where miners submit smaller distilled versions of a large AI model, and validators score them using KL-divergence to determine a single winner that receives all emissions.
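
As a rough sketch of that scoring rule, the snippet below computes the mean KL divergence from a teacher's next-token distributions to a student's over a shared prompt set. It is illustrative only and assumes both models share a tokenizer, which the subnet's actual scorer may handle differently.

```python
# Illustrative scoring rule: the lowest mean KL(teacher || student)
# across the validators' prompts wins. A sketch, not the subnet's code.
import torch
import torch.nn.functional as F

@torch.no_grad()
def mean_kl(teacher, student, tokenizer, prompts, device="cuda"):
    """Average KL from the teacher's to the student's next-token
    distributions, over every position in every prompt."""
    kls = []
    for prompt in prompts:
        ids = tokenizer(prompt, return_tensors="pt").input_ids.to(device)
        t_logp = F.log_softmax(teacher(ids).logits, dim=-1)
        s_logp = F.log_softmax(student(ids).logits, dim=-1)
        # KL(T || S) = sum_v p_T(v) * (log p_T(v) - log p_S(v))
        kl = (t_logp.exp() * (t_logp - s_logp)).sum(-1).mean()
        kls.append(kl.item())
    return sum(kls) / len(kls)
```

The miner whose model minimizes this score becomes the king and collects all emissions.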

How It Works

1. 🔍 Discover the AI Competition

You hear about a fun challenge where people shrink a huge smart AI into a smaller, faster version that thinks just like the original.

2. 🧠 Create Your Small AI

You train or find a compact AI model that's much smaller but copies the big one's smarts as closely as possible.
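
The standard recipe for this step is temperature-scaled knowledge distillation. Here is a minimal sketch of that loss (the classic Hinton formulation, assumed rather than taken from this repo):

```python
# Classic distillation loss: match the student's softened distribution
# to the teacher's. Illustrative; the repo may train differently.
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL between temperature-softened teacher and student outputs."""
    s_logp = F.log_softmax(student_logits / T, dim=-1)
    t_prob = F.softmax(teacher_logits / T, dim=-1)
    # batchmean KL, scaled by T^2 to keep gradients comparable across T
    return F.kl_div(s_logp, t_prob, reduction="batchmean") * (T * T)
```

In practice this is usually mixed with a plain cross-entropy term on ground-truth tokens.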

3. 📤 Share Your Model Online

You upload your small AI to a public sharing site so everyone can see and download it safely.
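
For this step, publishing via the huggingface_hub client is one straightforward route; the repo id and local path below are placeholders:

```python
# Upload a local checkpoint directory to the HuggingFace Hub (step 3).
# Assumes you are logged in via `huggingface-cli login`.
from huggingface_hub import HfApi

api = HfApi()
api.create_repo("your-org/my-distilled-model", repo_type="model", exist_ok=True)
api.upload_folder(
    folder_path="./my-distilled-model",      # local model directory
    repo_id="your-org/my-distilled-model",   # placeholder repo id
    repo_type="model",
)
```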

4. 🔗 Join the Network and Commit

You sign up on the special network with your unique ID and permanently commit your model to the contest.
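
The project ships its own CLI for this, but the underlying mechanism is a Bittensor chain commitment. A hedged sketch with the generic SDK (wallet names, network, and payload format are assumptions):

```python
# Hedged sketch of the on-chain commitment (step 4) with the generic
# bittensor SDK; the repo's CLI likely wraps something similar.
import bittensor as bt

wallet = bt.wallet(name="miner", hotkey="default")   # your registered keys
subtensor = bt.subtensor(network="finney")
# Commit the HuggingFace repo id under this hotkey on subnet 97.
subtensor.commit(wallet, netuid=97, data="your-org/my-distilled-model")
```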

5. ⚔️ Watch the Epic Showdown

Judges automatically test your model against others and the current champion on secret questions to see who matches the big AI best.
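
A sketch of that showdown, assuming a per-prompt scorer like the mean-KL function earlier (passed in here as per_prompt_kl, a hypothetical callable returning one KL per prompt, lower is better):

```python
# Step 5 sketch: score the king and every challenger on the same hidden
# prompts. Per-prompt scores are kept so dethronement can be decided
# with a paired statistical test (see the review below).
from statistics import mean

def showdown(per_prompt_kl, king, challengers, prompts):
    """Rank models by mean KL to the teacher; lower is better."""
    scores = {"king": [per_prompt_kl(king, p) for p in prompts]}
    for name, model in challengers.items():
        scores[name] = [per_prompt_kl(model, p) for p in prompts]
    ranking = sorted(scores, key=lambda name: mean(scores[name]))
    return ranking, scores
```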

6. 📊 Check the Live Leaderboard

You visit the dashboard to see scores update in real-time and track if your model climbs to the top.
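
Checking standings programmatically might look like this; the endpoint URL and response shape are hypothetical stand-ins, so consult the repo's API docs for the real ones:

```python
# Poll a leaderboard endpoint (step 6). URL and JSON fields below are
# hypothetical placeholders, not the project's documented API.
import requests

resp = requests.get("https://api.example.com/sn97/leaderboard", timeout=10)
resp.raise_for_status()
for entry in resp.json()["entries"]:
    print(entry["rank"], entry["model"], entry["score"])
```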

7. 👑 Claim the Crown and Rewards

Your model becomes the reigning king, earning all the prizes while others try to dethrone you!
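
On the validator side, winner-takes-all reduces to putting full weight on a single uid. A hedged sketch with the generic Bittensor SDK (the subnet's validator code may differ; the uid is a placeholder):

```python
# Express "all emissions to the king" (step 7) via subnet weights.
# Sketch only; the real validator picks the uid from its evaluations.
import bittensor as bt

wallet = bt.wallet(name="validator", hotkey="default")
subtensor = bt.subtensor(network="finney")
king_uid = 42  # placeholder uid of the reigning model's miner
subtensor.set_weights(
    wallet=wallet,
    netuid=97,
    uids=[king_uid],
    weights=[1.0],  # full weight on the single winner
)
```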

AI-Generated Review

What is distil?

Distil is a Python-based Bittensor subnet (SN97) from Distil Labs on GitHub, where miners competitively distill large models like Qwen3.5-35B-A3B into compact versions under 5.25B params. You upload your distilled model to HuggingFace, commit it on-chain via a simple CLI, and validators score it against the "king" using full-distribution KL divergence on shared prompts—winner takes all emissions. A live web monitor dashboard and public API track scores, commitments, and king dethronements in real time.
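
Given the 5.25B-parameter cap mentioned above, a quick pre-submission size check is easy to run locally (the path is a placeholder; confirm the exact limit against the repo's rules):

```python
# Sanity-check the size cap before committing (limit per this review).
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("./my-distilled-model")
n_params = sum(p.numel() for p in model.parameters())
assert n_params < 5.25e9, f"{n_params / 1e9:.2f}B params exceeds the 5.25B cap"
print(f"OK: {n_params / 1e9:.2f}B params")
```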

Why is it gaining traction?

It stands out with robust anti-gaming measures such as SHA-256 duplicate detection, paired t-tests for dethronements, and efficient GPU eval that only tests challengers plus the top 5—no full re-evals every epoch. Developers hook into pre-submission checkers, auto-benchmarking against baselines, and even chat endpoints to query the reigning king model. The distil web monitor provides instant visibility into competitive model distillation races on Bittensor.
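
Sketches of those two safeguards, with assumed thresholds and file handling: a SHA-256 digest to catch duplicate weight files, and a one-sided paired t-test gating dethronement:

```python
# Illustrative versions of the anti-gaming checks named above; the
# repo's thresholds and exact procedure may differ.
import hashlib
from scipy.stats import ttest_rel

def file_sha256(path, chunk=1 << 20):
    """Digest of a weights file; identical digests flag a duplicate."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while blob := f.read(chunk):
            h.update(blob)
    return h.hexdigest()

def should_dethrone(king_kls, challenger_kls, alpha=0.05):
    """Dethrone only if the challenger's per-prompt KLs are
    significantly lower than the king's (one-sided paired t-test)."""
    _, p = ttest_rel(challenger_kls, king_kls, alternative="less")
    return p < alpha
```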

Who should use this?

Bittensor miners building distilled models for edge deployment, validators running GPU pods for subnet 97, or ML engineers experimenting with knowledge distillation from giants like Qwen. Ideal for teams wanting to earn TAO rewards through better small-model compression without quantization hacks.

Verdict

Solid for Bittensor devs—excellent docs, CLI tools, and API make it production-ready; despite only 19 stars, it carries a 100% credibility score. Early maturity means watch for edge cases, but jump in if you're distilling on-chain.
