mitsuhiko / pi-ds4

Public

Run deepseek4 locally on metal right from within Pi

100% credibility
Found May 08, 2026 at 46 stars by GitGems.
AI Analysis
TypeScript
AI Summary

This project is an extension for the pi coding agent that runs the DeepSeek V4 Flash model locally, via the ds4 runtime, on high-memory macOS machines.

How It Works

1
🖥️ Discover pi-ds4

You hear about a handy add-on that lets your pi coding helper use a powerful local AI brain on your big-memory Mac.

2
📥 Install easily

In your pi chat, you simply tell it to add the extension, and it handles everything.

3
🚀 Unlock local smarts

Your pi now knows about the new super-smart model called deepseek-v4-flash, ready for action.

4
🔄 Refresh pi

You reload pi or restart it, and the new helper is waiting.

5
🗣️ Pick the model

When chatting or asking for code help, choose the local deepseek model.

6
Automatic setup

The first time, it quietly prepares the AI brain in the background, downloading what it needs and starting it up.

7

🎉 Turbocharged coding

Now you get lightning-fast, private AI responses right on your machine, making coding fun and speedy.
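The "automatic setup" step can be sketched as a tiny once-only state machine: download, build, and start run on first use, then the running server is reused. Everything here (the class name, the step hooks) is a hypothetical illustration, not pi-ds4's actual API; only the endpoint URL comes from the page itself.

```typescript
// Hypothetical sketch of the one-time lazy setup: download, build,
// and start run once on first use, then the server is simply reused.
// Class and hook names are illustrative, not pi-ds4's real API.
type Step = () => void;

class Ds4Runtime {
  private started = false;

  constructor(private steps: { download: Step; build: Step; start: Step }) {}

  // Returns the server's base URL, performing one-time setup if needed.
  ensureStarted(): string {
    if (!this.started) {
      this.steps.download(); // fetch quantized weights
      this.steps.build();    // build the ds4 server if missing
      this.steps.start();    // launch the local server
      this.started = true;
    }
    return "http://127.0.0.1:8000/v1"; // OpenAI-compatible endpoint
  }
}
```

Calling `ensureStarted()` a second time skips straight to returning the URL, which is why later chats feel instant after the first slow run.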

AI-Generated Review

What is pi-ds4?

pi-ds4 is a TypeScript extension for the Pi coding agent that runs DeepSeek V4 Flash (ds4) locally on your Mac's metal, right from within Pi. It registers the ds4/deepseek-v4-flash model, auto-downloads quantized weights (q2 for 128 GB of RAM, q4 for 256 GB+), builds the ds4 server if needed, and starts it on demand, exposing an OpenAI-compatible API at http://127.0.0.1:8000/v1. Run `pi install https://github.com/mitsuhiko/pi-ds4` to add it, then select the model in Pi chats; type `/ds4` to see live server logs.
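The RAM-based quantization choice described above (q2 at 128 GB, q4 at 256 GB and up) reduces to a one-line heuristic. The function name and the exact 256 GB cutoff are assumptions based on this summary, not pi-ds4's actual code; the real extension presumably also honors the DS4_MODEL_QUANT override mentioned elsewhere on this page.

```typescript
// Sketch of the RAM-based quantization choice the review describes:
// q4 on 256 GB+ machines, q2 otherwise. pickQuant is a hypothetical
// name; the threshold is inferred from the "q2 for 128GB, q4 for
// 256GB+" wording above.
function pickQuant(ramGb: number): "q2" | "q4" {
  return ramGb >= 256 ? "q4" : "q2";
}
```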

Why is it gaining traction?

It removes the hassle of manual local LLM setup: it handles builds, model downloads from Hugging Face, per-process leases, and watchdog-driven auto-shutdown when idle, keeping the 100k-token context window responsive without babysitting. Developers like the seamless Pi integration for running ds4 locally, trading cloud latency for on-metal inference on beefy hardware. Environment overrides (DS4_MODEL_QUANT, DS4_RUNTIME_DIR) let you pin a quantization or point it at a custom ds4 checkout.

Who should use this?

Pi users on M-series Macs with 128 GB+ of RAM who want to test local DeepSeek V4 on coding tasks, such as reasoning-heavy code generation or long-context analysis. It's also a fit for AI devs dodging API rate limits or wanting a local, lock-in-free alternative to hosted coding assistants like GitHub Copilot. Skip it if you're on a lower-spec machine or prefer hosted models.

Verdict

Worth a spin for Pi power users who want ds4 running locally: it installs cleanly and delivers fast local inference out of the box. But with just 46 stars and a 100% credibility score, it's still early-stage; the docs are solid, but expect tweaks as it matures.
