nicedreamzapp

Run Claude Code with local AI on Apple Silicon. 122B model at 41 tok/s with Google TurboQuant. No cloud, no API fees.

Found Mar 26, 2026 at 27 stars.
AI Analysis
Python
AI Summary

This repository enables running an AI coding assistant locally on Apple Silicon Macs by providing setup scripts, a compatible server, and model download tools for offline use.

How It Works

1. 🔍 Discover local AI coding

You hear about a way to run a powerful AI assistant for writing and editing code entirely on your MacBook, keeping everything private and free.

2. 💻 Check your setup

Check whether your Apple Silicon Mac has enough memory for the model – recent machines with plenty of RAM handle it well.

3. 🚀 Run easy setup

Follow the step-by-step instructions, or use the one-click helper to prepare your Mac automatically.

4. 📥 Grab the AI model

Download the model weights once – like installing a large app – and they are saved locally on your machine.

5. Launch with one click

Double-click the desktop shortcut and your local AI coding assistant starts up, running entirely on your Mac's own hardware.

6. 📱 Use from Mac or phone

Chat with the AI to write code, manage files, or even control your browser, and send commands from your phone if you want.

🎉 Code privately and fast

Enjoy fast AI help for all your coding projects – offline, at no cost, with nothing leaving your Mac.
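The memory check in step 2 comes down to simple arithmetic: model weights need roughly parameters × bits-per-weight ÷ 8 bytes. A minimal sketch (the 122B figure comes from this page; the 4-bit quantization level is an assumption for illustration):

```python
def model_memory_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate RAM needed for model weights alone, in decimal GB.

    Real usage is higher once the KV cache and activations are added.
    """
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 122B model quantized to 4 bits per weight needs about 61 GB for the
# weights alone -- which is why 64GB+ Macs are the practical floor.
print(model_memory_gb(122, 4))  # 61.0
```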

AI-Generated Review

What is claude-code-local?

Claude-code-local lets you run Claude Code locally on Apple Silicon Macs using a 122B-parameter LLM such as Qwen 3.5, hitting 65 tokens per second via Apple's MLX framework—all in Python, with no cloud dependency or API fees. It solves the mismatch between Claude Code's Anthropic API and local models by providing a local server that you point Claude Code at via environment variables such as ANTHROPIC_BASE_URL=http://localhost:4000. Developers get the full Claude Code experience—code editing, project management, browser control—offline and private.
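Redirecting Claude Code to the local server is a matter of overriding environment variables before launching it. A minimal sketch, with assumptions flagged: port 4000 comes from the snippet above, while the dummy API-key value is a guess (a local server typically ignores it, but your server's docs may name a different variable):

```python
import os
import subprocess

def local_claude_env(base_url: str = "http://localhost:4000") -> dict:
    """Build environment overrides that redirect Claude Code from
    api.anthropic.com to a local Anthropic-compatible server."""
    env = dict(os.environ)
    env["ANTHROPIC_BASE_URL"] = base_url
    # Assumption: a placeholder key; most local servers ignore its value.
    env["ANTHROPIC_API_KEY"] = "local-dummy-key"
    return env

# Launch Claude Code against the local server (requires the CLI installed):
# subprocess.run(["claude"], env=local_claude_env())
```

Keeping the override in a helper like this (rather than exporting it globally) means the same shell can still run Claude Code against the cloud API when needed.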

Why is it gaining traction?

It skips the proxy overhead that slows Ollama or llama.cpp setups, delivering 7.5x faster real-world tasks at zero cost, plus iPhone control via iMessage for remote use. It runs Claude Code for free against a local model, supports real browser sessions for logged-in workflows, and installs easily via scripts or a desktop launcher. The hook: a massive local model outperforming cloud Opus on speed, right on your Mac, with no data leaks.
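Throughput figures translate directly into wait time; a quick back-of-the-envelope check (the 41 tok/s figure comes from this page's headline, and the 1,000-token reply length is illustrative):

```python
def generation_seconds(tokens: int, toks_per_sec: float) -> float:
    """Wall-clock seconds to stream `tokens` at a given throughput."""
    return tokens / toks_per_sec

# At 41 tok/s, a 1,000-token reply streams in roughly 24 seconds.
print(round(generation_seconds(1000, 41), 1))  # 24.4
```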

Who should use this?

Backend devs on M-series Max/Ultra Macs (64GB+ RAM) building sensitive client projects offline—think airplane coding sessions. Also AI tinkerers who want to run Claude Code against a local LLM for free, or teams ditching API bills in favor of local MCP server setups. Avoid it if you need top-tier reasoning or lack the Apple hardware.

Verdict

Worth trying for Apple Silicon power users who want to run Claude Code in the terminal against a local, isolated model—solid docs and benchmarks make setup straightforward. At 27 stars and a 1.0% credibility score, it's early-stage and hardware-specific; test it on your own machine before relying on it daily.


