yeemio / owlcc-byoscc (Public)

Local-first Claude Code proxy for running the TypeScript source against your own LLM backends

15 stars · 3 forks · 69% credibility
Found Apr 11, 2026 at 15 stars on GitGems.
Language: TypeScript
AI Summary

OwlCC is a local proxy that runs the Claude Code interface against user-provided local AI models without requiring cloud API access.

How It Works

1. 🔍 Discover a free coding helper

You hear about OwlCC, a way to run a powerful coding assistant on your own computer, with no monthly fees and no code sent to the cloud.

2. 📥 Download the proxy

Grab the small program that translates between your local AI model and the coding assistant's interface.

3. 🧩 Add the Claude Code source

Place the coding assistant's TypeScript source files in a designated folder so the familiar interface can run locally.

4. 🧠 Start your local AI

Launch a local model server on your own machine, such as Ollama or another OpenAI-compatible backend.

5. ⚙️ Connect with a wizard

Run a quick setup that detects your local model server and configures everything automatically.

6. 🚀 Launch your private coder

Open the coding assistant and watch it think and code using your local AI, just like the online version.

Code privately forever

Enjoy coding with full privacy, switch local models anytime, and never pay API fees again.
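The wizard step above can be as simple as probing the default ports of well-known local model servers. The sketch below is hypothetical, not owlcc's real code; the `detectBackend` function and candidate list are illustrative, though the URLs are the documented defaults for Ollama and vLLM:

```typescript
// Illustrative auto-detection of a local model server. The candidate
// endpoints are the public defaults for Ollama (11434) and vLLM's
// OpenAI-compatible server (8000); the surrounding logic is an assumption.
const CANDIDATES = [
  { name: "ollama", url: "http://localhost:11434/api/tags" },
  { name: "vllm", url: "http://localhost:8000/v1/models" },
];

async function detectBackend(fetchFn: typeof fetch = fetch): Promise<string | null> {
  for (const c of CANDIDATES) {
    try {
      // Short timeout: a dead port should not stall the wizard.
      const res = await fetchFn(c.url, { signal: AbortSignal.timeout(1000) });
      if (res.ok) return c.name; // first responsive server wins
    } catch {
      // Nothing listening on this port; try the next candidate.
    }
  }
  return null;
}
```

A real wizard would then write the chosen backend's base URL into the proxy's config rather than just returning a name.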

AI-Generated Review

What is owlcc-byoscc?

Owlcc-byoscc is a local-first proxy that lets you run Claude Code's TypeScript source against your own LLM backends, such as Ollama or vLLM, skipping Anthropic's API entirely. You supply the Claude Code source tree, and the proxy translates the Anthropic protocol into OpenAI-compatible calls, bringing up the full TUI with tools, slash commands, and web search via a local SearXNG instance. Developers get a privacy-focused coding assistant that never leaves their machine.
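The protocol translation at the heart of this design can be pictured as a small mapping function. The sketch below is hypothetical, not owlcc-byoscc's actual code; the field names follow the public Anthropic Messages and OpenAI Chat Completions schemas, simplified to string-only message content:

```typescript
// Simplified request shapes: real requests also carry tool definitions,
// streaming flags, and structured content blocks.
interface AnthropicRequest {
  model: string;
  system?: string; // Anthropic carries the system prompt as a top-level field
  max_tokens: number;
  messages: { role: "user" | "assistant"; content: string }[];
}

interface OpenAIRequest {
  model: string;
  max_tokens: number;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
}

function toOpenAI(req: AnthropicRequest, localModel: string): OpenAIRequest {
  const messages: OpenAIRequest["messages"] = [];
  // OpenAI expects the system prompt as the first chat message.
  if (req.system) messages.push({ role: "system", content: req.system });
  for (const m of req.messages) messages.push(m);
  // Swap the Claude model id for whatever the local backend serves.
  return { model: localModel, max_tokens: req.max_tokens, messages };
}
```

The reverse direction (mapping the backend's completion back into an Anthropic-style response for the TUI) follows the same pattern.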

Why is it gaining traction?

It ditches the $20/month Pro fee and cloud dependency in favor of free, offline-capable operation: you can switch models mid-conversation via `/model`, and 49 auto-injected skills cover tasks like Cloudflare deploys or ASP.NET debugging. The CLI offers daemon mode, session resuming, Prometheus metrics, and training-data export, matching official features while adding circuit breakers and local web search. TypeScript users appreciate that the unmodified upstream TUI runs against any backend.
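The circuit breakers mentioned above follow a standard resilience pattern: after repeated backend failures, calls are short-circuited for a cooldown period instead of hammering a dead server. A minimal illustrative sketch, with class name and thresholds assumed rather than taken from the project:

```typescript
// Illustrative circuit breaker around backend calls; the real project's
// thresholds and API may differ.
class CircuitBreaker {
  private failures = 0;
  private openUntil = 0; // timestamp (ms) until which calls are rejected

  constructor(private maxFailures = 3, private cooldownMs = 30_000) {}

  async call<T>(fn: () => Promise<T>): Promise<T> {
    if (Date.now() < this.openUntil) {
      throw new Error("circuit open: backend temporarily disabled");
    }
    try {
      const result = await fn();
      this.failures = 0; // a success closes the circuit
      return result;
    } catch (err) {
      if (++this.failures >= this.maxFailures) {
        this.openUntil = Date.now() + this.cooldownMs; // trip the breaker
        this.failures = 0;
      }
      throw err;
    }
  }
}
```

Wrapping each backend request in `breaker.call(...)` lets the proxy fail fast while a local server is down and recover automatically once the cooldown elapses.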

Who should use this?

AI-curious backend devs building Java projects or debugging TypeScript codebases who want Claude-level tools without vendor lock-in. Local-first enthusiasts in air-gapped environments or privacy-focused teams evaluating Qwen/Llama models for code gen. Those collecting session data for fine-tuning or needing hot-swappable backends beyond official Claude.

Verdict

Grab it if local-first Claude Code appeals: solid docs, 154 tests, and a smoke suite make setup reliable, though 15 stars and a 0.70 credibility score signal early maturity. Pin your upstream source and test with Ollama first; skip it for production work that needs Opus-level reasoning.


