wanghao9610

Run Claude on multiple AI providers (Vibe Coding with Codex)

100% credibility
Found May 08, 2026 at 14 stars
JavaScript
AI Summary

A local gateway proxy that routes Claude Desktop and Claude Code requests to various AI providers using familiar Claude model names.

How It Works

1
📰 Discover the Claude Model Proxy

You hear about a simple tool that lets your Claude app use powerful AI models from different services without changing how you chat.

2
📥 Download and prepare

Grab the files from the project page and get ready to set it up on your computer.

3
🔗 Connect your AI services

Link up the AI accounts you want to use by adding their access details, so the tool knows where to send your requests.
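The "access details" here are API keys placed in a .env file. A hypothetical sketch of what that might look like; the exact variable names depend on which providers you enable, so check the project's own example file for the real ones:

```shell
# Hypothetical .env fragment -- variable names are assumptions,
# not confirmed from the project; only the providers you use are needed.
DEEPSEEK_API_KEY=sk-...
OPENAI_API_KEY=sk-...
GEMINI_API_KEY=...
```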

4
▶️ Start the helper

Run the included start script (./start.sh) to launch your personal AI switchboard on your local machine.
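The script supports start/stop/status/restart. A generic sketch of how a wrapper like this typically dispatches those commands; the real script's internals (PID file location, process handling) may differ:

```shell
#!/bin/sh
# Illustrative daemon-control dispatcher in the style of start.sh.
# The PID file path and messages are assumptions for the example.
PIDFILE="${PIDFILE:-/tmp/claude-model-proxy.pid}"

proxy_ctl() {
  case "$1" in
    start)   echo "starting proxy (pid file: $PIDFILE)" ;;
    stop)    echo "stopping proxy" ;;
    status)  [ -f "$PIDFILE" ] && echo "running" || echo "stopped" ;;
    restart) proxy_ctl stop; proxy_ctl start ;;
    *)       echo "usage: $0 {start|stop|status|restart}" ;;
  esac
}

proxy_ctl status
```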

5
⚙️ Point Claude to the switchboard

In your Claude Desktop or Claude Code settings, point the app at your local helper's loopback address (http://127.0.0.1:8787 by default).
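For Claude Code this is done with environment variables; a minimal sketch using the ANTHROPIC_BASE_URL variable and default gateway address mentioned in the review:

```shell
# Point Claude Code at the local gateway instead of the Anthropic API.
# Any auth/token variables the proxy may also expect are not shown here;
# check the repo's README for the full set.
export ANTHROPIC_BASE_URL="http://127.0.0.1:8787"
```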

6
💬 Pick a model and chat

Choose from Claude-style model names in your app, and start having conversations powered by your chosen AIs.

🎉 Enjoy enhanced AI talks

Now you can use models from multiple providers through your familiar Claude interface, saving time and unlocking more options.


AI-Generated Review

What is claude-model-proxy?

claude-model-proxy is a Node.js HTTP server that lets Claude Desktop and Claude Code route requests to providers such as DeepSeek, Moonshot, GLM, Xiaomi MiMo, OpenAI, Gemini, Qwen, or Anthropic through a local gateway at http://127.0.0.1:8787. It maps Claude-style model names (e.g., claude-deepseek-v4-pro) to upstream equivalents, rewrites responses for seamless compatibility, and handles text, images, and streaming. You set API keys in .env, control the daemon with ./start.sh (start/stop/status/restart), and verify it with curl /healthz, so you can run Claude Code locally without changing client configs.
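The name mapping at the heart of the proxy can be pictured as a simple lookup from Claude-style alias to upstream model id. This is illustrative only: the aliases and upstream names below are invented for the example and are not the project's real MODEL_MAP entries.

```shell
# Illustrative lookup: translate a Claude-style model alias to an
# upstream provider model id. Entries here are made up for the sketch;
# the real proxy reads its table from configuration.
map_model() {
  case "$1" in
    claude-deepseek-*) echo "deepseek-chat" ;;
    claude-qwen-*)     echo "qwen-max" ;;
    claude-glm-*)      echo "glm-4" ;;
    *)                 echo "$1" ;;   # unknown names pass through unchanged
  esac
}

map_model claude-deepseek-v4-pro
```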

Why is it gaining traction?

It plugs directly into Claude Desktop via a buildable MCPB extension, and into Claude Code via environment variables like ANTHROPIC_BASE_URL, enabling tool-use passthrough on Anthropic-compatible backends. MODEL_MAP and MODEL_ROUTES can be customized through environment variables for instant provider swaps, giving it Claude-specific smarts that generic proxies lack. A macOS LaunchAgent setup avoids startup warnings, making local runs reliable.
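MODEL_MAP and MODEL_ROUTES are named above, but their value syntax is not documented here. A purely hypothetical fragment to show the idea of overriding the mapping via the environment; consult the repo's README for the real format:

```shell
# Hypothetical override -- the JSON-in-env format is an assumption,
# not confirmed from the project.
export MODEL_MAP='{"claude-deepseek-v4-pro": "deepseek-chat"}'
```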

Who should use this?

Claude Desktop users swapping in cheaper DeepSeek or Kimi models for daily chats. Terminal coders running Claude Code locally with Qwen or GLM for low-latency autocompletions. AI experimenters testing multi-provider fallbacks without rewriting apps.

Verdict

Worth a spin for Claude-ecosystem devs despite 14 stars and a 1.0% credibility score: docs are thorough and tests exist, but it's niche and early-stage. Solid for local proxying; production teams may want more battle-testing.


