codeproxy-ai / cli

@codeproxy/cli is a local proxy server that converts any Chat Completions or Anthropic Messages API into the OpenAI Responses API format. It lets Codex, Claude Code, or any Responses-API client use models from DeepSeek, GLM, Kimi, and more.

AI Summary

A local proxy server that converts various AI chat APIs into a unified format compatible with OpenAI-style clients like Codex.

How It Works

1. 🔍 Discover the Bridge

You hear about a simple local helper that lets your favorite coding assistant use any AI brain, like DeepSeek or Claude, without hassle.

2. ⚙️ Start It Up

Run one easy command with your chosen AI service details, and your personal bridge launches on your computer in seconds.
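As a concrete sketch, this is the launch command quoted in the review further down; only the API key is yours to supply:

```sh
# Start the local proxy against DeepSeek (command quoted from the review below).
npx @codeproxy/cli \
  --base-url https://api.deepseek.com/v1 \
  --model deepseek-v4-flash \
  --apikey sk-your-key
```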

3. 🔌 Connect Your Assistant

Tell your coding tool to talk to your local bridge instead, and it instantly understands the new AI like magic.
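A minimal smoke test against the documented local endpoint, assuming the proxy accepts the standard OpenAI Responses API request shape; whether the client must also authenticate is not covered on this page, since the upstream key was already supplied at launch:

```sh
# Send a Responses-API-shaped request to the local bridge.
# Client-side auth is omitted here as an assumption; the upstream API key
# was provided when the proxy started.
curl http://127.0.0.1:8787/v1/responses \
  -H "Content-Type: application/json" \
  -d '{"model": "deepseek-v4-flash", "input": "Say hello"}'
```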

4. 💬 Chat and Create

Send your questions or code ideas, and watch as your bridge smoothly translates back and forth for perfect results.

5. 🚀 Switch Brains Easily

Swap to different AIs or handle images automatically with fallback options, keeping everything fast and flexible.

✅ Perfect AI Flow

Your coding assistant now works with whichever AI model you point it at, saving time and unlocking better creativity every day.

AI-Generated Review

What is @codeproxy/cli?

This TypeScript CLI spins up a local proxy server that translates Chat Completions or Anthropic Messages APIs into the OpenAI Responses API format, letting tools like Codex or Claude Code tap into models from DeepSeek, GLM, Kimi, and others without client changes. Fire it up with `npx @codeproxy/cli --base-url https://api.deepseek.com/v1 --model deepseek-v4-flash --apikey sk-your-key`, then point clients at `http://127.0.0.1:8787/v1/responses`. It supports JSON configs for multi-upstream switching, image dropping for text-only models, and fallbacks to vision-capable endpoints.
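The JSON config schema isn't reproduced on this page, so the sketch below is only a guess at what a multi-upstream file with a vision fallback could look like; every key name and the `--config` flag are assumptions for illustration, not the documented format:

```sh
# Hypothetical config sketch -- the real schema lives in the repo's docs.
# Key names and the --config flag below are assumptions, not confirmed.
cat > codeproxy.config.json <<'EOF'
{
  "upstreams": [
    {
      "name": "deepseek",
      "baseUrl": "https://api.deepseek.com/v1",
      "model": "deepseek-v4-flash",
      "apiKey": "sk-your-key",
      "dropImages": true
    },
    {
      "name": "vision-fallback",
      "baseUrl": "https://api.example.com/v1",
      "model": "a-vision-capable-model",
      "apiKey": "sk-other-key"
    }
  ]
}
EOF
npx @codeproxy/cli --config codeproxy.config.json
```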

Why is it gaining traction?

Unlike direct API wrappers, it maintains full streaming and tool-calling compatibility while auto-inferring formats from endpoints, saving devs from custom provider configs in tools like Codex CLI. Features like per-upstream reasoning effort, thinking budgets, and CLI overrides make swapping models (e.g., via GitHub Copilot CLI setups) dead simple. The npx-first approach and GitHub Actions-friendly packaging hook users tired of API silos.
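As an illustration of what such overrides might look like on the command line: the flag names below are hypothetical (only `--base-url`, `--model`, and `--apikey` are confirmed by the quoted launch command), so check the CLI's own help or README for the real options:

```sh
# Hypothetical override flags -- names are assumptions, not confirmed options.
# Only --base-url, --model, and --apikey appear in the documented command.
npx @codeproxy/cli \
  --base-url https://api.deepseek.com/v1 \
  --model deepseek-v4-flash \
  --apikey sk-your-key \
  --reasoning-effort high \
  --thinking-budget 8192
```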

Who should use this?

Codex CLI users on Linux (including Ubuntu) or Windows experimenting with cost-effective models beyond OpenAI/Anthropic. Devs building GitHub Copilot extensions or local AI agents who need seamless multi-provider routing without rewriting clients. Teams cloning GitHub repos for AI prototyping via the gh CLI.

Verdict

Grab it if you're in the niche of Responses-API clients needing broader model access: docs are solid, tests aim high (95% coverage), and config flexibility shines. With just 10 stars and 1.0% credibility, it's early-stage; test thoroughly before production, but it's promising for CLI and GitHub Copilot workflows.
