Cmochance

Local desktop gateway that translates OpenAI Codex CLI's Responses API into Chat Completions for Kimi / DeepSeek / Zhipu GLM / Bailian and other OpenAI-compatible providers.

Found May 06, 2026 at 12 stars.
AI Summary

A desktop app that lets OpenAI Codex CLI users connect to alternative AI providers like DeepSeek and Kimi by running a local gateway to translate and forward requests.

How It Works

1
📥 Download the app

Find the tool on GitHub and grab the easy installer for your computer.

2
🚀 Install and open

Run the installer, launch the app, and see the friendly dashboard window.

3
🔌 Connect your AI service

Click the plus button, pick a ready preset like DeepSeek, and add your service details.

4
▶️ Start forwarding

Switch to the forwarding page and hit the button to connect everything locally.

5
⚙️ Link your coding helper

Update your coding tool's settings to point to the local connection shown in the app.

6
🎉 Chat through your new provider

Open your coding tool, pick one of the mapped models such as DeepSeek, and get powerful help instantly.
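Step 5 is the only manual config change. Codex CLI reads custom provider settings from `~/.codex/config.toml`; a minimal sketch, assuming the gateway's default port 18080 and an OpenAI-style `/v1` path (the provider name and model alias below are illustrative, not the app's actual defaults):

```toml
# Illustrative ~/.codex/config.toml fragment -- check the app's dashboard
# for the exact URL it is listening on.
model = "gpt-5.5"                # alias the gateway maps to a real model ID
model_provider = "local-gateway"

[model_providers.local-gateway]
name = "Cmochance local gateway"
base_url = "http://127.0.0.1:18080/v1"
wire_api = "responses"           # Codex speaks Responses; the gateway translates
```

After restarting Codex CLI, the model picker should list the aliases the gateway advertises.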

AI-Generated Review

What is codex-app-transfer?

Codex-app-transfer is a local desktop app built in Rust with Tauri that acts as a gateway for OpenAI Codex CLI users. It translates the CLI's Responses API calls into standard Chat Completions format, forwarding them to providers like Kimi, DeepSeek, Zhipu GLM, or Alibaba Bailian via a localhost proxy on port 18080. Developers get a cross-platform desktop UI to manage API keys, model mappings (e.g., gpt-5.5 to real model IDs), ports, and live logs, with system tray support and seamless session history.
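The model-mapping step can be sketched in a few lines of Rust. This is a hypothetical reconstruction, not the project's actual code: the CLI asks for an OpenAI-style alias like `gpt-5.5`, and the proxy substitutes the real provider model ID before forwarding. The table entries here are assumed examples.

```rust
use std::collections::HashMap;

/// Build the alias -> provider-model table. Entries are illustrative,
/// not the app's shipped defaults.
fn build_model_map() -> HashMap<&'static str, &'static str> {
    let mut map = HashMap::new();
    map.insert("gpt-5.5", "deepseek-chat");       // DeepSeek preset (assumed)
    map.insert("gpt-5.5-mini", "moonshot-v1-8k"); // Kimi preset (assumed)
    map
}

/// Resolve the requested model, passing it through unchanged when no
/// mapping exists so unmapped providers still work.
fn resolve_model<'a>(
    map: &'a HashMap<&'static str, &'static str>,
    requested: &'a str,
) -> &'a str {
    map.get(requested).copied().unwrap_or(requested)
}
```

Falling back to the requested name keeps the proxy transparent: aliases get rewritten, while already-valid provider IDs are forwarded as-is.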

Why is it gaining traction?

Unlike general API routers like litellm or Claude-focused switches, it nails Codex CLI specifics: auto-rewriting response IDs, normalizing reasoning effort, caching tool calls, and handling streaming thinking content without breaking multi-turn chats. The Rust/Tauri rewrite delivers tiny signed binaries (27MB) for Windows, macOS, and Linux, plus presets for popular Chinese providers—perfect for dodging OpenAI costs or limits. Users notice instant model picker updates in Codex CLI after one config tweak.
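Two of the compatibility shims mentioned above, effort normalization and response-ID rewriting, can be sketched as follows. Function names and exact rules are assumptions for illustration, not the project's actual implementation:

```rust
/// Clamp the Responses API `reasoning.effort` field to values a
/// Chat Completions backend understands, defaulting to "medium".
/// (The exact mapping is an assumed example.)
fn normalize_effort(effort: Option<&str>) -> &'static str {
    match effort {
        Some("low") | Some("minimal") => "low",
        Some("high") => "high",
        _ => "medium",
    }
}

/// Rewrite a provider-generated completion ID into the `resp_*` shape
/// the Codex CLI expects, so multi-turn history keeps resolving.
fn rewrite_response_id(provider_id: &str) -> String {
    if provider_id.starts_with("resp_") {
        provider_id.to_string()
    } else {
        format!("resp_{provider_id}")
    }
}
```

The ID rewrite is idempotent, so replaying a cached response through the proxy does not double-prefix it.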

Who should use this?

Codex CLI power users experimenting with alternative providers as a local GitHub Copilot alternative. AI devs chaining tools in terminal workflows who want DeepSeek or Kimi without rewriting scripts. Teams running self-hosted GitHub Actions runners or remote desktop setups that need cheap, fast inference via OpenAI-compatible APIs.

Verdict

Grab it if you're locked into Codex CLI but need provider flexibility. v2.0.3 ships with release checksums and bilingual docs, though 12 stars and a 1.0% credibility score mark it as an early-stage project. Try it on a side project first: the proxy has been hardened against real provider errors, but watch for edge cases in high-volume use.


