JiuliNuoyi

A bidirectional Responses API ↔ Chat Completions API proxy: any backend compatible with the Chat Completions API can drive tools such as Codex that require the Responses API, with zero client-side changes. Web search is supported (simulated; requires separate configuration).

89% credibility
Found May 07, 2026 at 28 stars
AI Analysis
Python
AI Summary

CodexRosetta is a proxy that bridges different AI chat APIs, featuring a web dashboard for managing connections, a playground for testing conversations, and optional web search integration.

How It Works

1
🔍 Discover CodexRosetta

You find this tool on GitHub; it lets clients built for one AI chat API work with services that speak another.

2
🚀 Start it up

Download and launch the tool on your computer with a simple setup, and a web page opens in your browser.

3
🔑 Connect your AI helper

In the dashboard, add a connection to your AI service by entering its API key and picking the models you want.

4
💬 Start chatting

Switch to the playground, choose a model, type a question, and watch real-time responses stream in like magic.

5
⚙️ Tweak settings

Adjust options like web search or logging to make conversations even smarter and more detailed.

Cross-API chats, no client changes

Tools built for the Responses API now run against whatever backend you connected, with no changes on the client side.
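The steps above boil down to pointing any Responses-API client at the local proxy. Here is a minimal sketch of the request a client would send; the port, endpoint path, and model name are illustrative assumptions, not taken from the project's docs:

```python
import json

# Assumed local address; the real port depends on your setup.
PROXY_URL = "http://localhost:8000/v1/responses"

def build_responses_request(model: str, prompt: str, stream: bool = True) -> dict:
    """Build a Responses-API-style payload for the proxy."""
    return {
        "model": model,
        "input": [{"role": "user", "content": prompt}],
        "stream": stream,
    }

payload = build_responses_request("gpt-4o-mini", "Summarize this repo in one line.")
print(json.dumps(payload, indent=2))
# Send it with any HTTP client, e.g.:
#   requests.post(PROXY_URL, json=payload, stream=True)
```

Because the proxy speaks the Responses API on the front and Chat Completions on the back, the same payload works no matter which provider is configured behind it.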

AI-Generated Review

What is CodexRosetta?

CodexRosetta is a Python proxy server that bridges the Responses API and the Chat Completions API, letting any Chat Completions backend (OpenAI, Anthropic, Azure, and others) power tools built for the OpenAI Responses API with zero client changes. It handles bidirectional conversion of requests and streaming SSE events, and it simulates the Responses API's web search by auto-executing tool calls with providers such as Tavily or Brave. Deploy via Docker or pip install, hit the /v1/responses endpoint, or use the built-in web UI for testing.
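The core of the bridge is request translation. A simplified sketch of the Responses-to-Chat-Completions direction follows; the field handling is reduced to the basics and the shapes are the two public API formats, while the actual proxy covers tools, images, and streaming events as well:

```python
def responses_to_chat(req: dict) -> dict:
    """Map a Responses-API request onto a Chat Completions request.
    Simplified sketch: the real conversion handles many more fields."""
    messages = []
    # The Responses API carries the system prompt in "instructions".
    if req.get("instructions"):
        messages.append({"role": "system", "content": req["instructions"]})
    # "input" may be a plain string or a list of role/content items.
    inp = req.get("input", [])
    if isinstance(inp, str):
        messages.append({"role": "user", "content": inp})
    else:
        for item in inp:
            messages.append({"role": item["role"], "content": item["content"]})
    chat = {"model": req["model"], "messages": messages,
            "stream": req.get("stream", False)}
    if "max_output_tokens" in req:
        chat["max_tokens"] = req["max_output_tokens"]
    return chat

converted = responses_to_chat({
    "model": "claude-3-5-sonnet",
    "instructions": "Be terse.",
    "input": "Hello",
    "max_output_tokens": 64,
})
print(converted)
```

The reverse direction (Chat Completions responses back into Responses-API events) is the harder half, since streaming SSE chunks must be re-framed into the Responses event vocabulary.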

Why is it gaining traction?

It stands out by enabling Responses-API-style tooling on non-OpenAI backends without rewriting clients, including support for structured output, tool calls, and conversation history via optional Redis. The web search loop runs multiple rounds automatically and injects results into the conversation, which suits agents that need fresh data. Multi-key management and audit logs add production polish that is rare in early proxies.
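The multi-round search loop can be pictured as a simple driver: call the backend, and whenever it asks for a web search, execute the search and feed the results back. Everything below (function names, message shapes, the stub model and search provider) is an illustrative assumption, not the project's actual API:

```python
def run_search_loop(ask_model, search, max_rounds: int = 3) -> str:
    """Drive a simulated web-search loop: re-query the model with
    injected search results until it produces a final answer."""
    messages = []
    for _ in range(max_rounds):
        reply = ask_model(messages)
        if reply.get("tool_call") != "web_search":
            return reply["content"]          # final answer, stop looping
        messages.append({"role": "tool",     # inject results, loop again
                         "content": search(reply["query"])})
    return "search budget exhausted"

# Stubs standing in for a Chat Completions backend and a search provider.
def fake_model(messages):
    tool_msgs = [m for m in messages if m["role"] == "tool"]
    if not tool_msgs:
        return {"tool_call": "web_search", "query": "CodexRosetta proxy"}
    return {"content": f"Answer based on: {tool_msgs[-1]['content']}"}

def fake_search(query):
    return f"top results for '{query}'"

print(run_search_loop(fake_model, fake_search))
```

Capping the number of rounds matters in practice: without it, a backend that keeps requesting searches would loop forever and burn both API and search-provider quota.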

Who should use this?

AI agent developers locked into Responses API clients who want cheaper or faster backends, such as local models or Azure OpenAI. Teams building file-search or Copilot-style extensions that need MCP compatibility on top of the Responses API. Anyone tired of forking client code just to migrate between chat APIs.

Verdict

Solid early proxy (28 stars) for anyone tracking the Responses API: try it if you need cross-provider flexibility, but watch maturity given how young it is and its 89% credibility score. Pair with Starlette for custom extensions; docs and Docker make onboarding fast.
