Eunho-J

Local OpenAI & Anthropic compatible API server backed by ChatGPT/Codex OAuth credentials. Python, TypeScript, and Rust.

Found May 06, 2026 at 11 stars.
AI Summary

This project creates a local server compatible with OpenAI chat tools, powered by your ChatGPT login for seamless AI access.

How It Works

1
🔍 Discover the bridge

You hear about a handy tool that lets everyday apps use your ChatGPT login like a local AI helper.

2
📱 Sign into the official app

Download the free ChatGPT coding app and log in once to save your access details securely on your computer.

3
📦 Add the simple tool

Install the lightweight bridge app that connects everything together.

4
🚀 Start your local AI server

Hit one button to launch your personal AI hub – it springs to life on your machine, ready to go!

5
🔗 Link your favorite apps

Point your coding tools, chat apps, or scripts to your local helper's address.

6
💬 Chat and create

Send questions, get code ideas, generate images, or build conversations – it feels just like chatting with super-smart AI.

🎉 Unlock powerful AI magic

Enjoy instant, top-tier responses from frontier models, all powered by your own login, right from home.
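The steps above can be sketched in a few lines of Python. The port (18080) and credential path (~/.codex/auth.json) come from the review further down this page; the helper name `ready` is just for illustration, not part of the project's API.

```python
# Sketch of steps 2-5: the official CLI's `codex login` stores credentials
# in ~/.codex/auth.json, and the bridge serves on localhost:18080
# (both per this page's review section).
import os

AUTH_FILE = os.path.expanduser("~/.codex/auth.json")  # written by `codex login`
BASE_URL = "http://localhost:18080"                   # local bridge server

def ready() -> bool:
    """Step 2 done? The official app saves your access details in this file."""
    return os.path.isfile(AUTH_FILE)

# Step 5: point any OpenAI-style tool at the local address.
openai_base = BASE_URL + "/v1"
print(openai_base)
```

Once the bridge is running, any client that accepts a custom base URL can use `openai_base` in place of the official API endpoint.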


AI-Generated Review

What is codex-as-api?

Codex-as-api turns your ChatGPT/Codex OAuth credentials into a local OpenAI- and Anthropic-compatible API server running on localhost:18080. Install it via pip, npm, or cargo, point your OpenAI SDK or the Claude Code CLI at it, and hit endpoints like /v1/chat/completions or /v1/messages for streaming chat, tool calls, image generation and inspection, and Codex-specific features such as prompt caching. Built in Rust with Python and TypeScript ports, it auto-refreshes tokens from ~/.codex/auth.json after a quick codex login.

Why is it gaining traction?

It bridges the Codex CLI to standard SDKs as a local API, with no official API keys or cloud dependencies, letting local OpenAI API clients tap frontier gpt-5.5 models for coding and research. Devs love proxying the Anthropic protocol for Claude Code while unlocking reasoning-effort controls and subagent headers, a good fit for codex-proxy workflows and local GitHub Copilot alternatives.
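On the Anthropic side, the same server exposes /v1/messages (named in the review above). This hedged sketch builds a request in the standard Anthropic Messages shape; the model id is a placeholder, and whether a given client can be repointed depends on that client's own base-URL configuration.

```python
# Prepare an Anthropic-protocol request for the local bridge (not sent here).
import json
import urllib.request

body = {
    "model": "claude-placeholder",  # placeholder model id
    "max_tokens": 256,              # required field in the Anthropic Messages protocol
    "messages": [{"role": "user", "content": "Summarize this repo"}],
}
req = urllib.request.Request(
    "http://localhost:18080/v1/messages",
    data=json.dumps(body).encode(),
    headers={"Content-Type": "application/json"},
)
# Anthropic-protocol tools can target this address instead of api.anthropic.com.
print(req.full_url)
```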

Who should use this?

AI tool builders needing a local OpenAI-compatible LLM endpoint for Home Assistant or HACS integrations; backend devs swapping out remote APIs in local GitHub Actions runners; or CLI hackers running Codex Pro setups offline. Ideal for prototyping a local server instance or testing against an OpenAI-compatible API without billing surprises.

Verdict

Grab it if you have Codex credentials and want drop-in OpenAI/Anthropic compatibility; the docs and multi-language installs make it dead simple to spin up. At 11 stars and 1.0% credibility, it's early alpha (solid tests, but watch for edge cases); fork and contribute to mature this local GitHub Copilot alternative.


