simonw

Access OpenAI models via an existing Codex subscription

Found Apr 24, 2026 at 17 stars
Language: Python
AI Summary

A plugin for the LLM command-line tool that provides access to advanced OpenAI models using authentication from an OpenAI Codex CLI installation.

How It Works

1
🔍 Discover the plugin

llm-openai-via-codex is a plugin for the LLM command-line tool that exposes OpenAI models through the authentication of an existing OpenAI Codex CLI installation, with no separate API key.

2
📥 Install the LLM CLI

Install the LLM command-line tool if you don't already have it.

3
🔗 Install the plugin

Run `llm install llm-openai-via-codex` to register the plugin with LLM and link it to your subscription.

4
🔑 Sign into your account

Log in once with your OpenAI credentials using the companion Codex CLI; the plugin reuses those credentials.

5
📋 List the available models

Run `llm models -q openai-codex` to see which models your subscription exposes.

6
💭 Run a prompt

Send a prompt such as 'Make an SVG of a pelican riding a bicycle' to one of the listed models.

7
🎉 Get the output

Responses come back in the terminal, powered by your existing subscription.
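As a minimal sketch of the mechanism the steps above rely on: a plugin like this can simply read the token the Codex CLI cached at login and reuse it as a bearer credential. Note that the auth file location (`~/.codex/auth.json`) and its JSON layout below are assumptions for illustration, not the plugin's documented format:

```python
import json
from pathlib import Path


def load_codex_token(auth_path: Path) -> str:
    """Read an access token from a Codex-CLI-style auth file.

    The file path and key names here are assumptions for illustration;
    the real CLI may store or name things differently.
    """
    data = json.loads(auth_path.read_text())
    return data["tokens"]["access_token"]


def auth_header(token: str) -> dict:
    """Build the Authorization header an OpenAI-style API expects."""
    return {"Authorization": f"Bearer {token}"}


if __name__ == "__main__":
    import tempfile

    # Simulate a cached Codex login with a throwaway auth file.
    with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
        json.dump({"tokens": {"access_token": "sk-example"}}, f)
    token = load_codex_token(Path(f.name))
    print(auth_header(token))  # {'Authorization': 'Bearer sk-example'}
```

The point is only that no new secret is minted: whatever credential the Codex CLI already holds is forwarded on each request.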

AI-Generated Review

What is llm-openai-via-codex?

This Python plugin for the LLM CLI lets you access OpenAI models such as gpt-5.5 through your existing Codex subscription, with no separate API key to manage. Install it with `llm install llm-openai-via-codex`, make sure the OpenAI Codex CLI is authenticated, list the available models with `llm models -q openai-codex`, and run prompts like `llm -m openai-codex/gpt-5.5 "Generate code"`. By borrowing authentication from Codex, it removes the hassle of juggling multiple OpenAI access points and gives you terminal access to advanced models, including GPT-4o equivalents.
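Because the commands above are plain CLI invocations, they are easy to script. A small stdlib-only wrapper is sketched below; the model id `openai-codex/gpt-5.5` is taken from the review's example, and shelling out via `subprocess` is one way you might wire it up, not part of the plugin itself:

```python
import subprocess
from typing import List


def build_llm_command(model: str, prompt: str) -> List[str]:
    """Build the argv for an llm prompt call, mirroring the review's
    example `llm -m openai-codex/gpt-5.5 "Generate code"`."""
    return ["llm", "-m", model, prompt]


def run_prompt(model: str, prompt: str) -> str:
    """Shell out to the llm CLI and return its stdout.

    Requires llm and this plugin to be installed, and the Codex CLI
    to be authenticated, so it will fail in a bare environment.
    """
    result = subprocess.run(
        build_llm_command(model, prompt),
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout


if __name__ == "__main__":
    # Show the argv that would be executed, without actually calling llm.
    print(build_llm_command("openai-codex/gpt-5.5", "Generate code"))
```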

Why is it gaining traction?

It stands out by piggybacking on an existing ChatGPT/Codex subscription rather than a metered API key, turning a coding agent's credentials into a general AI gateway with streaming, tool, and image support. Developers like the seamless LLM integration, with no new billing or keys to set up, as a quick way to reach premium models from terminals or scripts. The hook: advanced models such as o3 become reachable at no cost beyond the subscription you already pay for.

Who should use this?

CLI-heavy developers with Codex subscriptions who want to experiment with OpenAI's newer GPT models from the terminal. Backend engineers automating workflows with terminal OpenAI calls, or data scientists who need quick model tests without token management. Skip it if you're not already in the LLM or Codex ecosystem.

Verdict

Worth a spin for LLM users who already have Codex auth set up; Simon Willison's involvement adds reliability despite the 17 stars and alpha status. Early maturity and thin test coverage mean you should treat it as experimental rather than production-ready.

