atalovesyou/claude-max-api-proxy

Use your Claude Max/Pro subscription with any OpenAI-compatible client. Saves $1000s compared to per-token API pricing.

125 stars · 57 · 100% credibility · Found Feb 06, 2026 at 38 stars (3x growth)
AI Summary (TypeScript)

This project runs a local bridge that lets Claude Max subscribers reach Claude models through standard OpenAI-compatible chat interfaces without additional per-use fees.

How It Works

1. 💡 Discover the savings

You already pay for a pricey monthly Claude Max subscription, yet your coding apps still charge separate per-use fees. This free connector lets the subscription cover those apps instead, with no added costs.

2. 🔗 Link your subscription

You authenticate once through the Claude Code CLI, securely connecting your Max account so every request is recognized as coming from a paying subscriber.

3. 🚀 Turn on the bridge

You launch the connector with a single quick-start command, opening a private local endpoint your apps can use to reach Claude.

4. ⚙️ Hook up your favorite app

In tools like Continue.dev or any other OpenAI-compatible chat client, you point the base URL at the local address and enter a placeholder API key to switch over to Claude.

5. 💬 Chat like magic

You type a question in your app and watch Claude's replies stream in real time, just as smoothly as before.

Unlimited AI power

Now your Max subscription works in every compatible app at its flat monthly rate, saving you money on every conversation.
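Step 4 above boils down to a standard OpenAI chat-completions request aimed at the proxy. Here is a minimal TypeScript sketch of the payload an app would send, assuming the localhost:3456 endpoint described in the review below; "claude-sonnet" is a hypothetical model id used only for illustration.

```typescript
// Sketch of an OpenAI-compatible request to the local proxy.
// "claude-sonnet" is an assumed model id; query GET /v1/models for the real list.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

function buildChatRequest(model: string, messages: ChatMessage[], stream = true) {
  return {
    url: "http://localhost:3456/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: "Bearer dummy", // the proxy only needs a placeholder key
      },
      body: JSON.stringify({ model, messages, stream }),
    },
  };
}

const req = buildChatRequest("claude-sonnet", [{ role: "user", content: "Hello" }]);
console.log(req.url); // http://localhost:3456/v1/chat/completions
// With the proxy running, send it with: fetch(req.url, req.init)
```

Any OpenAI SDK does the same thing under the hood once its base URL is set to http://localhost:3456/v1 and it is given any non-empty key.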

AI-Generated Review

What is claude-max-api-proxy?

This TypeScript proxy turns your Claude Max/Pro subscription into an OpenAI-compatible API endpoint, routing requests through the Claude Code CLI to bypass per-token costs and save $1000s. Developers point any OpenAI client, such as Continue.dev or Clawdbot, at localhost:3456/v1/chat/completions and get streaming responses from Opus, Sonnet, or Haiku models with session context preserved. No extra API keys needed: install the CLI, run auth login, and start the standalone server.
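The streaming responses mentioned above arrive as OpenAI-style SSE chunks: `data:` lines carrying JSON deltas, terminated by `data: [DONE]`. A small decoding sketch, assuming that standard framing (the exact chunk shape is not confirmed by this page):

```typescript
// Decode OpenAI-style SSE chunks into plain text.
// Assumes the standard "data: {...}" / "data: [DONE]" framing.
function extractDeltas(sseText: string): string {
  let out = "";
  for (const line of sseText.split("\n")) {
    if (!line.startsWith("data:")) continue; // skip blank lines and comments
    const payload = line.slice(5).trim();
    if (payload === "[DONE]") break;         // end-of-stream sentinel
    const chunk = JSON.parse(payload);
    out += chunk.choices?.[0]?.delta?.content ?? "";
  }
  return out;
}

const sample =
  'data: {"choices":[{"delta":{"content":"Hel"}}]}\n' +
  'data: {"choices":[{"delta":{"content":"lo"}}]}\n' +
  "data: [DONE]\n";
console.log(extractDeltas(sample)); // Hello
```

In practice a client reads these chunks incrementally off the response body rather than from a complete string, but the per-line parsing is the same.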

Why is it gaining traction?

It unlocks the Claude Max plan's flat $200/month pricing for API-heavy workflows, avoiding per-token charges while supporting prompts up to 200k tokens. Streaming SSE, model listing, and zero-config auth make it a drop-in for tools like PR-review bots or GitHub Actions integrations. It also pairs well with GitHub Copilot alternatives for private-repo analysis without OAuth blocks.

Who should use this?

AI tool builders wiring Claude into GitHub PR review or MCP-based Actions workflows; DevOps engineers scripting private-repo audits; indie hackers testing long prompts in custom clients against their Max plan limits. A natural fit for Continue.dev users or Clawdbot fans who want Claude prompts without token burn.

Verdict

Grab it if you have Claude Max: solid docs and an easy setup make it usable now, though its 62 stars and 1.0% credibility score at review time signal early maturity. Test locally before production; it lacks tests but delivers the promised savings.


