Euphie

A lightweight reverse proxy for Anthropic / OpenAI compatible Coding Plan APIs from domestic Chinese providers such as JDCloud and Baidu. When the upstream service is overloaded, it automatically waits and retries, so clients like Claude Code don't abort their tasks because of API errors.

69% credibility
Found Apr 18, 2026 at 10 stars
AI Analysis (Go)
AI Summary

This tool acts as a smart middleman for AI chat services, automatically retrying failed requests during peak times and tracking your usage.

How It Works

1
🕵️ Find the Reliability Booster

You hear about a handy tool that makes AI chat services more dependable by handling busy times smoothly.

2
📥 Bring It Home

Clone or download the package to your computer – it ships with a Makefile and Docker setup, so it's ready to go.

3
⚙️ Pick Your AI Friend

Choose an upstream provider – JDCloud, Baidu, or a custom endpoint – in the settings so your chats always connect reliably.

4
🚀 Turn It On

Launch the helper service with a single command, and watch it start listening for your requests.

5
🔗 Link Your Chats

Tell your favorite AI apps to talk through this helper for smoother experiences.

6
💬 Chat Without Worry

Send messages to AI, and it automatically retries if the service is temporarily overloaded.

7
📊 See Your Usage

Check the built-in dashboard anytime to track how much chatting you've done.

🎉 Seamless AI Magic

You now have reliable, tracked access to powerful AI conversations without interruptions.


AI-Generated Review

What is anthropic-proxy?

Anthropic-proxy is a lightweight Go reverse proxy that lets you use domestic Chinese providers such as JDCloud or Baidu as drop-in Anthropic- and OpenAI-compatible APIs. Point Claude Code or other clients at its endpoint (default :8080), and it forwards requests while auto-retrying on overload errors like "overloaded" or "Too many requests," preventing task interruptions. Optional SQLite-backed stats, exposed via /stats, track token usage across the Anthropic and OpenAI protocols, with a simple dashboard.

Why is it gaining traction?

It stands out by handling the real-world flakiness of Chinese Anthropic/OpenAI-compatible endpoints: configurable retry rules with jitter keep requests flowing without client-side hacks. Docker Compose deploys it in seconds, the YAML config supports multiple providers or custom upstreams, and async token parsing gives instant cost insights via a JSON API or the UI. Developers reach for it when the official services choke under peak load.

Who should use this?

Backend developers integrating Claude Code skills or agents with JDCloud/Baidu APIs for cost savings; teams building computer-use or copilot-style features that need an Anthropic/OpenAI endpoint that doesn't flake at peak load; and Chinese developers dodging rate limits in coding workflows.

Verdict

A solid niche pick for Anthropic-compatible proxying -- deploy it if you're hitting overload walls, but with just 10 stars and a 69% credibility score, treat it as experimental; fork and test heavily before production. Docs are basic, but the Makefile and Docker setup make spinning it up painless.


