nanogpt-community

A local proxy for OpenCode and similar OpenAI-compatible coding clients that works around NanoGPT’s often unreliable native tool calling. Instead of relying on NanoGPT to return proper tool calls directly, it sends a stricter text-based tool format upstream and converts that back into normal OpenAI-style tool calls for the client.

Found Mar 14, 2026 at 18 stars.
AI Summary

NanoProxy is a compatibility bridge that enhances tool-calling reliability between AI coding clients and NanoGPT by using a robust text protocol.
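The bridging idea can be sketched as follows. This is a hypothetical illustration, not NanoProxy's actual protocol: the block markers and JSON shape are invented, but the conversion target — an OpenAI-style `tool_calls` entry with `arguments` as a JSON string — follows the standard OpenAI chat-completions format.

```javascript
// Hypothetical text protocol: the model emits tool calls as marked blocks like
//   <<tool_call>>{"name": "read_file", "arguments": {"path": "src/app.js"}}<</tool_call>>
// This converts them into the OpenAI-style tool_calls a client expects.
function textToToolCalls(content) {
  const toolRe = /<<tool_call>>([\s\S]*?)<<\/tool_call>>/g;
  const toolCalls = [];
  let m;
  while ((m = toolRe.exec(content)) !== null) {
    const parsed = JSON.parse(m[1]);
    toolCalls.push({
      id: `call_${toolCalls.length}`,
      type: "function",
      function: {
        name: parsed.name,
        // OpenAI clients expect arguments serialized as a JSON string
        arguments: JSON.stringify(parsed.arguments ?? {}),
      },
    });
  }
  // Strip the tool blocks so the client sees only plain assistant text
  const text = content.replace(toolRe, "").trim();
  return { text, toolCalls };
}
```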

How It Works

1. 🔍 Hear about NanoProxy

You learn about NanoProxy, a small compatibility fix that makes AI coding clients work more smoothly with models whose native tool calling is unreliable.

2. 📥 Grab the files

You download the small set of files to your computer to get started.

3. Pick your way

🔌 Plug into OpenCode: edit a settings file with the exact proxy path, then restart your app.

🚀 Start a helper service: run one command to launch the proxy as a quiet background process on your machine.
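For the OpenCode route, the settings edit might look something like this — a hypothetical opencode.json fragment in which the provider name, npm package, and option keys are assumptions, not the repo's documented schema; check NanoProxy's README for the exact file and keys:

```json
{
  "provider": {
    "nanogpt-proxy": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://127.0.0.1:8787/v1",
        "apiKey": "{env:NANOGPT_API_KEY}"
      },
      "models": {}
    }
  }
}
```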

4. 🔗 Link it up

Point your coding app at this helper instead of the upstream API directly; everything connects with your usual credentials.
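Linking up amounts to swapping the base URL your client talks to. A minimal sketch, assuming the proxy exposes the standard OpenAI-compatible chat-completions route on port 8787 (the review below states the address; the model id and env var name here are hypothetical):

```javascript
// Build a request aimed at the local proxy instead of NanoGPT's API.
// The route and payload follow the standard OpenAI chat-completions shape.
const PROXY_BASE = "http://127.0.0.1:8787/v1";

function buildChatRequest(messages, model = "some-nanogpt-model") {
  return {
    url: `${PROXY_BASE}/chat/completions`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Same credentials you would send to NanoGPT directly
        Authorization: `Bearer ${process.env.NANOGPT_API_KEY ?? ""}`,
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Usage: const { url, options } = buildChatRequest([{ role: "user", content: "hi" }]);
//        const data = await (await fetch(url, options)).json();
```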

5. 💬 Start using tools

Ask your AI to read files, write code, or run tasks, and it follows through reliably without glitches.

Result: your AI completes complex coding jobs smoothly, saving you time and frustration.


AI-Generated Review

What is NanoProxy?

NanoProxy is a JavaScript-based local proxy server that fixes unreliable tool calling in NanoGPT for OpenAI-compatible coding clients such as OpenCode. It intercepts requests, swaps native tool definitions for a strict text format that NanoGPT handles reliably, then converts the responses back into standard OpenAI tool calls. Run it standalone at http://127.0.0.1:8787, as an OpenCode plugin, or via Docker.

Why is it gaining traction?

It delivers reliable tool calling without client changes or custom providers, unlike raw NanoGPT setups that fail on complex calls. Debug logging, a health check at /health, and environment variables like BRIDGE_MODELS let you tune behavior for specific models, making it a solid local proxy for development. It also batches up to 5 tool calls per turn and recovers from empty upstream responses via fallbacks.
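One way the BRIDGE_MODELS variable could work is as a comma-separated allowlist of models that get the text-protocol treatment. This is a guess at the semantics, not the repo's documented behavior — the function name and the bridge-everything-when-unset default are both assumptions:

```javascript
// Hypothetical reading of BRIDGE_MODELS as a comma-separated allowlist:
// only listed models are bridged through the strict text protocol.
function shouldBridge(model, env = process.env) {
  const raw = env.BRIDGE_MODELS;
  if (!raw) return true; // assumption: bridge everything when unset
  return raw.split(",").map((s) => s.trim()).includes(model);
}
```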

Who should use this?

Backend devs using OpenCode with NanoGPT for code-editing tasks, or anyone building local GitHub Copilot alternatives on OpenAI-compatible APIs who needs stable tool integration in their dev workflow.

Verdict

Worth a spin for NanoGPT users hitting tool-calling walls: the docs are thorough, the self-tests pass cleanly, and Docker makes it easy to try locally. At 18 stars it's early but functional; watch for maturity before relying on it in production.
