TiaraBasori / opencode2api

Converts a local OpenCode runtime into an OpenAI-compatible API gateway. Use free models (GPT, Nemotron, MiniMax) from any OpenAI client.

80% credibility
Found Apr 11, 2026 at 11 stars
AI Summary

Language: JavaScript

This project bridges the local OpenCode runtime to an OpenAI-compatible interface, so free models such as GPT, Nemotron, and MiniMax can be used from standard AI clients.

How It Works

1. 🔍 Discover Free AI Power

You hear about a way to use powerful free AI models like GPT in your favorite chat apps without paying.

2. 📥 Grab the Helper Tool

You download the simple bridge tool that makes free AI work just like paid services.

3. 🔒 Set Your Private Password

You choose a secure password to keep your AI setup personal and safe.

4. 🚀 Launch with One Click

You start the bridge, and it automatically sets up the free AI backend in seconds.

5. 🔗 Connect Your Chat App

You point your existing AI chat tool at this local bridge, and it works seamlessly.

6. 💬 Start Chatting

You ask questions, and the AI responds with smart, streaming answers just like premium services.

🎉 Enjoy Free AI

Now you have open-ended access to advanced AI conversations in any app, all for free and running locally on your computer.
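The flow above boils down to pointing any OpenAI client at the local bridge, with the password you chose acting as the API key. A minimal sketch of what that request looks like -- the port, base URL, and password below are illustrative assumptions, not values from the project's docs:

```javascript
// Sketch only: build an OpenAI-compatible chat request aimed at the local
// bridge. BRIDGE_BASE_URL and BRIDGE_PASSWORD are assumed example values.
const BRIDGE_BASE_URL = "http://localhost:8787/v1"; // assumed local port
const BRIDGE_PASSWORD = "my-secret-password";       // the password you set

function buildChatRequest(model, userMessage) {
  return {
    url: `${BRIDGE_BASE_URL}/chat/completions`,
    headers: {
      "Content-Type": "application/json",
      // The private password travels as a standard Bearer token.
      Authorization: `Bearer ${BRIDGE_PASSWORD}`,
    },
    body: JSON.stringify({
      model,
      stream: true, // the bridge streams answers like premium services
      messages: [{ role: "user", content: userMessage }],
    }),
  };
}

const req = buildChatRequest("gpt5-nano", "Hello!");
// Send with: fetch(req.url, { method: "POST", headers: req.headers, body: req.body })
```

Because the request shape is plain OpenAI, most clients only need their base URL and API key settings changed -- no code changes.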

AI-Generated Review

What is opencode2api?

opencode2api turns your local OpenCode runtime into an OpenAI-compatible API gateway, letting you plug free models like GPT, Nemotron, and MiniMax into any OpenAI client without changing a line of code. Built in JavaScript with Node.js and Docker support, it exposes standard endpoints such as /v1/models, /v1/chat/completions, and /v1/responses, complete with streaming SSE output and reasoning controls via reasoning_effort. The result is a drop-in replacement for the OpenAI API, gated by a locally configured API key instead of a paid subscription.
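Since the gateway streams over standard SSE on /v1/chat/completions, any OpenAI stream parser works unchanged. A minimal sketch of extracting assistant text from SSE lines -- the chunk shapes follow the generic OpenAI streaming format, nothing here is specific to this repo:

```javascript
// Sketch: pull incremental assistant text out of OpenAI-style SSE lines.
// Each event line looks like `data: {...chunk...}`, and the stream ends
// with the sentinel `data: [DONE]`.
function collectStreamText(sseLines) {
  let text = "";
  for (const line of sseLines) {
    if (!line.startsWith("data: ")) continue; // skip blanks and comments
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") break;          // end-of-stream sentinel
    const chunk = JSON.parse(payload);
    // Streaming chunks carry text deltas in choices[0].delta.content.
    text += chunk.choices?.[0]?.delta?.content ?? "";
  }
  return text;
}

// Example with two synthetic chunks:
const lines = [
  'data: {"choices":[{"delta":{"content":"Hel"}}]}',
  'data: {"choices":[{"delta":{"content":"lo"}}]}',
  "data: [DONE]",
];
// collectStreamText(lines) === "Hello"
```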

Why is it gaining traction?

It stands out by auto-launching the OpenCode backend in Docker for one-command deploys, while handling model resolution (e.g., the gpt5-nano alias) and disabling tools by default for safe local use. Unlike basic proxies, it supports full streaming and works seamlessly with any tool that expects an OpenAI-compatible API, making it a quick win for existing client integrations without vendor lock-in. Configurable environment variables for timeouts, cleanup, and prompt modes add polish that users notice in production.
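The alias handling mentioned above is essentially a lookup with a pass-through fallback. One plausible shape for it -- the mapping below is invented for illustration and is not the project's actual alias table:

```javascript
// Hypothetical alias table; the real mapping lives inside opencode2api.
const MODEL_ALIASES = {
  "gpt5-nano": "opencode/gpt-5-nano", // assumed upstream model id
};

function resolveModel(requested) {
  // Fall back to the requested name when no alias matches, so
  // fully-qualified model ids pass through untouched.
  return MODEL_ALIASES[requested] ?? requested;
}
```

The pass-through fallback matters: clients that already send full upstream ids keep working without any alias entry.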

Who should use this?

Local AI tinkerers who want OpenAI clients like Cursor or VS Code extensions to hit free models through an OpenAI-compatible provider. Teams evaluating alternatives to paid copilot subscriptions to cut costs. Frontend devs wiring up chat UIs, or backend services that need completion endpoints without cloud bills.

Verdict

Grab it if you need instant local access to free models -- solid docs and Docker make setup trivial, despite 11 stars signaling early maturity. At 80% credibility, it's promising for prototypes, but watch for stability in high-load scenarios and test thoroughly before production.
