leemysw

Token-Share: A native macOS menu bar LLM API gateway — translate and stream between OpenAI Chat Completions, OpenAI Responses, and Anthropic Messages protocols locally.

100% credibility
Found Apr 05, 2026 at 12 stars
AI Analysis
Swift
AI Summary

Token-Share is a native macOS menu bar app that runs a local gateway to translate and stream requests between AI chat protocols such as OpenAI Chat Completions, OpenAI Responses, and Anthropic Messages.
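To illustrate the kind of translation such a gateway performs, here is a minimal sketch that maps an Anthropic Messages-style request body onto the OpenAI Chat Completions shape. This is an assumed mapping for illustration only (Token-Share itself is written in Swift, and its actual mapping covers far more fields); the model name is a placeholder.

```python
def anthropic_to_openai(body: dict) -> dict:
    """Map an Anthropic Messages-style request onto the
    OpenAI Chat Completions shape (illustrative subset only)."""
    messages = []
    # Anthropic carries the system prompt as a top-level "system" field;
    # OpenAI expects it as the first message with role "system".
    if "system" in body:
        messages.append({"role": "system", "content": body["system"]})
    messages.extend(body["messages"])
    return {
        "model": body["model"],
        "messages": messages,
        "max_tokens": body.get("max_tokens"),
    }

request = {
    "model": "example-model",  # placeholder name
    "max_tokens": 256,
    "system": "You are terse.",
    "messages": [{"role": "user", "content": "Hello"}],
}
translated = anthropic_to_openai(request)
```

A real gateway also has to translate in the other direction and rewrite streaming chunks on the fly, but the core job is this kind of field-by-field mapping between protocol shapes.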

How It Works

1
📱 Spot the handy Mac menu bar app

You hear about Token-Share, a simple tool that lets you connect different AI chat services through one local endpoint on your Mac.

2
💾 Download and add to your menu bar

Grab the app file for your Mac, drag it to Applications, and it appears as a small icon up top—ready whenever you need it.

3
🖱️ Click the icon to peek inside

Click the menu bar icon to open a friendly dashboard where you manage your AI connections.

4
⚙️ Set up your first AI service
⚡ Use a ready preset

Choose a popular one like OpenAI or Anthropic to get started quickly.

🔧
Enter custom details

Type in any service's base URL if you have a custom endpoint in mind.

5
🚀 Flip the switch to start

Hit the big button to launch your local bridge—watch it light up green, meaning everything's connected and humming.

6
🔗 Copy your local web links

Copy the ready-made localhost addresses for chat completions, responses, or messages and paste them into your AI apps.

7
🎉 Chat freely, switch anytime

Now your apps talk to any connected service smoothly, and you can swap providers in seconds from the menu bar.
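The local links from step 6 follow standard API paths. A sketch of what those addresses look like, assuming the gateway's default port of 9091 and standard /v1 paths (the /v1/responses path is an assumption based on the OpenAI Responses protocol name):

```python
BASE = "http://localhost:9091"  # token-share's default local address

# Paste whichever endpoint matches the protocol your client app speaks.
ENDPOINTS = {
    "chat": f"{BASE}/v1/chat/completions",  # OpenAI Chat Completions
    "responses": f"{BASE}/v1/responses",    # OpenAI Responses (assumed path)
    "messages": f"{BASE}/v1/messages",      # Anthropic Messages
}
```

Because every address points at the same local gateway, switching the upstream provider in the menu bar changes where requests go without touching the client apps at all.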


AI-Generated Review

What is token-share?

Token-Share is a native macOS menu bar app that acts as a local LLM API gateway on localhost:9091, translating requests and streaming responses between the OpenAI Chat Completions, OpenAI Responses, and Anthropic Messages protocols in any direction. Built in Swift for macOS 14+, it lets you point clients at unified endpoints like /v1/chat/completions or /v1/messages while proxying to upstream providers such as the Anthropic or OpenAI APIs. There's no sign-up or telemetry: you configure API keys and base URLs via a dashboard popover.
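Pointing an existing client at the gateway amounts to changing its base URL. A minimal stdlib sketch of building such a request against the endpoint described above (the model name is a placeholder, since token-share forwards to whatever upstream you've configured):

```python
import json
import urllib.request

# Build (but don't yet send) a Chat Completions request aimed at the local gateway.
payload = {
    "model": "any-upstream-model",  # placeholder; the gateway proxies to the configured upstream
    "messages": [{"role": "user", "content": "ping"}],
}
req = urllib.request.Request(
    "http://localhost:9091/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send it once the gateway is running.
```

The same pattern works with any OpenAI- or Anthropic-compatible SDK: set its base URL to the localhost address and the gateway handles the protocol translation.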

Why is it gaining traction?

It handles real-time SSE streaming for text deltas, tool calls, and interrupts without client changes, plus unlimited upstream configs with on-the-fly switching and auto model list fetching. The menu bar access and fully local operation make it dead simple for quick tests across providers, standing out from clunky proxies or cloud gateways. Multi-provider support covers OpenRouter, local models like Ollama, and custom endpoints seamlessly.
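The streaming mentioned above rides on Server-Sent Events: the server emits `data:` lines, each carrying a JSON chunk with a text delta, and OpenAI-style streams end with a `[DONE]` sentinel. A minimal sketch of consuming such a stream (the chunk shape shown is the OpenAI Chat Completions delta format; tool-call deltas arrive in the same envelope):

```python
import json

def iter_sse_json(lines):
    """Yield the JSON payload of each SSE `data:` line, stopping at [DONE]."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines, comments, and event names
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            return
        yield json.loads(payload)

# Example: two text deltas in OpenAI Chat Completions streaming format.
stream = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    '',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    'data: [DONE]',
]
text = "".join(
    chunk["choices"][0]["delta"].get("content", "")
    for chunk in iter_sse_json(stream)
)
# text == "Hello"
```

A gateway like this one has to re-emit such chunks in the target protocol's own delta format as they arrive, which is why no client-side changes are needed.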

Who should use this?

macOS developers prototyping LLM apps who often swap between the Anthropic and OpenAI APIs, or who need a shared local gateway for chat completions without protocol mismatches. It's ideal for AI tool builders testing streaming UIs locally, or teams bridging protocol differences in client-server setups. Skip it if you're not on macOS or need an enterprise-scale hosted gateway.

Verdict

Grab it if you're on macOS and tired of API protocol headaches. Its early star count (12) reflects its fresh status, but solid docs, an Apache 2.0 license, and clean endpoints make it worth a spin for local LLM workflows. Test thoroughly, as maturity is low.

