Win-Hao / ModelLink (Public)

Local proxy for routing third-party API models through Claude Desktop

35 stars · 69% credibility · found May 07, 2026 at 35 stars
AI Analysis · Rust

AI Summary

ModelLink is a free desktop app that connects third-party AI models to the Claude Desktop application through a simple visual setup and local bridging.

How It Works

1. 🔍 Discover ModelLink

You learn about this free tool that adds extra AI models to your Claude Desktop chats.

2. 📥 Download and set up

Pick the build for your Mac or Windows machine and install it by dragging the app into place (macOS) or unzipping the archive (Windows).

3. 🚀 Open the app

Launch ModelLink; its window is ready for you to connect your preferred AI services.

4. 🔗 Link your AI services

Choose a preset such as DeepSeek, or add your own provider by entering its base URL and API key.

5. 🧪 Test the connection

Click Test to confirm the new service responds before you save it.

6. ⚙️ Apply the configuration

Save your setup and click Apply; ModelLink rewrites the Claude Desktop configuration and restarts the app for you.

7. 💬 Chat with new AIs

Open Claude Desktop, pick your added models from the model list, and chat with them as usual.
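The provider entry collected in step 4 can be sketched as a small struct. This is a hypothetical illustration of the idea, not code from the repo; the `Provider` type, its field names, and the `endpoint` helper are all assumptions.

```rust
// Hypothetical sketch of a provider entry like the ones ModelLink's UI
// collects; struct and field names are assumptions for illustration,
// not taken from the actual codebase.
struct Provider {
    name: String,
    base_url: String, // e.g. "https://api.deepseek.com"
    api_key: String,  // stored locally, sent upstream as a bearer token
}

impl Provider {
    /// Join the base URL with an endpoint path, tolerating stray slashes.
    fn endpoint(&self, path: &str) -> String {
        format!(
            "{}/{}",
            self.base_url.trim_end_matches('/'),
            path.trim_start_matches('/')
        )
    }
}

fn main() {
    let deepseek = Provider {
        name: "DeepSeek".to_string(),
        base_url: "https://api.deepseek.com/".to_string(),
        api_key: "sk-...".to_string(),
    };
    println!("{} -> {}", deepseek.name, deepseek.endpoint("/chat/completions"));
    // A real connection test (step 5) would POST a tiny request here,
    // authenticating with api_key.
    let _ = &deepseek.api_key;
}
```

Normalizing slashes in `endpoint` matters because users paste base URLs both with and without a trailing slash.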

AI-Generated Review

What is ModelLink?

ModelLink is a Rust-powered local proxy server that routes Claude Desktop requests to third-party AI models such as DeepSeek, Kimi, or GLM instead of Anthropic's own endpoints. Developers get a tray-resident app with a visual interface to add providers, test connections, and apply configs, with no manual JSON editing. It handles multiple APIs, 1M-context model variants, and request logs, and restarts Claude Desktop automatically after applying a change.
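The core routing idea described above can be sketched in a few lines. This is an assumption about the design, not the repo's actual code: map the model name Claude Desktop sends to an upstream base URL plus API key, and rewrite the outgoing authorization header accordingly.

```rust
use std::collections::HashMap;

// Minimal sketch of the multi-provider routing idea (an assumption, not
// the repo's actual implementation): look up the requested model name,
// then return the upstream URL and the auth header to forward with.
fn route(
    model: &str,
    providers: &HashMap<String, (String, String)>, // model -> (base_url, api_key)
) -> Option<(String, String)> {
    let (base_url, api_key) = providers.get(model)?;
    // Forward to the provider's chat endpoint with its own credentials.
    Some((
        format!("{}/chat/completions", base_url.trim_end_matches('/')),
        format!("Bearer {}", api_key),
    ))
}

fn main() {
    let mut providers = HashMap::new();
    providers.insert(
        "deepseek-chat".to_string(),
        ("https://api.deepseek.com".to_string(), "sk-local".to_string()),
    );
    if let Some((url, auth)) = route("deepseek-chat", &providers) {
        println!("{url}");
        println!("authorization: {auth}");
    }
    // Unknown models fall through so the proxy can return an error upstream.
    assert!(route("unknown-model", &providers).is_none());
}
```

Keeping the API key substitution inside the proxy is what lets Claude Desktop stay unaware of the third-party credentials.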

Why is it gaining traction?

It stands out for its dead-simple setup: pick a preset, test the API live, and push the change to Claude Desktop in one click, plus system-tray persistence and theme syncing. Unlike manual config hacks or full local LLM runners, it proxies cloud APIs through localhost:5678, with multi-provider switching available right in Claude's model picker. Free binaries for macOS and Windows lower the barrier further.
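The one-click apply presumably rewrites Claude Desktop's local configuration to point at the localhost:5678 proxy. A plausible shape for that override, offered purely as an assumption — the actual file and key names are whatever ModelLink writes, and are not documented here:

```json
{
  "apiBaseUrl": "http://localhost:5678",
  "models": ["deepseek-chat", "kimi-k2", "glm-4"]
}
```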

Who should use this?

Claude Desktop users experimenting with cheaper Chinese LLMs, or anyone mixing providers for cost savings. It also suits Mac and Windows developers who want a lightweight local proxy for AI workflows without Docker overhead. Avoid it for production apps: the license is non-commercial.

Verdict

Grab it for personal tinkering: at 35 stars with a solid README it is usable out of the box, though the 69% credibility score flags an early-stage project with no visible tests. A solid hack for Claude power users, but the license blocks team use; watch for forks if it matures.
