leookun

llm model id mapping

10 stars · 89% credibility · Found Apr 09, 2026
AI Analysis
TypeScript
AI Summary

A web tool deployed on Cloudflare Workers that generates custom model configurations for the Cursor AI code editor by proxying requests to OpenAI- or Anthropic-style AI services.

How It Works

1
🕵️ Find the helpful tool

You stumble upon this GitHub project that lets you connect your favorite AI brains to your coding helper called Cursor.

2
🚀 Launch it easily

Click a special button to set up your own copy on the web in seconds, no hassle needed.

3
🌐 Open the welcome page

Visit your new web page and see a simple form ready for your details.

4
Enter your AI details

Type in the web address of your AI service, your private API key, the model name, the API style (OpenAI or Anthropic), and a thinking level if needed.

5
Generate and test

Hit the button to create ready-to-use settings for Cursor or see a test command to try it out right away.

6
📋 Copy and use

Copy the special web link, model nickname, and password straight into your Cursor app.

🎉 AI coding magic unlocked

Your coding assistant now taps into your chosen AI power seamlessly, making your work faster and smarter.
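Under the hood, steps 4 through 6 boil down to packaging your details into the three values Cursor needs. A minimal TypeScript sketch of that packaging, assuming the proxy carries the provider URL as a base64url path segment and hands back the fixed `openai-123` / `claude-123` aliases (the `makeCursorConfig` name and the `/{type}/{base64url}` path layout are illustrative assumptions, not the repo's actual API):

```typescript
// Illustrative sketch only: makeCursorConfig and the /{type}/{base64url}
// path layout are assumptions, not the repo's exact scheme.

function toBase64Url(s: string): string {
  // Node's Buffer; inside a Worker you would use btoa() instead.
  return Buffer.from(s, "utf8")
    .toString("base64")
    .replace(/\+/g, "-")
    .replace(/\//g, "_")
    .replace(/=+$/, "");
}

interface ProviderDetails {
  workerOrigin: string;          // your deployed Worker URL
  providerUrl: string;           // upstream API base URL
  apiKey: string;                // never stored by the proxy
  type: "openai" | "anthropic";  // which API style to speak
}

// Produce the three values from step 6: base URL, model alias, key.
function makeCursorConfig(p: ProviderDetails) {
  return {
    baseUrl: `${p.workerOrigin}/${p.type}/${toBase64Url(p.providerUrl)}`,
    model: p.type === "openai" ? "openai-123" : "claude-123",
    apiKey: p.apiKey,
  };
}

const cfg = makeCursorConfig({
  workerOrigin: "https://my-proxy.example.workers.dev",
  providerUrl: "https://api.openai.com/v1",
  apiKey: "sk-example",
  type: "openai",
});
```

Pasting `cfg.baseUrl`, `cfg.model`, and `cfg.apiKey` into Cursor's model settings mirrors the copy step above.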


AI-Generated Review

What is cf-cursor-model-id?

This TypeScript Cloudflare Worker proxies OpenAI- or Anthropic-style LLM requests for Cursor, mapping real model IDs like gpt-4o-mini to fixed aliases such as openai-123 or claude-123. Hit the root URL for a clean form to enter your provider URL, API key, model ID, and type, then generate Cursor-ready endpoints and test curls instantly, with transparent forwarding that rewrites model IDs and handles OpenAI reasoning levels (low to xhigh, even forced). It solves plugging custom or local LLMs into Cursor without config hassles, via base64url-encoded provider paths.
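The two moves described above, decoding the provider from the path and rewriting the alias, can be sketched in a few lines of TypeScript (`decodeBase64Url` and `rewriteModel` are illustrative names, not the repo's actual exports):

```typescript
// Recover the upstream base URL from a base64url path segment.
// Node's Buffer is used here; inside a Worker you would use atob().
function decodeBase64Url(segment: string): string {
  const b64 = segment.replace(/-/g, "+").replace(/_/g, "/");
  const pad = b64.length % 4 === 0 ? "" : "=".repeat(4 - (b64.length % 4));
  return Buffer.from(b64 + pad, "base64").toString("utf8");
}

// Swap the fixed alias back to the real model ID before forwarding
// the OpenAI-style JSON body upstream.
function rewriteModel(
  body: { model: string; [k: string]: unknown },
  alias: string,
  realModel: string,
) {
  return body.model === alias ? { ...body, model: realModel } : body;
}

const upstream = decodeBase64Url("aHR0cHM6Ly9hcGkub3BlbmFpLmNvbS92MQ");
// upstream is "https://api.openai.com/v1"
const forwarded = rewriteModel(
  { model: "openai-123", messages: [] },
  "openai-123",
  "gpt-4o-mini",
);
```

Because the upstream URL travels in the path and the key travels in the client's own header, the Worker itself stays stateless, which is what lets it forward transparently without storing anything.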

Why is it gaining traction?

One-click Cloudflare deploy from GitHub beats manual proxy setups, and the UI spits out exact Cursor configs plus curl tests, so there's no guessing at endpoints. It can alias any upstream model, stores no API keys for security, and pins the Anthropic API version. Devs also like the npm dev/deploy flow for quick local tweaks.
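The reasoning-level handling the review mentions can be pictured as a small body transform. A hedged sketch, assuming the proxy injects an OpenAI-style `reasoning_effort` field and that "forced" means overriding whatever the client sent (the field name and the forced semantics are assumptions, not the repo's exact code):

```typescript
type Level = "low" | "medium" | "high" | "xhigh";

// Inject a reasoning level into an OpenAI-style request body.
// If forced, override the client's value; otherwise only fill a gap.
function applyReasoning(
  body: Record<string, unknown>,
  level: Level | undefined,
  forced: boolean,
): Record<string, unknown> {
  if (level === undefined) return body;
  if (forced || body.reasoning_effort === undefined) {
    return { ...body, reasoning_effort: level };
  }
  return body;
}
```

Doing this proxy-side means Cursor never needs to know the level exists; the same alias works whether or not the upstream model supports reasoning controls.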

Who should use this?

Cursor users routing traffic to cheaper providers, local models, or benchmark models beyond the official APIs. LLM tinkerers building Copilot-style alternatives or trying out new models in Cursor. Teams that want to compare models without vendor lock-in.

Verdict

Grab it if you're deep into Cursor and custom LLMs: solid docs and tests make it production-ready, even though 10 stars signals early maturity. Fork and deploy; it'll save setup time until more established options mature.
