knowsuchagency/mcp2cli

Turn any MCP server or OpenAPI spec into a CLI — at runtime, with zero codegen

Python · 704 stars (found Mar 09, 2026 at 341 stars)

AI Summary
mcp2cli transforms web service descriptions (OpenAPI specs) and AI tool servers (MCP) into simple, on-demand commands you can run right away.

How It Works

1
📖 Discover mcp2cli

You learn about a handy tool that lets AI assistants interact with services without wasting time or tokens on details they don't need.

2
🛠️ Set it up on your computer

With a single install step (sketched below), mcp2cli is on your machine and ready to use anytime.
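
A minimal install sketch, assuming the tool ships on PyPI under the name mcp2cli (the package name is an assumption here; the repo README is authoritative):

```bash
# Assumes a PyPI package named "mcp2cli" -- check the repo README
# for the canonical install command.
pip install mcp2cli

# Confirm the CLI is available (--help is mentioned in the review below)
mcp2cli --help
```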

3
🔗 Connect to a service

You point mcp2cli at the web service or tool server you want to use by giving it its address.

4
Explore what you can do

It shows you a simple list of every action the service offers, like a menu of options (see the sketch below).
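
For example, discovery is a single flag; the server URL below is a placeholder, while --mcp and --list come from the review further down:

```bash
# Point mcp2cli at an MCP server and list every tool it exposes
mcp2cli --mcp https://example.com/mcp --list
```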

5
🎯 Pick and run an action

You choose an action, supply any needed inputs, and run it to get what you need.
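
Running an action looks like the example from the review below; the search tool and its --query flag are illustrative:

```bash
# Each tool becomes a subcommand; its inputs become flags
mcp2cli --mcp https://example.com/mcp search --query "test"
```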

6
See smart, fast results

Your request goes through quickly, and you get back clean, easy-to-read information without extras.

🎉 Save time and effort

Now your AI assistant and projects run more smoothly and cheaply, discovering just what they need when they need it.


AI-Generated Review

What is mcp2cli?

mcp2cli is a Python CLI tool that turns any MCP server—over HTTP/SSE or stdio—or OpenAPI spec into a dynamic command-line interface at runtime, with zero codegen. It instantly generates subcommands from tool schemas or API endpoints, so you can list tools with `--list`, call them like `mcp2cli --mcp url search --query test`, or pipe JSON bodies via stdin. It solves LLM tool sprawl by slashing the 96-99% token waste from injecting full schemas every turn, using compact `--list` and `--help` discovery instead.
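
A sketch of that discovery-first workflow, using a placeholder server URL (the exact shape of the stdin form is an assumption beyond what the review states):

```bash
# Compact discovery instead of injecting full schemas every turn
mcp2cli --mcp https://example.com/mcp --list
mcp2cli --mcp https://example.com/mcp search --help

# Call a tool with flags, or pipe a JSON body via stdin
mcp2cli --mcp https://example.com/mcp search --query "test"
echo '{"query": "test"}' | mcp2cli --mcp https://example.com/mcp search
```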

Why is it gaining traction?

No static codegen means new endpoints appear on the next run, unlike generators that need rebuilds—perfect for evolving APIs. It handles auth headers, env vars, caching with TTL, and outputs like pretty JSON, raw text, or token-efficient TOON for LLMs, working with any model via shell calls. Quantified savings (e.g., 99% on 120-tool platforms over 25 turns) hook AI devs tired of context bloat, plus it ships an installable skill for agents like Claude or Cursor.
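
A hypothetical invocation combining those features; the --header and --output flag names are guesses based on the capabilities listed above, not documented options:

```bash
# Flag names below are assumptions -- run `mcp2cli --help` for the real ones
mcp2cli --mcp https://example.com/mcp \
  --header "Authorization: Bearer $API_TOKEN" \
  --output toon \
  search --query "test"
```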

Who should use this?

AI agent builders chaining multiple MCP servers for tasks like file ops or DB queries. API consumers wanting a quick CLI for OpenAPI specs without SDKs, such as querying petstore endpoints or enterprise REST APIs. Devs turning GitHub repos into prompts or diagrams via exposed APIs, skipping boilerplate.
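
For the OpenAPI case, a sketch might look like the following; only --mcp appears elsewhere on this page, so the --openapi flag name is an assumption, and the petstore spec URL is purely illustrative:

```bash
# --openapi is an assumed flag name; the spec URL is the public petstore example
mcp2cli --openapi https://petstore3.swagger.io/api/v3/openapi.json --list
```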

Verdict

Try it for MCP/OpenAPI-heavy agent workflows—solid docs, 96 tests, and real token benchmarks make it production-ready despite the project's youth. Early maturity means you should watch for edge cases in complex schemas, but the runtime magic delivers immediate value.


