LunkiBR/n8n-mcp-lite

Token-optimized MCP Server for n8n workflows with Focus Mode

Found Feb 28, 2026 at 20 stars.
AI Summary

A streamlined helper that enables AI assistants to manage n8n automation workflows with drastically reduced data size, built-in safety checks, automatic backups, and a knowledge library of workflow building blocks.

How It Works

1. 🔍 Discover the helper

You find a smart companion that lets your AI buddy handle your daily task automations more efficiently and safely.

2. 📱 Add to your AI app

Open your AI chat app's settings and simply add this helper by pointing it to where your automations live.

3. 🔗 Link your automations

Connect with a secure passcode so your AI can peek at and tweak your task flows without any hassle.
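Steps 2 and 3 usually come down to a single entry in the AI app's MCP config file. A minimal sketch, assuming the server is published as an npm package named `n8n-mcp-lite` and reads the n8n URL and API key from environment variables (the package name and env var names are assumptions, not confirmed by the repo docs):

```typescript
// Hypothetical MCP host configuration for Claude Desktop / Cursor.
// Package name and env var names are illustrative assumptions.
const mcpConfig = {
  mcpServers: {
    "n8n-mcp-lite": {
      command: "npx",
      args: ["-y", "n8n-mcp-lite"], // assumed package name
      env: {
        N8N_API_URL: "https://your-n8n.example.com/api/v1", // where your automations live
        N8N_API_KEY: "<your n8n API key>", // the "secure passcode"
      },
    },
  },
};

console.log(Object.keys(mcpConfig.mcpServers)); // one server entry
```

Most MCP hosts accept exactly this JSON shape in their settings file; only the file location differs between apps.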

4. 💬 Chat and explore

Ask your AI to show your task lists, zoom into busy parts, or whip up new flows using smart suggestions.

5. 🛡️ Edit with safety net

Your AI previews changes, spots issues, and saves a quick backup before applying any update.
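The validate-backup-apply loop in step 5 can be sketched like this (the type and function names are illustrative, not the server's actual API):

```typescript
// Minimal sketch of a "safety net" update: block bad mutations up front,
// snapshot the current workflow for rollback, only then apply the change.
type Workflow = { id: string; nodes: { name: string; type: string }[] };

const backups = new Map<string, Workflow>(); // rollback store, keyed by workflow id

function applyWithSafetyNet(
  current: Workflow,
  proposed: Workflow,
  validate: (wf: Workflow) => string[] // returns a list of issues, empty if clean
): Workflow {
  const issues = validate(proposed);
  if (issues.length > 0) {
    throw new Error(`Blocked update: ${issues.join("; ")}`); // preflight check
  }
  backups.set(current.id, structuredClone(current)); // quick backup first
  return proposed; // the update is applied only after backup succeeds
}
```

The key design point is ordering: validation happens before the backup, and the backup before the mutation, so a failure at any stage leaves the live workflow untouched.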

🎉 Automations alive

Your task automations now run perfectly, built and tuned by AI, saving you tons of time and effort.


AI-Generated Review

What is n8n-mcp-lite?

n8n-mcp-lite is a TypeScript MCP server that connects AI tools like Claude or Cursor to n8n workflows, slashing token usage by 80%+ through simplified JSON payloads. It strips bloat like positions, duplicates, and empty defaults from n8n's verbose exports, while adding focus mode to zoom into workflow sections without loading everything. Developers get a lean bridge for AI-driven n8n edits, with tools for scanning, versioning, and testing nodes.
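The kind of payload slimming described above can be illustrated with a small function that drops node positions and empty defaults from an n8n export. The field names (`position`, `parameters`) follow n8n's workflow JSON format, but the function itself is a sketch, not the repo's actual code:

```typescript
// Illustrative payload slimming: strip layout data and empty defaults
// from an n8n node before handing it to the AI.
type N8nNode = {
  name: string;
  type: string;
  position?: [number, number]; // pure layout info, useless to an LLM
  parameters?: Record<string, unknown>;
};

function slimNode(node: N8nNode): Omit<N8nNode, "position"> {
  const { position, parameters, ...rest } = node;
  // Keep only parameters that carry information (drop "", undefined, and {}).
  const kept = Object.fromEntries(
    Object.entries(parameters ?? {}).filter(
      ([, v]) =>
        v !== "" &&
        v !== undefined &&
        !(typeof v === "object" && v !== null && Object.keys(v).length === 0)
    )
  );
  return { ...rest, ...(Object.keys(kept).length ? { parameters: kept } : {}) };
}
```

Applied across a 78-node export, this sort of pruning is where most of the claimed token savings would come from: positions and boilerplate defaults dominate n8n's verbose JSON.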

Why is it gaining traction?

Unlike the standard n8n MCP server, which balloons to 600k+ tokens on 78-node workflows, this lite version delivers 16k tokens max, plus smart summaries, ghost payload hints from executions, and a 1,236-node knowledge DB for suggestions. Security preflight blocks bad mutations upfront, auto-versioning enables rollbacks, and dry-run node testing avoids production risks. The hook: AI agents finally handle complex n8n workflows without token limits killing context.
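Focus mode can be modeled roughly as a view over the workflow: full detail for the section being edited, a one-line summary for everything else, so the AI never loads all 78 nodes at once. All names here are illustrative, not the server's API:

```typescript
// Rough model of "focus mode": detailed nodes for one section,
// cheap one-line summaries for the rest of the workflow.
type WorkflowNode = { name: string; type: string; section: string; config: unknown };

function focus(nodes: WorkflowNode[], section: string) {
  const detailed = nodes.filter((n) => n.section === section);
  const rest = nodes.filter((n) => n.section !== section);
  return {
    detailed, // full config, editable by the AI
    summary: rest.map((n) => `${n.name} (${n.type})`), // context without the cost
  };
}
```

The summaries keep enough context for the AI to reason about connections into and out of the focused section without paying for every node's full configuration.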

Who should use this?

n8n power users building AI agents or automations in Claude Desktop, Cursor, or Antigravity, especially those wrestling large workflows with 30+ nodes, switches, or LangChain tools. Ideal for ops engineers iterating via chat, webhook integrators needing payload schemas, or devs prototyping without full exports.

Verdict

Try it if you're gluing n8n to AI: solid docs, benchmarks, and 26 tools make it production-ready despite its small star count and early-stage track record. Early maturity means watching for edge cases, but the token savings alone justify a spin-up via npx.

