choihyunsus/n2-arachne

Optimize AI workflows with Arachne. It automatically assembles the right code context (Tree, Target, Deps, Semantic) to fit context windows without noise. Built for efficiency and scale.

AI Analysis

Language: JavaScript

AI Summary

Arachne is a local tool that scans codebases to deliver only the most relevant files and sections to AI assistants for better coding help.

How It Works

1
💡 Discover Arachne

You hear about Arachne because your AI helper gets lost in big projects and misses key code parts.

2
📦 Bring it home

You add Arachne to your project with a quick npm install.

3
📁 Show your project

You point Arachne to the folder holding all your code files.

4
🕷️ Magic mapping happens

Arachne scans your entire project and builds a smart web of connections between files.

5
🔗 Link to your AI

You connect Arachne to your AI coding companion so they can team up.

6
🗣️ Ask for help

You describe a problem like 'fix the login delay' and your AI grabs exactly the right code bits.

Spot-on solutions

Your AI now delivers precise fixes fast, saving time and frustration every time.
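Steps 2 and 5 above come down to one config entry. A hypothetical sketch of what that MCP client config might look like: the "mcpServers"/"command"/"args" shape is the standard MCP client format, but the package invocation, "--project" flag, and path here are illustrative placeholders, not the repo's documented setup (check Arachne's README for the real entry).

```json
{
  "mcpServers": {
    "arachne": {
      "command": "npx",
      "args": ["n2-arachne", "--project", "/path/to/your/project"]
    }
  }
}
```

Once the config is in place, the AI client launches the server itself; you just ask questions as in step 6.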

AI-Generated Review

What is n2-arachne?

n2-arachne is a JavaScript tool that optimizes AI workflows by automatically assembling precise code context (project tree, target files, dependencies, and semantic matches) into AI context windows without noise or excess tokens. It indexes your codebase locally with SQLite, handles JS/TS/Python/Rust/Go imports, and delivers curated snippets over the MCP protocol to tools like Claude or Cursor. You install it via npm, add an MCP config pointing at your project directory, and call actions like "assemble" with a query and your active file.
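To make the "handles imports" part concrete, here is a minimal sketch of the idea of import extraction and dependency mapping. This is illustrative only, not Arachne's actual indexer: the regex, function names, and data shapes are assumptions, and the real tool also covers TS/Python/Rust/Go and persists results in SQLite.

```javascript
// Naive scanner for `import ... from '...'`, `import '...'`, and `require('...')`.
const IMPORT_RE = /(?:import\s+(?:.+?\s+from\s+)?|require\()\s*['"]([^'"]+)['"]/g;

function extractDeps(source) {
  const deps = [];
  for (const match of source.matchAll(IMPORT_RE)) {
    deps.push(match[1]); // the captured module specifier
  }
  return deps;
}

function buildDepGraph(files) {
  // files: { "path.js": "source text", ... } -> { "path.js": ["dep", ...] }
  const graph = {};
  for (const [path, source] of Object.entries(files)) {
    graph[path] = extractDeps(source);
  }
  return graph;
}

const graph = buildDepGraph({
  "a.js": "import fs from 'fs';\nconst b = require('./b');",
});
console.log(graph["a.js"]); // [ 'fs', './b' ]
```

A map like this is what lets a context assembler follow dependency chains from a target file instead of dumping the whole repo.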

Why is it gaining traction?

It slashes token usage by up to 99.7% (e.g., 14K tokens down from 4.7M), beating naive file dumps by following dependency chains and using hybrid BM25 plus semantic search (Ollama optional). Incremental indexing runs in milliseconds, backups are built in, and it plugs into the N2 stack for session memory or tool routing, saving real API costs without cloud dependencies or prompt tweaks. Developers report faster, more accurate AI fixes on large repos, such as optimizing GitHub Copilot prompts or data workflows.
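The headline percentage checks out against the example figures quoted above (14K assembled tokens versus a 4.7M-token naive dump):

```javascript
// Verify the claimed reduction using the two token counts from the review.
const fullDump = 4_700_000; // tokens for a naive whole-repo dump
const assembled = 14_000;   // tokens after context assembly

const reductionPct = (1 - assembled / fullDump) * 100;
console.log(reductionPct.toFixed(1) + "% fewer tokens"); // "99.7% fewer tokens"
```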

Who should use this?

Full-stack devs on monorepos debugging cross-file issues, backend teams tracing deps in Node/Python services, or AI agent builders optimizing GitHub Actions/Copilot for code gen. Ideal for anyone whose AI assistant drowns in irrelevant files during refactors or bug hunts.

Verdict

Try it if you're on MCP-compatible AI tools and want to optimize workflows: strong docs, 104 passing tests, and an Apache-2.0 license make it low-risk despite 29 stars and a 1.0% credibility score. Still early; pair it with its ecosystem siblings for production scale.
