kapillamba4

MCP server with local vector search for your codebase. Smart indexing, semantic search, Git history — all offline.

Found Feb 23, 2026 at 11 stars.
Language: Python

AI Summary

code-memory is a local tool that indexes codebases, documentation, and git history for precise semantic retrieval to enhance AI coding assistants.

How It Works

1. 🔍 Discover code-memory

You hear about a helpful tool that lets your AI coding buddy understand your entire project without reading every file.

2. 📥 Get it set up

With one simple command, you download and launch the tool on your computer—no complicated steps needed.

3. 📁 Point it to your project

You tell the tool where your code folder is, and it quietly learns everything about your files, functions, and changes over time.

4. Your code comes alive

In just a minute, the tool builds a smart map of your project, ready to answer any question about your code.

5. 🔗 Connect to your AI assistant

You add a quick note in your AI tool's settings (like Claude or your code editor) to use this new helper.

6. 💬 Ask smart questions

Now when you ask your AI 'how does login work?', it pulls up the exact code locations, history, and notes instead of guessing.

7. 🎉 AI gets super helpful

Your AI gives precise answers with the right code snippets, saving time and making your coding faster and easier.
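Step 5's "quick note" is an MCP server entry in the client's settings. Here is a sketch in the Claude Desktop `claude_desktop_config.json` shape, assuming the `uvx code-memory` launch command mentioned in the review; the exact file location and accepted keys depend on your client.

```json
{
  "mcpServers": {
    "code-memory": {
      "command": "uvx",
      "args": ["code-memory"]
    }
  }
}
```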
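Step 3's "learning" phase can be pictured with a toy sketch that extracts function definitions using Python's stdlib `ast` module. This is illustrative only; code-memory's actual indexing pipeline (multi-language AST parsing plus embeddings) is not shown on this page, and the function and field names below are assumptions.

```python
import ast

def index_functions(source: str, path: str) -> list[dict]:
    """Collect every function definition in a Python file.

    A toy stand-in for one step of an indexer: the real tool
    parses many languages and also embeds what it finds.
    """
    tree = ast.parse(source)
    records = []
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            records.append({
                "file": path,
                "name": node.name,
                "line": node.lineno,
                "doc": ast.get_docstring(node) or "",
            })
    return records

sample = '''
def login(user, password):
    """Check credentials and start a session."""
    ...
'''
print(index_functions(sample, "auth.py"))
```

An index built this way maps a question like "where's this function defined?" straight to a file and line number.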


Star Growth

This repo grew from 11 to 20 stars.
AI-Generated Review

What is code-memory?

Code-memory is a Python-based MCP server that indexes your local codebase for semantic search, Git history queries, and doc retrieval—all offline using vector embeddings from sentence-transformers. It solves the "code memory" problem in LLMs where dumping full files wastes tokens and hits context limits, delivering precise snippets via tools like `search_code` for definitions/references, `search_docs` for architecture, and `search_history` for commits/blame. Setup takes 1 minute with `uvx code-memory`, no API keys needed.
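The retrieval idea behind a tool like `search_code` can be sketched with cosine similarity over toy bag-of-words vectors. The real server uses sentence-transformers embeddings; the stand-in function, snippet list, and scoring below are illustrative assumptions only.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": bag-of-words counts. code-memory uses real
    # sentence-transformers vectors; this only illustrates the idea.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

snippets = [
    "auth.py: login checks user credentials and creates a session",
    "charts.py: render_chart draws the dashboard chart",
]

def search_code(query: str) -> str:
    # Hypothetical stand-in for the server's search_code tool.
    return max(snippets, key=lambda s: cosine(embed(query), embed(s)))

print(search_code("how does login work?"))
```

Because only the best-matching snippet is returned, the assistant gets a few relevant lines instead of the whole repo, which is where the token savings come from.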

Why is it gaining traction?

Unlike cloud RAG tools, it runs 100% locally with hybrid BM25+vector search, cutting token use by 50% on large repos while supporting 15+ languages via AST parsing. Seamless MCP integration hooks into Claude Desktop, VS Code (Copilot/Continue), and the Gemini CLI, easing context-management headaches in agent workflows without any API-token setup. Standalone binaries for macOS, Linux, and Windows make it dead simple to adopt.
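The hybrid BM25+vector ranking mentioned above can be pictured as a weighted fusion of a lexical score and a semantic score. The `alpha` weight, the [0, 1] normalization, and the function names below are assumptions for illustration, not code-memory's actual formula.

```python
def hybrid_score(bm25: float, vector: float, alpha: float = 0.5) -> float:
    # Weighted fusion of a lexical (BM25) score and a semantic
    # (vector-similarity) score, both assumed normalized to [0, 1].
    return alpha * bm25 + (1 - alpha) * vector

def rank(candidates):
    # candidates: list of (doc_id, bm25_score, vector_score) tuples.
    return sorted(
        candidates,
        key=lambda c: hybrid_score(c[1], c[2]),
        reverse=True,
    )

docs = [
    ("auth.py:login", 0.9, 0.7),   # strong keyword and semantic match
    ("README.md", 0.2, 0.8),       # semantically related, few keyword hits
    ("utils.py", 0.1, 0.1),        # unrelated
]
print(rank(docs))
```

The lexical term catches exact identifier matches that embeddings can miss, while the vector term catches paraphrased queries; fusing them is what makes hybrid search robust on code.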

Who should use this?

Backend engineers navigating monorepos who ask "where is this function defined?" or "why did this break?" via Git blame. AI tool builders who need a reliable separation of code memory from data memory in local agents. Python teams tired of grep hell or bloated prompts in Claude.

Verdict

Worth a spin for MCP fans—solid docs and quick indexing shine, but 11 stars and 1.0% credibility signal early days; test it on your own repo before relying on it in production. It addresses real pains around code context size constructively.
