elvismdev

Self-hosted mem0 MCP server for Claude Code. Run a complete memory server against self-hosted Qdrant + Neo4j + Ollama while using Claude as the main LLM.

27 stars · 4 forks · 100% credibility
Found Feb 19, 2026 at 12 stars (2x growth)
AI Analysis (Python)

AI Summary

Self-hosted memory server that integrates the mem0 AI memory library with coding assistants, giving them persistent recall across sessions backed entirely by local storage.

How It Works

1. 🔍 Discover persistent memory for your AI helper

You hear about a simple way to make your AI coding buddy remember your preferences and project details across every chat session.

2. 🛠️ Prepare your local helpers

You start a few easy local services on your computer to store memories and create smart summaries, just like setting up a notebook.
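Those helpers are, concretely, Qdrant (vector storage), Neo4j (knowledge graph), and Ollama (local embeddings). A minimal docker-compose sketch, using each project's stock image and default ports; the exact services and settings this repo expects may differ:

```yaml
# Hypothetical docker-compose.yml: stock images and default ports,
# not necessarily the compose file this repo ships.
services:
  qdrant:
    image: qdrant/qdrant
    ports:
      - "6333:6333"    # REST API used for vector search
  neo4j:
    image: neo4j:5
    environment:
      - NEO4J_AUTH=neo4j/change-me   # placeholder credentials
    ports:
      - "7687:7687"    # Bolt protocol for graph queries
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"  # local embedding model server
```

One `docker compose up -d` and all three "helpers" run in the background on your machine.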

3. 🔗 Link your AI subscription

Your setup automatically grabs your existing AI login details so it can think and remember without extra hassle.

4. 🚀 Launch with one easy command

You add a quick note to your project folder or run a single command, and everything springs to life on your own machine.
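That "quick note" is an MCP server entry in your project's .mcp.json. A hedged sketch: the server name, command, args, and env var names here are illustrative guesses, not the repo's documented values:

```json
{
  "mcpServers": {
    "mem0": {
      "command": "uvx",
      "args": ["mem0-mcp-selfhosted"],
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "NEO4J_URL": "bolt://localhost:7687",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
```

Restart your AI chat after adding the file so the new server gets picked up.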

5. 💭 Chat and tell it to remember

Restart your AI chat, say 'Remember I love TypeScript' or 'Search my coding habits', and it pulls up your past notes instantly.

6. 🧠 Explore connected ideas

Optionally turn on relationship mapping to see how your ideas link together, like a mind map of your projects.
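In the real stack that relationship map lives in Neo4j; the idea can be sketched with a toy edge list (the node and relation names below are made up for illustration):

```python
# Toy relationship graph; the actual server stores these links in Neo4j.
edges = [
    ("project-api", "uses", "PostgreSQL"),
    ("PostgreSQL", "accessed_via", "Prisma"),
]

def related_to(node):
    """Return every (subject, relation, object) triple touching a node."""
    return [(s, r, o) for s, r, o in edges if node in (s, o)]

print(related_to("PostgreSQL"))  # both triples touch PostgreSQL
```

Asking "how does my API store data?" then walks these links instead of re-reading every past chat.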

🎉 AI remembers forever

Now your AI starts every session knowing your style, tools, and decisions, saving hours of re-explaining and boosting your productivity.

AI-Generated Review

What is mem0-mcp-selfhosted?

This Python project runs a complete, self-hosted mem0 memory server for Claude Code over the MCP protocol. It pairs self-hosted Qdrant for vector search, Neo4j for knowledge graphs, and Ollama for embeddings, while routing core LLM tasks to Claude. Developers get persistent memory across sessions (add, search, update, or delete facts like "use PostgreSQL with Prisma") without cloud lock-in; installation is a one-liner via uvx or a .mcp.json entry.
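Those memory operations, scoped per user (and, per the repo, per agent or run), can be sketched with a toy in-memory store. This is a conceptual illustration only; the real server backs these calls with mem0, Qdrant, and Neo4j, and does vector similarity rather than substring matching:

```python
from dataclasses import dataclass, field

@dataclass
class MemoryStore:
    """Toy scoped memory store mimicking add/search/update/delete."""
    facts: dict = field(default_factory=dict)  # (user_id, fact_id) -> text
    _next_id: int = 0

    def add(self, text, user_id):
        self._next_id += 1
        self.facts[(user_id, self._next_id)] = text
        return self._next_id

    def search(self, query, user_id):
        # Real mem0 does vector similarity; this is a naive stand-in.
        return [t for (uid, _), t in self.facts.items()
                if uid == user_id and query.lower() in t.lower()]

    def update(self, fact_id, text, user_id):
        self.facts[(user_id, fact_id)] = text

    def delete(self, fact_id, user_id):
        self.facts.pop((user_id, fact_id), None)

store = MemoryStore()
fid = store.add("use PostgreSQL with Prisma", user_id="dev1")
print(store.search("postgresql", user_id="dev1"))  # found for dev1
print(store.search("postgresql", user_id="dev2"))  # empty: scoped per user
```

The scoping keys are the point: two users (or agents, or runs) querying the same store never see each other's facts.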

Why is it gaining traction?

Zero-config auth reuses your Claude OAuth token, skipping API keys, and the server supports Ollama or cheap Gemini for graph ops to cut Claude quota burn. Benchmarks show that split-model routing (Gemini for extraction, Claude for contradiction checks) boosts accuracy while slashing costs 100x versus all-Claude. Docker-ready for self-hosted GitHub Actions runners or Codespaces, it's a lightweight self-hosted alternative to vendor-hosted memory layers.
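That split-model routing can be sketched as a simple task dispatcher; the task labels and model names below are illustrative assumptions, not the repo's actual configuration:

```python
# Conceptual sketch of split-model routing: cheap models for bulk work,
# Claude reserved for accuracy-critical checks.
ROUTES = {
    "fact_extraction": "gemini",       # high-volume, cost-sensitive
    "embedding": "ollama",             # runs locally, effectively free
    "contradiction_check": "claude",   # judgment-heavy, accuracy-critical
}

def pick_model(task):
    """Route a task to its assigned model, defaulting to Claude."""
    return ROUTES.get(task, "claude")

print(pick_model("fact_extraction"))  # gemini
```

The cost win comes from volume: extraction and embedding run on every memory write, while contradiction checks fire far less often.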

Who should use this?

Claude Code users rebuilding context every session, AI agent builders needing scoped memory (per-user/agent/run), or self-hosted GitHub Copilot fans wanting graph-backed recall for coding prefs like "tests with pytest -v". Ideal for solo devs or teams running Ollama/Qdrant/Neo4j stacks avoiding mem0's cloud tier.

Verdict

Grab it if you're deep in Claude Code: the docs shine with quickstarts, tests cover edge cases, and telemetry is off. At 12 stars and 1.0% credibility, it's raw but production-viable for self-hosted setups; watch for breaking changes upstream in mem0ai.


