Arkya-AI / ember-mcp

Public

Persistent memory for AI — powered by Shadow-Decay and HESTIA scoring

26 stars · 4 forks · 100% credibility
Found Feb 20, 2026 at 20 stars.
AI Analysis
Python
AI Summary

Ember MCP is a local, privacy-focused memory system for AI coding tools: it persists project context across sessions and automatically prioritizes current information over outdated details.

How It Works

1
🔍 Discover Ember

You learn about Ember, a helpful tool that gives your AI coding buddy a permanent memory of your projects and ideas.

2
📥 Set it up easily

You run one simple command, and Ember finds your AI apps like Claude or Cursor on your computer and connects them automatically.

3
🧠 Ember learns your world

It scans your documents, projects, and chat histories to remember facts, decisions, and tech details from your work.

4
🔄 Restart your AI

You close and reopen your AI chat app, and now it's ready with all your past context loaded.

5
💭 Talk like old friends

In your next chat, the AI recalls your recent changes, like switching libraries, without you explaining again.

🎉 Always up to date

Your AI now feels like a teammate who knows your project's history, suggests fresh ideas, and never forgets what matters.
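The store-and-recall loop described in the steps above can be sketched as a tiny in-memory class. This is a toy illustration only: the class name, methods, and newest-first ranking are assumptions for clarity, not Ember's actual storage engine.

```python
class MemoryStore:
    """Toy sketch of session-persistent memory: store facts, recall newest-first.

    Mimics what tools like ember_store / ember_recall might do conceptually;
    Ember's real storage and ranking are not shown on this page.
    """

    def __init__(self):
        self.facts: list[str] = []

    def store(self, text: str) -> None:
        """Record a fact or decision from the conversation."""
        self.facts.append(text)

    def recall(self, query: str) -> list[str]:
        """Return matching facts, most recent first, so fresh decisions win."""
        return [t for t in reversed(self.facts) if query.lower() in t.lower()]

mem = MemoryStore()
mem.store("Project uses React 17")
mem.store("Decision: migrated from React 17 to React 19")
print(mem.recall("react")[0])  # the migration decision surfaces first
```

Because `recall` walks the facts newest-first, the library-switch example from step 5 comes back before the stale "React 17" entry.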


Star Growth

The repo grew from 20 stars at discovery to 26 stars.
AI-Generated Review

What is ember-mcp?

Ember-mcp is a Python-based MCP server that provides persistent memory for LLM clients such as Claude Desktop, Cursor, and Copilot, storing facts, decisions, and project context locally across sessions. It removes the pain of re-explaining your tech stack or past pivots in every chat: your AI remembers migrations from JWT to OAuth or React 17 to 19, with no cloud dependencies or API keys. It runs zero-config via `ember-mcp init` and `ember-mcp run`, exposing tools like `ember_store` and `ember_recall` to your LLM client.
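For context, MCP clients such as Claude Desktop register local servers through a JSON config file. An entry like the one `ember-mcp init` might write would look roughly like this; the `"ember"` key and argument list are assumptions, only the `mcpServers` shape is the standard client format:

```json
{
  "mcpServers": {
    "ember": {
      "command": "ember-mcp",
      "args": ["run"]
    }
  }
}
```

After restarting the client (step 4 above), the server's tools become available in chat.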

Why is it gaining traction?

Unlike basic vector stores that regurgitate stale data, ember-mcp auto-detects drift within knowledge regions, penalizing outdated memories so your AI suggests the current architecture. Developers are drawn to its fully local privacy (~200MB RAM, no embeddings to configure), cross-client persistence (Claude to Cursor), and a bootstrap scanner that pre-populates memory from git repos or docs. In effect, it is persistent local storage for LLMs that cuts hallucinations in long-running projects.
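The drift-penalty idea can be sketched as recency-decayed scoring: a memory's raw relevance is multiplied by an exponential decay in its age, so stale facts in the same knowledge region rank below fresh ones. The actual Shadow-Decay and HESTIA formulas are not documented on this page; the half-life form below is an illustrative assumption.

```python
import math
import time

def decayed_score(match_score: float, stored_at: float,
                  now: float, half_life_days: float = 30.0) -> float:
    """Weight a raw relevance score by exponential age decay.

    Illustrative assumption only: not Ember's real Shadow-Decay/HESTIA math.
    A memory loses half its weight every `half_life_days`.
    """
    age_days = (now - stored_at) / 86400
    return match_score * math.exp(-math.log(2) * age_days / half_life_days)

now = time.time()
fresh = decayed_score(1.0, now - 1 * 86400, now)    # stored yesterday
stale = decayed_score(1.0, now - 90 * 86400, now)   # stored 90 days ago
print(round(fresh, 3), round(stale, 3))  # → 0.977 0.125
```

With equal raw match scores, the 90-day-old memory scores an eighth of full weight (three half-lives), so the fresh fact wins the ranking.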

Who should use this?

Backend engineers iterating on auth or DB schemas over weeks, frontend devs tracking UI library upgrades, or indie hackers dumping proprietary notes into chats. Ideal for persistent-memory LLM workflows in Claude Code or Cursor, where context spans sessions rather than one-off chats.

Verdict

Try ember-mcp if you're deep in MCP-enabled LLMs and hate context loss: its tools deliver real persistence with drift smarts. At 26 stars and 100% credibility, it's alpha-stage (solid README, no tests visible); prototype locally before betting production on it.


