modus-memory

Personal memory server for AI. One binary, zero cloud, plain markdown.

16 stars · 2 forks · 100% credibility
Found by GitGems on Apr 04, 2026 at 14 stars.
AI Analysis (Go)

AI Summary

modus-memory is a lightweight local server that stores user notes, facts, and preferences in plain markdown files for AI assistants to search and recall across sessions.

How It Works

1. 🔍 Discover a memory helper

You hear about a simple tool that lets your AI assistant remember your preferences, projects, and notes forever, without any cloud services.

2. 📥 Download and launch

Grab the tiny program for your computer and start it up – it creates a private folder for your memories on your own machine.

3. 🔗 Link to your AI chat

Tell your favorite AI app, like Claude or Cursor, where your memory folder lives so it can peek inside anytime.

4. 💭 Share your first memories

Chat with your AI and say 'remember I love TypeScript' or 'note my auth decisions' – it saves them as easy-to-read notes.

5. 🔎 Ask and recall instantly

Later, ask 'what did I decide about projects?' and your AI pulls up exactly what you shared, with smart connections to related ideas.

🧠 AI feels truly personal

Your conversations now build on everything you've shared before, making your AI smarter and more helpful every time.
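As an illustration of the plain-markdown storage the steps above describe, a remembered fact might end up as an entry like this (the file name, headings, and layout are assumptions for illustration, not the project's actual schema):

```
<!-- vault/preferences.md (hypothetical layout) -->
## Preferences
- Loves TypeScript for frontend work

## Decisions
- Auth: chose session cookies over JWTs for the dashboard
```

Because the notes are ordinary markdown, you can read, edit, or git-track them with any tool you already use.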


Star Growth

The repo grew from 14 to 16 stars since discovery.
AI-Generated Review

What is modus-memory?

Modus-memory is a single Go binary that runs a local personal memory server for AI agents, storing notes, facts, and learnings as plain markdown files you can edit or git-track. It solves the "every chat starts from scratch" problem by exposing 11 MCP tools—like vault_search for BM25 queries, memory_store for facts, and vault_connected for cross-refs—letting clients like Claude Desktop or Cursor fetch relevant context instantly. Setup is a brew install plus a config tweak; no cloud, Docker, or databases needed.

Why is it gaining traction?

It crushes cloud memory services on privacy and cost (72x token savings via top-10 relevance pulls) while skipping the setup hell of Python/Docker alternatives. BM25 lookups with query caching return in under 100 microseconds, FSRS decay prunes stale facts automatically, and librarian expansion bridges semantic gaps—all in a 6MB zero-dep binary that indexes 16k docs in 2 seconds. Developers dig the MCP standard for swapping agents without data loss.
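The specific latency figures above come from the review, but the BM25 ranking function itself is standard. A toy single-term sketch (with the common default parameters k1=1.2 and b=0.75; a hypothetical example, not the project's implementation):

```go
package main

import (
	"fmt"
	"math"
)

// bm25 scores one query term against one document.
// tf: term count in the doc; df: docs containing the term; n: corpus size;
// dl, avgdl: this doc's length and the average doc length.
func bm25(tf, df, n int, dl, avgdl float64) float64 {
	k1, b := 1.2, 0.75
	// Rarer terms (small df) get a larger idf weight.
	idf := math.Log(1 + (float64(n)-float64(df)+0.5)/(float64(df)+0.5))
	// Term frequency saturates via k1; b normalizes for doc length.
	return idf * (float64(tf) * (k1 + 1)) /
		(float64(tf) + k1*(1-b+b*dl/avgdl))
}

func main() {
	// A term appearing 3 times in a short doc, rare in a 16k-doc corpus.
	fmt.Printf("%.3f\n", bm25(3, 40, 16000, 120, 300))
}
```

Scoring is a handful of float operations per term-document pair, which is why a cached in-memory index over a few thousand markdown files can answer queries in microseconds.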

Who should use this?

Local AI workflow builders using Claude Code, Cursor, or any MCP client who want a personal memory agent persisting preferences, project context, and decisions across sessions. Ideal for solo devs maintaining a personal memory curator of code snippets, auth patterns, or React hooks without cloud bills or personal GitHub token hassles.

Verdict

Grab it now if you need a lightweight personal memory assistant—an excellent README, one-command installs, and Khoj migration make it instantly useful despite only 16 stars. Early maturity means test it outside production first, but it's a smart bet for local-first AI memory.

