vancelin / openmemory


Share long-term memory across all your AI agents — no manual start/stop needed.

42 stars · 14 forks · 80% credibility
Found Apr 01, 2026 at 42 stars.
AI Analysis
Shell
AI Summary

Shell extension that automatically launches and manages a local memory service to provide persistent context for AI command-line chat tools.

How It Works

1. 🔍 Discover OpenMemory

You hear about this helpful tool that gives your AI chat commands a memory to remember past conversations.

2. 📥 Grab the setup files

You download the simple installer from GitHub to your computer.

3. 🛠️ Run the installer

You run the install script, and it quietly adds small wrapper commands to your shell startup file.

4. 📦 Prepare the memory brain

You follow a few steps to download and set up the main memory service in your home directory.

5. 🔄 Refresh your chat window

You close and reopen your terminal, and everything springs to life behind the scenes.

6. 💬 Start chatting with AI

You type a friendly command like 'claude' or 'codex', and the memory service wakes up automatically to join the conversation.

🧠 Smarter ongoing chats

Now your AI buddy remembers everything from before, making conversations feel continuous and helpful across sessions.
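The wrapper idea behind the steps above can be sketched as a shell function. This is an illustrative sketch only: the function names, the `openmemory-server` command, and the health-check port are assumptions for demonstration, not the project's actual code.

```shell
# Sketch: intercept the `claude` command, make sure a local memory
# service is running, then fall through to the real CLI on PATH.
# All names, paths, and ports here are hypothetical.
ensure_memory_service() {
  # Start the service only if nothing answers on the (assumed) health port.
  if ! curl -s -o /dev/null "http://127.0.0.1:8765/health" 2>/dev/null; then
    nohup openmemory-server >/dev/null 2>&1 &
  fi
}

claude() {
  ensure_memory_service
  command claude "$@"   # `command` skips this function and runs the real binary
}
```

Because the wrapper shadows the CLI's name, you keep typing `claude` as usual and the service management happens behind the scenes.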

AI-Generated Review

What is openmemory?

OpenMemory hooks into your shell to automatically start and stop a shared long-term memory server for AI agents, eliminating manual Docker runs and server babysitting. Run the claude, codex, or gemini CLIs and it transparently manages an MCP endpoint via shell wrappers and a Python proxy; the project is built in Shell with Node.js backend support. Developers get persistent memory across sessions without configuration hassle.

Why is it gaining traction?

It stands out by using reference counting to share one OpenMemory MCP server across multiple terminals, automatically shutting it down when idle, unlike typical mem0/OpenMemory setups that need constant oversight. The hook is seamless integration with AI CLI tools plus an easy install flow. No more forgetting to start the MCP server mid-coding-sprint.
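The reference-counting idea can be sketched roughly as follows. The counter file location and helper names are assumptions for illustration, not the project's actual implementation: each shell increments a counter on start and decrements on exit, and the last one out stops the shared server.

```shell
# Sketch: count how many shells are using the shared server.
# COUNT_FILE and the commented-out stop step are hypothetical.
COUNT_FILE="${TMPDIR:-/tmp}/openmemory.refcount"

ref_up() {
  local n
  n=$(cat "$COUNT_FILE" 2>/dev/null || echo 0)
  echo $((n + 1)) > "$COUNT_FILE"
}

ref_down() {
  local n
  n=$(cat "$COUNT_FILE" 2>/dev/null || echo 1)
  n=$((n - 1))
  echo "$n" > "$COUNT_FILE"
  if [ "$n" -le 0 ]; then
    rm -f "$COUNT_FILE"
    # Last client gone: this is where the shared server would be stopped,
    # e.g. via a stop script or signal (illustrative, not real code).
    :
  fi
}
```

In practice a shell would call `ref_up` when it launches an AI CLI and hook `ref_down` into its exit path (for example via `trap ... EXIT`), so the server lives exactly as long as at least one terminal needs it.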

Who should use this?

AI power users who chain the claude, codex, or gemini CLIs across multiple terminal sessions. Ideal for devs building agents that need shared long-term memory. Skip it if you're not on zsh/bash or want to avoid Node.js dependencies.

Verdict

Early but clever: 42 stars and an 80% credibility score signal prototype maturity, with solid README guidance though light on tests. Worth a shell install for anyone who uses AI CLIs frequently.

