RNA4219 / memx-core

Local-first personal memory & knowledge store for LLM agents. A personal memory/knowledge store for local LLMs, organized in four layers: short / chronicle / memopedia / archive.

Found Mar 08, 2026 at 19 stars.
AI Analysis

Language: Go

AI Summary

memx-core provides a lightweight local memory system for AI agents using simple file-based stores for notes, logs, knowledge, and archives with easy search and AI integration.

How It Works

1. 💡 Discover memx-core: You find a simple tool that gives your local AI a reliable memory, like notes for conversations and knowledge.

2. 📥 Get the tool ready: Download and launch the easy command-line helper that works with your computer's files.

3. 📝 Save your first memories: Quickly jot down short notes, daily progress logs, or key facts using simple commands.

4. 🔍 Search and remember everything: Ask for any note across all your memories and get instant matches with summaries.

5. 🤖 Connect your AI helper: Link it to your AI so it can automatically summarize and tag notes for you.

6. 🧹 Keep things tidy: Run cleanup to archive old notes safely, keeping your memory fresh and organized.

🎉 Your AI remembers forever: Now your assistant recalls conversations, progress, and knowledge perfectly every time.

Star Growth: grew from 19 to 20 stars.
AI-Generated Review

What is memx-core?

memx-core is a local-first personal memory and knowledge store built in Go for LLM agents running offline. It organizes data across four layers (short-term notes, chronicle-style journals, a memopedia-like knowledge base, and a long-term archive) for fast access and durable storage. Developers get a CLI tool (`mem in/out`) and an HTTP API to ingest, search, recall, summarize, and resolve typed references across stores without a cloud dependency.

Why is it gaining traction?

Its local-first design keeps sensitive agent data private and fast, avoiding the API costs and latency of remote vector DBs. Multi-layer retrieval (e.g., quick short-store searches fall back to the archive) plus LLM integration for auto-summaries makes agent context efficient. Go's speed ensures sub-100ms queries even on modest hardware, per built-in benchmarks.
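
The fallback idea can be sketched as trying each layer in order and stopping at the first hit; the `Layer` interface and the tier contents below are invented for illustration, not memx-core's actual types.

```go
package main

import (
	"fmt"
	"strings"
)

// Layer is any searchable memory tier; the names mirror the four tiers
// described above, but the interface itself is an assumption.
type Layer interface {
	Name() string
	Search(query string) []string
}

// mapLayer is an in-memory stand-in for a file-backed tier.
type mapLayer struct {
	name  string
	notes map[string]string
}

func (m mapLayer) Name() string { return m.name }

func (m mapLayer) Search(q string) []string {
	var hits []string
	for k, v := range m.notes {
		if strings.Contains(v, q) {
			hits = append(hits, k)
		}
	}
	return hits
}

// searchWithFallback tries each layer in order and stops at the first hit,
// matching the "quick short search falls back to archive" idea.
func searchWithFallback(q string, layers ...Layer) (string, []string) {
	for _, l := range layers {
		if hits := l.Search(q); len(hits) > 0 {
			return l.Name(), hits
		}
	}
	return "", nil
}

func main() {
	short := mapLayer{"short", map[string]string{"today": "debug session notes"}}
	archive := mapLayer{"archive", map[string]string{"2024-01": "vector DB latency benchmarks"}}
	layer, hits := searchWithFallback("latency", short, archive)
	fmt.Println(layer, hits) // prints archive [2024-01]
}
```

Ordering the tiers from hottest to coldest means the common case (a recent note) never pays the cost of scanning the archive.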

Who should use this?

LLM agent builders needing offline, personal knowledge stores for RAG pipelines. Devs prototyping local-first AI assistants or personal CRMs with Go backends. Teams handling chronicle data like dev journals or memopedia glossaries without vendor lock-in.

Verdict

Promising core for local-first agent memory, but the 1.0% credibility score and 16 stars signal an early-stage project: solid tests and perf metrics, but sparse docs. Try it for prototypes; watch for ecosystem growth before production.
