jxoesneon

A high-performance, local, offline-first AI memory system built in Rust

Found Apr 10, 2026 at 10 stars.
AI Summary

A local, offline AI memory tool that organizes codebases and conversations into a searchable palace for AI agents to retain long-term context.

How It Works

1. 🔍 Discover MemPalace

You hear about a simple way to give your AI chats a real memory, so it remembers your projects and conversations indefinitely.

2. 📥 Get it ready

Download the tool and launch it on your computer in one easy step.

3. 📁 Point to your stuff

Tell it which folders hold your work projects or chat logs to build your memory palace.

4. 💾 Fill the palace

Watch as it scans and organizes everything into a smart, searchable memory home.

5. 🔗 Connect your AI

Link it to your favorite AI chat app so your assistant can pull memories instantly.

6. 🧠 Ask away

Chat with your AI and watch it recall details from weeks ago.

AI remembers everything

Your AI now acts like it has a real memory, never forgetting the context of your work, making long-running projects feel magical.
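The mine-then-search workflow above can be sketched as a toy in Rust. This is a hypothetical, simplified model (an in-memory keyword index), not the repo's actual implementation, which this page does not show:

```rust
use std::collections::HashMap;

/// One remembered item: where it came from and what it said.
struct Memory {
    source: String,
    text: String,
}

/// A toy "palace": memories plus an inverted index from words to memory ids.
struct Palace {
    memories: Vec<Memory>,
    index: HashMap<String, Vec<usize>>,
}

impl Palace {
    fn new() -> Self {
        Palace { memories: Vec::new(), index: HashMap::new() }
    }

    /// Steps 3-4: ingest a document and index every word it contains.
    fn mine(&mut self, source: &str, text: &str) {
        let id = self.memories.len();
        for word in text.split_whitespace() {
            let ids = self.index.entry(word.to_lowercase()).or_default();
            if ids.last() != Some(&id) {
                ids.push(id); // avoid duplicate hits for repeated words
            }
        }
        self.memories.push(Memory { source: source.into(), text: text.into() });
    }

    /// Step 6: recall every memory containing the query word.
    fn search(&self, query: &str) -> Vec<String> {
        self.index
            .get(&query.to_lowercase())
            .map(|ids| {
                ids.iter()
                    .map(|&i| {
                        let m = &self.memories[i];
                        format!("{}: {}", m.source, m.text)
                    })
                    .collect()
            })
            .unwrap_or_default()
    }
}

fn main() {
    let mut palace = Palace::new();
    palace.mine("notes/async.md", "Prefer tokio for async patterns");
    palace.mine("notes/db.md", "Use sqlx for the database layer");
    println!("{:?}", palace.search("async"));
    // → ["notes/async.md: Prefer tokio for async patterns"]
}
```

The real tool layers semantic (vector) retrieval on top of this kind of lookup; the sketch only captures the shape of the mine/search loop.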


AI-Generated Review

What is mempalace-rs?

mempalace-rs builds a high-performance, local, offline-first AI memory system in Rust, mining codebases and conversations into a structured, searchable "palace" for long-term agent retention. It layers identity, recency, on-demand similarity search, and raw semantic retrieval, compressing data ~30x with AAAK while tracking entity relationships temporally. Users run CLI commands like `mine /path/to/project`, `search "async patterns"`, or `mcp-server` for instant AI integration.
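The layered retrieval described above (recency plus similarity) is a common pattern in agent memory systems. A generic sketch of one way to blend the two signals follows; the weights, decay formula, and function names here are illustrative assumptions, not mempalace-rs's documented scoring:

```rust
/// Blend a similarity score with a recency weight. The 0.7/0.3 split and
/// the exponential half-life decay are assumptions for illustration only.
fn blended_score(similarity: f64, age_hours: f64, half_life_hours: f64) -> f64 {
    // Exponential decay: a memory loses half its recency weight per half-life.
    let recency = 0.5_f64.powf(age_hours / half_life_hours);
    0.7 * similarity + 0.3 * recency
}

fn main() {
    // A fresh, moderately similar memory...
    let fresh = blended_score(0.6, 1.0, 24.0);
    // ...can outrank an old one that is slightly more similar.
    let stale = blended_score(0.7, 240.0, 24.0);
    assert!(fresh > stale);
    println!("fresh={fresh:.3} stale={stale:.3}");
}
```

The point of such a blend is that a week-old exact match should not always beat yesterday's near match when the agent is resuming recent work.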

Why is it gaining traction?

The Rust rewrite claims roughly 10x speedups over the Python original -- file mining drops from 2 min to 15 s, an 8x gain on that benchmark -- while the project reports perfect scores on its 2026 Gold Standard benchmarks for reasoning and 1M+ token persistence. MCP exposes 20 tools for Claude/Cursor/Windsurf, enabling offline vector search without network lag. At a 7.9 MB binary and 50 MB RAM, it's a lightweight enabler for high-performance local LLM setups.
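As a quick sanity check on the quoted timings, 2 min down to 15 s works out to an 8x speedup, a bit under the headline 10x figure:

```rust
/// Speedup from the timings quoted above: 2 min (Python) vs 15 s (Rust).
fn speedup(before_secs: f64, after_secs: f64) -> f64 {
    before_secs / after_secs
}

fn main() {
    let s = speedup(120.0, 15.0);
    println!("{s}x"); // → 8x
}
```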

Who should use this?

AI agent builders needing persistent memory for high-performance local LLMs. Rust developers mining personal repos for semantic queries during long sessions. Teams ditching cloud RAG for offline-first storage in their backend workflows.

Verdict

Promising v0.4 with 197 passing tests and strong docs, but 10 stars and 1.0% credibility signal early maturity -- prototype with it for local experiments, and watch for stability. `cargo install` it if offline AI memory fits your stack.


