castnettech

State-aware knowledge compression, ingestion, and hybrid retrieval engine. Zero dependencies. Sub-100ms queries.

42 stars · 100% credibility
Found Apr 04, 2026 at 19 stars by GitGems.
AI Summary (Python)

Mnemosyne is a local tool that indexes codebases for fast natural-language search and optimized context for AI assistants.

How It Works

1. 🛠️ Install Mnemosyne

Install the CLI on your machine with a single pip command.

2. 🏠 Prepare your project

From your project root, run Mnemosyne's init command to create its local index store.

3. 📚 Feed it your code

Index the project so Mnemosyne parses and remembers every source file.

4. 🔍 Ask questions naturally

Ask plain-English questions like 'How does login work?' and instantly get back the relevant code.

🎉 Codebase at your fingertips

Now searching your code feels effortless, saving you time and frustration every day.
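The init-ingest-query workflow above can be sketched end to end with nothing but the standard library. This is an illustrative mock of a local full-text index built on SQLite's FTS5; the function names and schema here are assumptions, not Mnemosyne's actual commands or storage layout:

```python
import sqlite3

# Illustrative sketch of a zero-dependency local code index using SQLite's
# built-in FTS5 full-text engine. All names below are hypothetical.

def init_index(path=":memory:"):
    # Step 2: create the local index store.
    con = sqlite3.connect(path)
    con.execute("CREATE VIRTUAL TABLE files USING fts5(path, body)")
    return con

def ingest(con, files):
    # Step 3: feed it (path, source) pairs.
    con.executemany("INSERT INTO files(path, body) VALUES (?, ?)", files)
    con.commit()

def query(con, text):
    # Step 4: natural-language-ish search, best matches first.
    rows = con.execute(
        "SELECT path FROM files WHERE files MATCH ? ORDER BY rank", (text,)
    )
    return [r[0] for r in rows]

con = init_index()
ingest(con, [
    ("auth/login.py", "def login(user, password): verify credentials"),
    ("billing/retry.py", "def retry_payment(order): exponential backoff"),
])
print(query(con, "login"))  # -> ['auth/login.py']
```

The real tool adds ranking signals beyond raw full-text match, but the same "index once, query instantly" shape applies.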

Star Growth

This repo grew from 19 to 42 stars.
AI-Generated Review

What is mnemosyne?

Mnemosyne indexes your codebase into a local SQLite store using Python with zero runtime dependencies, delivering hybrid retrieval via BM25, TF-IDF, symbol matching, and usage frequency for sub-100ms queries. It compresses results AST-aware, cutting token counts by 40-70%, which makes it well suited to feeding precise context to LLMs without cloud APIs. Named for Mnemosyne, the Greek goddess of memory, it handles state-aware knowledge compression across 7 languages, from Python to Rust.
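BM25, one of the retrieval signals named above, fits in a few lines. This is a textbook sketch with simplified whitespace tokenization, shown only to illustrate the signal; it is not Mnemosyne's implementation:

```python
import math

# Textbook BM25 scoring over a tiny in-memory corpus. Tokenization and
# parameters (k1, b) are simplified assumptions for illustration.

def bm25(query, docs, k1=1.5, b=0.75):
    N = len(docs)
    toks = [d.lower().split() for d in docs]
    avgdl = sum(len(t) for t in toks) / N
    scores = [0.0] * N
    for term in query.lower().split():
        df = sum(term in t for t in toks)  # document frequency
        if df == 0:
            continue
        idf = math.log(1 + (N - df + 0.5) / (df + 0.5))
        for i, t in enumerate(toks):
            tf = t.count(term)
            # Length-normalized term-frequency saturation.
            scores[i] += idf * tf * (k1 + 1) / (
                tf + k1 * (1 - b + b * len(t) / avgdl))
    return scores

docs = ["def login user password", "payment retry logic", "parse config file"]
scores = bm25("login password", docs)
print(max(range(len(docs)), key=scores.__getitem__))  # -> 0
```

In a hybrid setup, a score like this would be blended with TF-IDF, symbol-name matches, and usage frequency before ranking.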

Why is it gaining traction?

Zero dependencies mean instant setup with no conflicts, and daemon mode keeps indexes warm for instant hits, beating grep or ripgrep on semantic queries like "how does auth work?" in large repos. Token budgeting ranks results by value per token, cutting LLM context costs by 70% or more, with delta diffs for PRs and an MCP server for seamless Claude Code integration. As codebases keep growing, state-aware retrieval like this keeps context windows manageable.
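The value-per-token budgeting idea can be sketched as a greedy packer. The relevance scores, helper names, and the chars-per-token heuristic below are all assumptions for illustration, not Mnemosyne's actual algorithm:

```python
# Greedy value-per-token packing: fit the most relevant snippets per token
# into a fixed context budget. Everything here is a hypothetical sketch.

def token_len(text):
    return max(1, len(text) // 4)  # rough chars-per-token heuristic

def pack_context(snippets, budget):
    """snippets: list of (relevance, text); returns texts fitting the budget."""
    ranked = sorted(snippets, key=lambda s: s[0] / token_len(s[1]), reverse=True)
    chosen, used = [], 0
    for score, text in ranked:
        cost = token_len(text)
        if used + cost <= budget:
            chosen.append(text)
            used += cost
    return chosen

snippets = [
    (9.0, "def login(u, p): ..."),   # highly relevant, cheap
    (8.0, "x" * 4000),               # relevant but token-hungry
    (3.0, "def helper(): ..."),      # mildly relevant, cheap
]
print(pack_context(snippets, budget=30))
```

Ranking by relevance divided by token cost is what lets a short, on-point snippet beat a long, slightly-more-relevant one, which is where the claimed LLM cost savings come from.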

Who should use this?

Backend devs querying legacy monoliths for migrations or security audits, such as spotting eval() patterns. On-call engineers hunting for "payment retry logic" at 3am, or teams onboarding new members via natural-language search over the codebase. AI agents in Cursor or Aider that need targeted file context without full re-reads.
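The eval() audit mentioned above boils down to an AST walk. This standalone sketch (a hypothetical helper, not Mnemosyne's API) flags direct eval calls by line number:

```python
import ast

# Walk a Python source file's AST and report line numbers of direct
# eval(...) calls, the kind of pattern a security audit would hunt for.

def find_eval_calls(source):
    hits = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id == "eval"):
            hits.append(node.lineno)
    return hits

code = "x = 1\ny = eval('x + 1')\nprint(y)\n"
print(find_eval_calls(code))  # -> [2]
```

An AST-level pass like this catches calls that plain text search would drown in false positives (strings, comments, names like "evaluate").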

Verdict

Worth pip installing for local LLM context optimization if you're on Python 3.11+; the CLI is polished and the docs are solid, with benchmarks included. At 42 stars it's early but stable (AGPL); test it on your own repo before relying on it in production.
