rustystack / edgehdf5

Public · Rust

HDF5-backed agent memory for on-device AI

26 stars · 3 forks · 100% credibility
Found Feb 24, 2026 at 19 stars.
AI Summary

A Rust library for creating portable, single-file memory stores that enable fast on-device search for AI agent conversations, embeddings, and knowledge graphs.

How It Works

1
🔍 Discover EdgeHDF5

You hear about a simple way to give your AI assistant a brain that remembers conversations forever on any device.

2
📦 Add it easily

You bring this memory tool into your project with one quick step, no hassle.

3
🗄️ Set up your memory file

You create a single file where your AI can store chats, facts, and learnings.

4
💾 Save and search instantly

You add new memories from chats and find the best matches in a blink, super fast even on your laptop.

5
🧠 Build smarter connections

You link people, projects, and ideas so your AI understands relationships better.

🎉 Your AI remembers everything

Now your assistant recalls past talks perfectly, works offline anywhere, and feels truly smart.
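The steps above can be sketched in plain Rust. Every name here (`MemoryStore`, `remember`, `recall`, `link`) is hypothetical, not EdgeHDF5's real API; the sketch only illustrates the save/search/link flow on in-memory data, with cosine similarity standing in for the library's search.

```rust
use std::collections::HashMap;

// Hypothetical sketch -- illustrative names, not EdgeHDF5's API.
struct Memory {
    text: String,
    embedding: Vec<f32>,
}

struct MemoryStore {
    memories: Vec<Memory>,
    // Tiny knowledge graph: entity -> related entities (step 5).
    links: HashMap<String, Vec<String>>,
}

impl MemoryStore {
    fn new() -> Self {
        MemoryStore { memories: Vec::new(), links: HashMap::new() }
    }

    // Step 4: save a memory alongside its embedding.
    fn remember(&mut self, text: &str, embedding: Vec<f32>) {
        self.memories.push(Memory { text: text.to_string(), embedding });
    }

    // Step 4: find the closest stored memory by cosine similarity.
    fn recall(&self, query: &[f32]) -> Option<&str> {
        self.memories
            .iter()
            .max_by(|a, b| {
                cosine(&a.embedding, query)
                    .partial_cmp(&cosine(&b.embedding, query))
                    .unwrap()
            })
            .map(|m| m.text.as_str())
    }

    // Step 5: link entities so relationships are queryable later.
    fn link(&mut self, from: &str, to: &str) {
        self.links.entry(from.to_string()).or_default().push(to.to_string());
    }
}

fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (na * nb)
}

fn main() {
    let mut store = MemoryStore::new();
    store.remember("Alice leads the HDF5 migration project", vec![1.0, 0.0, 0.2]);
    store.remember("Lunch order: two tacos", vec![0.0, 1.0, 0.0]);
    store.link("Alice", "HDF5 migration");

    let hit = store.recall(&[0.9, 0.1, 0.1]).unwrap();
    println!("best match: {hit}");
}
```

The real library's value is that this store lives in one portable .h5 file rather than in RAM, but the write/search/link lifecycle is the same shape.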


Star Growth

This repo grew from 19 to 26 stars.
AI-Generated Review

What is edgehdf5?

EdgeHDF5 is a Rust library delivering HDF5-backed memory for on-device AI agents. It packs conversations, embeddings, knowledge graphs, and session history into a single portable .h5 file, enabling microsecond-latency searches with adaptive backends like SIMD, BLAS, or GPU. Developers get a framework-agnostic drop-in for persistent agent memory without running databases or networks.
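A minimal sketch of the scan such a library has to make fast: scoring a query against embeddings stored contiguously, the way they would sit in an mmap'd HDF5 float dataset. This is plain scalar Rust; the adaptive SIMD/BLAS/GPU backends mentioned above accelerate exactly this inner loop. `top_k` is an illustrative name, not EdgeHDF5's API.

```rust
// Brute-force top-k dot-product search over vectors packed back to back
// in one flat f32 buffer, as in a contiguous HDF5 dataset.
fn top_k(flat: &[f32], dim: usize, query: &[f32], k: usize) -> Vec<(usize, f32)> {
    let mut scores: Vec<(usize, f32)> = flat
        .chunks_exact(dim)                       // one chunk per stored vector
        .enumerate()
        .map(|(i, v)| (i, v.iter().zip(query).map(|(a, b)| a * b).sum()))
        .collect();
    scores.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap()); // highest score first
    scores.truncate(k);
    scores
}

fn main() {
    // Three 4-dim vectors stored back to back, like one HDF5 dataset.
    let flat = [
        1.0, 0.0, 0.0, 0.0,
        0.0, 1.0, 0.0, 0.0,
        0.7, 0.7, 0.0, 0.0,
    ];
    let hits = top_k(&flat, 4, &[1.0, 0.0, 0.0, 0.0], 2);
    println!("{hits:?}");
}
```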

Why is it gaining traction?

It outperforms SQLite- and Qdrant-based alternatives for edge use: zero daemons, direct mmap I/O on float arrays, and 8x compression via product quantization. Hardware dispatch auto-picks the fastest path (AMX on Apple Silicon hits 157µs for 10K vectors), and a migration CLI ports existing SQLite agent DBs. Single-file snapshots make it dead simple to ship or inspect.
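The 8x figure is what product quantization gives when each byte-sized code replaces two f32s (8 bytes become 1). Below is a toy sketch of the idea with a hand-picked codebook instead of the k-means-trained one a real PQ index uses; none of it is EdgeHDF5's actual implementation.

```rust
// Toy product quantization: split each vector into sub-vectors and store
// only the index of the nearest codebook centroid for each piece.
const SUB_DIM: usize = 2; // one u8 code replaces 2 f32s -> 8x compression

fn dist2(a: &[f32], b: &[f32]) -> f32 {
    a.iter().zip(b).map(|(x, y)| (x - y) * (x - y)).sum()
}

// Index of the centroid closest to one sub-vector.
fn nearest(sub: &[f32], codebook: &[[f32; SUB_DIM]]) -> u8 {
    let mut best = 0u8;
    let mut best_d = f32::INFINITY;
    for (i, c) in codebook.iter().enumerate() {
        let d = dist2(sub, c);
        if d < best_d {
            best_d = d;
            best = i as u8;
        }
    }
    best
}

fn encode(v: &[f32], codebook: &[[f32; SUB_DIM]]) -> Vec<u8> {
    v.chunks_exact(SUB_DIM).map(|sub| nearest(sub, codebook)).collect()
}

fn decode(codes: &[u8], codebook: &[[f32; SUB_DIM]]) -> Vec<f32> {
    codes.iter().flat_map(|&c| codebook[c as usize].to_vec()).collect()
}

fn main() {
    // 4 demo centroids; a real PQ codebook has 256 per subspace.
    let codebook = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]];
    let v = [0.9, 0.1, 0.1, 0.9, 0.0, 0.1, 1.1, 0.9];

    let codes = encode(&v, &codebook);
    let ratio = (v.len() * 4) as f32 / codes.len() as f32; // bytes before / after
    println!("codes={codes:?} ratio={ratio}x approx={:?}", decode(&codes, &codebook));
}
```

Lossy, but good enough for the coarse ranking step of a vector search, which is why it pairs well with mmap'd storage.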

Who should use this?

Rust devs crafting local AI agents on laptops, edge devices, or CI runners. Ideal for custom loops needing embedding storage and hybrid vector+BM25 search without cloud vector DBs. Skip if you're deep in Python ecosystems or scale to millions of vectors.

Verdict

Grab it for on-device agent memory in Rust: the benchmarks and quickstart docs impress, though 17 stars and a 1.0% credibility score at review time signal early days. Test the migration CLI on your SQLite dump; pair it with the Accelerate features on Mac for instant wins, but watch for multi-threading gaps.
