adoresever / graph-memory

OpenClaw memory plugin (Knowledge Graph + Memory): a Knowledge Graph Context Engine for OpenClaw that extracts structured triples from conversations, compresses context by 75%, and enables cross-session experience reuse.

27 stars · Found Mar 11, 2026 at 15 stars
AI Summary (TypeScript)

A plugin for OpenClaw AI agents that creates a structured memory graph from conversations to compress context, recall cross-session knowledge, and connect related skills and fixes.

How It Works

1. 🔍 Discover Graph Memory

You learn about a plugin that lets your OpenClaw agent remember past fixes, skills, and lessons across all conversations.

2. 📦 Add to Your Setup

You install the plugin into your OpenClaw setup with a single pnpm command.

3. 🔗 Connect Thinking Power

You configure an LLM and an embedding provider so the agent can build and query its memory graph.

4. 🚀 Switch It On

You enable the memory feature, and the agent starts learning from every chat.

5. 💬 Chat Away

You work through long conversations and tasks with your agent, letting it handle errors and steps naturally.

6. 🧠 Watch It Remember

The agent automatically recalls relevant past solutions, keeping context small and responses sharp even after many exchanges.

7. 📝 Find or Save Knowledge

You use simple search and record commands to look up past lessons or capture new insights on the fly.

🎉 Smarter Companion

Your agent accumulates experience over time, solving recurring problems faster and staying on track.
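The workflow above can be sketched in a few lines of TypeScript. The tool names gm_search and gm_record come from the project's own description; the string-based store and the stub signatures are assumptions made for illustration only.

```typescript
// Illustrative stubs for the agent-facing memory tools (step 7).
// gm_search / gm_record are tool names from the project description;
// the in-memory store and signatures below are assumptions.
const store: string[] = [];

function gm_record(insight: string): void {
  store.push(insight); // step 7: capture a new lesson
}

function gm_search(query: string): string[] {
  // The real plugin uses vector search; substring matching stands in here.
  return store.filter((s) => s.toLowerCase().includes(query.toLowerCase()));
}

gm_record("ERESOLVE build failure fixed by pinning the peer dependency");
console.log(gm_search("eresolve")); // recalls the saved fix
```

In the real plugin these tools would be invoked by the agent itself during a conversation, not called directly by the user.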


Star Growth

The repo grew from 15 to 27 stars since discovery.
AI-Generated Review

What is graph-memory?

Graph-memory is a TypeScript plugin for OpenClaw that builds a knowledge graph from AI agent conversations, extracting structured triples for tasks, skills, and events. It compresses context by 75%, swapping raw chat history for graph nodes and edges, while enabling cross-session reuse of experiences like fixed bugs or workflows. Install it via pnpm, configure LLM and embedding keys in openclaw.json, and it handles ingestion, recall, and agent tools such as gm_search and gm_record.
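The structured-triple idea can be shown with a minimal sketch. In the real plugin an LLM performs the extraction; here the triples are written by hand to show the data shape, and the predicate names are illustrative.

```typescript
// A (subject, predicate, object) triple replaces a slice of raw transcript.
type Triple = { subject: string; predicate: string; object: string };

const rawHistory =
  "User: the build failed with ERESOLVE. Agent: pinning react@18 as a peer " +
  "dependency fixed it. User: great, also remember we deploy with pnpm.";

// Hand-written triples standing in for LLM extraction.
const triples: Triple[] = [
  { subject: "build-failure:ERESOLVE", predicate: "SOLVED_BY", object: "pin react@18 peer dep" },
  { subject: "project", predicate: "DEPLOYS_WITH", object: "pnpm" },
];

// The compressed context is the serialized triples, not the transcript.
const compressed = triples
  .map((t) => `${t.subject} -${t.predicate}-> ${t.object}`)
  .join("\n");

console.log(compressed.length < rawHistory.length); // true for this toy example
```

Even in this toy case the triples are shorter than the transcript; over many turns, dropping filler and keeping only the facts is where the claimed 75% compression would come from.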

Why is it gaining traction?

Unlike summary-based tools, this graph memory for AI agents connects errors to solutions via typed edges (SOLVED_BY, REQUIRES), with personalized ranking that adapts to queries and vector search for semantic matches. Developers report stabilized token usage in long conversations (95K tokens dropping to 24K) and automatic cross-session recall without manual MEMORY.md dumps. Background compaction and maintenance keep the graph clean without blocking chats.
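A toy example of the typed-edge idea, assuming nothing about the plugin's internals beyond the SOLVED_BY and REQUIRES edge types it names. The use-count ranking here is a simple stand-in for the plugin's personalized ranking.

```typescript
// Typed edges connect errors to solutions and solutions to prerequisites.
type EdgeType = "SOLVED_BY" | "REQUIRES";
interface Edge { from: string; type: EdgeType; to: string; uses: number }

const edges: Edge[] = [
  { from: "error:ECONNREFUSED", type: "SOLVED_BY", to: "restart postgres container", uses: 3 },
  { from: "restart postgres container", type: "REQUIRES", to: "docker compose", uses: 1 },
  { from: "error:ECONNREFUSED", type: "SOLVED_BY", to: "check firewall rules", uses: 1 },
];

// Rank candidate solutions for a node: fixes reused more often come first
// (a stand-in for query-adaptive personalized ranking).
function solutionsFor(node: string): string[] {
  return edges
    .filter((e) => e.from === node && e.type === "SOLVED_BY")
    .sort((a, b) => b.uses - a.uses)
    .map((e) => e.to);
}

console.log(solutionsFor("error:ECONNREFUSED"));
// ["restart postgres container", "check firewall rules"]
```

Following a REQUIRES edge from the top-ranked fix would then surface its prerequisite (docker compose) alongside the solution.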

Who should use this?

OpenClaw users building persistent AI agents for devops tasks, like installing graph memory MCP servers or debugging n8n workflows, where repeating fixes wastes tokens. AI devs needing agent graph memory that links skills across sessions, beyond simple RAG or chat history.

Verdict

Solid for OpenClaw tinkerers: excellent docs, 53 tests, MIT license. But at 16 stars and 1.0% credibility it's early and niche. Try it if you're in the ecosystem; skip it for broader LLM setups until adoption grows.
