win4r

Fork of Martian-Engineering/lossless-claw with CJK-aware token estimation

149 stars · 29 forks · Found Mar 30, 2026
AI Summary (TypeScript)

Enhanced plugin for OpenClaw that losslessly manages long conversation context via smart summarization trees, with fixes for multilingual text and production reliability.

How It Works

1. 🔍 Discover smarter chats: You find a video showing how this add-on keeps AI conversations sharp even after hours of talking.

2. 📥 Add to your AI helper: Download and add the add-on to your OpenClaw setup with a simple command.

3. ⚙️ Tune it up: Tell it how much recent chat to keep fresh and pick a quick AI model for summaries.

4. 💬 Start endless talks: Chat away in OpenClaw; old details get smartly tucked into summaries automatically.

5. 🔎 Recall anything: Ask tools to search history or expand summaries, pulling back exactly what you need.

🎉 Perfect memory: Your AI never forgets, handling huge chats smoothly no matter how long they run.
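The flow above can be sketched as a minimal summarization tree. This is an illustrative model under assumed names (`Node`, `fold`, `keepRecent`), not the plugin's actual data structures: old messages fold into a summary node that keeps its children, so nothing is discarded and any summary can later be expanded back into the originals.

```typescript
// Illustrative sketch, not the plugin's real code: a lossless summary tree.
type Node =
  | { kind: "message"; text: string }
  | { kind: "summary"; text: string; children: Node[] }; // children kept for lossless expansion

// Fold all but the most recent messages into one summary node.
// `summarize` stands in for a call to a fast summary model.
function fold(
  history: Node[],
  keepRecent: number,
  summarize: (nodes: Node[]) => string,
): Node[] {
  if (history.length <= keepRecent) return history;
  const old = history.slice(0, history.length - keepRecent);
  const recent = history.slice(history.length - keepRecent);
  return [{ kind: "summary", text: summarize(old), children: old }, ...recent];
}
```

Because each summary node retains its children, repeated folds build a DAG of summaries-of-summaries while every original message stays reachable for exact recall.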

AI-Generated Review

What is lossless-claw-enhanced?

This TypeScript fork of martian-engineering/lossless-claw delivers the lossless-claw-enhanced plugin for OpenClaw, swapping basic sliding-window truncation for DAG-based summarization that stores every message in SQLite while keeping the active context within model token limits. Developers get CJK-aware token estimation to avoid overflows in Chinese, Japanese, or Korean chats, plus tools like lcm_grep for searching history and lcm_expand_query for sub-agent recall. Install via openclaw plugins install --link for instant updates, configure contextThreshold and summaryModel, then restart the gateway.
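A minimal configuration sketch: the key names contextThreshold and summaryModel come from the review above, but the surrounding shape, value types, and example values are assumptions for illustration, not the plugin's documented schema.

```typescript
// Hypothetical config shape -- only the key names are taken from the review.
interface LcmConfig {
  contextThreshold: number; // assumed: fraction of the model window to keep "fresh"
  summaryModel: string;     // assumed: a fast model used to write summaries
}

const config: LcmConfig = {
  contextThreshold: 0.7,
  summaryModel: "some-fast-model", // placeholder model name
};
```

After editing the config, the review notes you restart the gateway for changes to take effect.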

Why is it gaining traction?

The enhanced fork fixes upstream token estimation that miscounted CJK text by 2-4x, and cherry-picks production bug fixes such as session rotation detection and auth false-positive handling, making it reliable for real workloads where the original falters. Video tutorials on YouTube and Bilibili demo hybrid retrieval and configuration, hooking OpenClaw users tired of lost context. A periodic git pull upstream main keeps the fork synced with the original project.
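Why naive estimators miss by 2-4x on CJK text: a common heuristic divides character count by ~4, which roughly holds for English but not for CJK scripts, where each character typically maps to about one token. A hedged sketch of a CJK-aware estimator (not the fork's actual implementation; the ranges and ratios are illustrative):

```typescript
// Sketch of CJK-aware token estimation, not the fork's real code.
// ASCII-ish text averages roughly 4 characters per token; CJK characters
// tokenize to about one token each, so a flat chars/4 heuristic badly
// undercounts CJK text -- the 2-4x error the fork addresses.
function estimateTokens(text: string): number {
  let cjk = 0;
  let other = 0;
  for (const ch of text) { // iterates code points, not UTF-16 units
    const cp = ch.codePointAt(0)!;
    // Rough CJK ranges: unified ideographs, hiragana/katakana, hangul syllables.
    const isCjk =
      (cp >= 0x4e00 && cp <= 0x9fff) ||
      (cp >= 0x3040 && cp <= 0x30ff) ||
      (cp >= 0xac00 && cp <= 0xd7af);
    if (isCjk) cjk++;
    else other++;
  }
  return cjk + Math.ceil(other / 4);
}
```

With chars/4 alone, a 1,000-character Chinese message would be scored as ~250 tokens when it actually costs closer to 1,000, which is exactly the kind of undercount that overflows the context window.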

Who should use this?

OpenClaw gateway operators running long multilingual agent sessions, especially CJK-heavy support bots or research threads exceeding 128k tokens. Also teams mirroring the repo to GitLab or Bitbucket for custom tweaks who need precise recall without truncation artifacts.

Verdict

Grab this fork if you're on OpenClaw and hitting context limits: it's solid for production, with CJK support and easy upstream merges. At 149 stars the project is still young; the docs shine, but expect light test coverage outside token estimation.
