mpecan / tokf

Public

Config-driven CLI tool that compresses command output before it reaches an LLM context

93 stars · 8 forks · 100% credibility
Found Feb 20, 2026 at 10 stars (9x growth since)

AI Analysis · Rust
AI Summary

tokf filters verbose outputs from developer tools like Git and Cargo into concise summaries to reduce token usage in AI language model contexts.

How It Works

1
🔍 Discover tokf

You find a CLI helper that condenses noisy command output into short, clear summaries, so your AI assistant gets the signal without wasting context.

2
📥 Get it set up

Install it the way you prefer, via Homebrew or Cargo, and it's ready in seconds.

3
⚡ Magic first try

Run your usual build or push command through tokf and watch a wall of text shrink to one smart line, like '✓ 47 tests passed', saving a large chunk of context.

4
🔗 Link to your AI

Add a simple hook so every command your AI runs is filtered automatically, with no extra steps.

5
📊 Check your wins

Check the dashboard to see tokens saved, often around 80% less clutter, making chats quicker and cheaper.

🎉 Smooth AI coding

Your AI now sees clean, relevant output, reasons more sharply, and your projects move faster with less noise.
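The compression idea behind step 3 can be sketched in a few lines of Rust. This is an illustrative toy, not tokf's actual implementation: it scans verbose `cargo test` output for cargo's summary line and emits a single compressed line.

```rust
// Toy sketch of output compression (not tokf's real code):
// find cargo's "test result" summary line and collapse the
// whole log into one short line for the LLM context.
fn compress_cargo_test(output: &str) -> String {
    for line in output.lines() {
        // cargo prints a summary like:
        // "test result: ok. 47 passed; 0 failed; ... finished in 2.31s"
        if let Some(rest) = line.strip_prefix("test result: ok. ") {
            let passed = rest.split(' ').next().unwrap_or("?");
            let time = line
                .find("finished in ")
                .map(|i| line[i + "finished in ".len()..].trim())
                .unwrap_or("?");
            return format!("✓ {passed} passed ({time})");
        }
    }
    // no summary line recognized: pass the output through unchanged
    output.trim().to_string()
}

fn main() {
    let verbose = "running 47 tests\n\
                   test parse::works ... ok\n\
                   test result: ok. 47 passed; 0 failed; 0 ignored; finished in 2.31s\n";
    println!("{}", compress_cargo_test(verbose));
    // prints: ✓ 47 passed (2.31s)
}
```

The real tool generalizes this with per-command filter configs rather than hard-coded parsing, but the before/after shape is the same: tens of lines in, one line out.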


Star Growth

This repo grew from 10 to 93 stars.
AI-Generated Review

What is tokf?

tokf is a Rust CLI that intercepts verbose output from commands like git push, cargo test, or docker build, then compresses it via config-driven TOML filters before it reaches your LLM context. It slashes token waste from progress bars and boilerplate, turning 60-line cargo test logs into "✓ 47 passed (2.31s)". Run `tokf run git push` or install hooks for seamless Claude Code integration.
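Since filters are declarative TOML, a filter definition might look roughly like the sketch below. The schema shown here (`[filter]`, `command`, `rules`, `match`, `replace`) is invented for illustration and is not tokf's actual format; check the repo's docs for the real field names.

```toml
# Hypothetical filter config (illustrative keys, not tokf's real schema):
# match a command, then rewrite its output down to one summary line.
[filter]
command = "cargo test"

[[filter.rules]]
match = 'test result: ok\. (?<passed>\d+) passed.*finished in (?<time>\S+)'
replace = "✓ ${passed} passed (${time})"
```

The appeal of this style is that a filter is just data: anyone can tweak a regex or share a config file on GitHub without touching the Rust code.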

Why is it gaining traction?

Unlike generic proxies, tokf ships a battle-tested filter library for 50+ dev tools, with declarative TOML configs anyone can tweak or share on GitHub. Token savings tracking via `tokf gain` quantifies your wins, and built-in test suites ensure filters reliably compress output without breaking. The Claude hook auto-filters every bash tool call, saving context in AI coding workflows.
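For the Claude Code integration, hooks are typically registered in `.claude/settings.json`. The shape below follows Claude Code's hooks format, but the `tokf hook` command is a placeholder invented here; tokf's README documents the actual invocation.

```json
{
  "hooks": {
    "PreToolUse": [
      {
        "matcher": "Bash",
        "hooks": [
          { "type": "command", "command": "tokf hook" }
        ]
      }
    ]
  }
}
```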

Who should use this?

DevOps engineers wrangling docker compose logs, Rust or JS devs feeding cargo/pnpm output to LLMs, or anyone building AI agents that run CLI commands. Ideal for Claude Code users hitting context limits on git status or pytest runs.

Verdict

Grab it if you're prototyping LLM agents—install via cargo or brew, tweak filters, and track gains immediately. At 10 stars and 100% credibility, it's early but docs shine and tests cover every filter; productionize with caution until adoption grows.


