fajarhide / omni


The Semantic Core for Agentic AI: reduces 30–90% of token noise from CLI output. An MCP server distillation engine powered by Zig + Wasm.

10 · 1 · 89% credibility
Found Mar 19, 2026 at 10 stars
AI Analysis
Zig
AI Summary

OMNI refines noisy tool outputs from commands like git diffs or docker builds into concise, high-value summaries for AI agents to improve reasoning efficiency.

How It Works

1
🔍 Discover OMNI

You hear about a helpful tool that cleans up messy computer reports so your AI helper can focus on the important stuff without wasting time.

2
📥 Get it set up

You run a simple one-line command to add OMNI to your computer, and it handles everything automatically.

3
🔗 Link to your AI

You tell your AI helper to use OMNI with a quick setup command, and now it knows to clean up outputs on its own.
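
Linking a local MCP server to Claude Code is typically done with a `.mcp.json` entry in your project. A minimal sketch, assuming omni exposes a `serve` subcommand for MCP mode (the exact command and arguments are assumptions, not confirmed by the repo):

```json
{
  "mcpServers": {
    "omni": {
      "command": "omni",
      "args": ["serve"]
    }
  }
}
```

Once registered, the agent can call the server's tools (such as omni_execute or omni_read_file) instead of reading raw command output.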

4
Run your commands

Pipe any everyday task like checking code changes or building apps through OMNI, and instantly get short, smart summaries.

5
📊 See your savings

Check the dashboard anytime to watch how much time and effort OMNI is saving you with clear charts.

🎉 Smarter AI every day

Your AI helper now reasons faster with pure info, uses fewer resources, and helps you get things done quicker.
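
The workflow above boils down to: run a command, strip low-signal lines, and measure the savings. A toy Python sketch of that idea (this is illustrative only, not omni's actual Zig implementation; the noise patterns are invented examples):

```python
import re

# Illustrative noise patterns: build-step banners, layer ids, cleanup chatter.
NOISE = [
    re.compile(r"^Step \d+/\d+ :"),        # docker build step banners
    re.compile(r"^ ---> [0-9a-f]{12}$"),   # intermediate layer ids
    re.compile(r"^Removing intermediate container"),
]

def distill(output: str) -> str:
    """Keep only lines that match no noise pattern."""
    kept = [ln for ln in output.splitlines()
            if not any(p.match(ln) for p in NOISE)]
    return "\n".join(kept)

def savings(before: str, after: str) -> float:
    """Rough token proxy: whitespace-separated words."""
    b, a = len(before.split()), len(after.split())
    return 100 * (b - a) / b if b else 0.0

raw = """Step 1/4 : FROM alpine
 ---> abcdef123456
Removing intermediate container 42
error: package 'foo' not found
Step 2/4 : RUN make"""
clean = distill(raw)
print(clean)                               # only the actionable error survives
print(f"{savings(raw, clean):.0f}% fewer tokens")
```

Here five noisy build lines collapse to the single actionable error, cutting roughly three quarters of the tokens — the same shape of savings the dashboard reports.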

AI-Generated Review

What is omni?

Omni distills noisy CLI outputs from tools like git, docker, or npm into high-density signals for AI agents, cutting 30-90% token waste with zero semantic loss. Written in Zig with Wasm for under 1ms latency, it runs as an MCP server that Claude Code or Antigravity can query via tools like omni_execute or omni_read_file. Just pipe commands—omni -- docker build .—or use subcommands like distill and monitor for instant cleanup.

Why is it gaining traction?

It beats regex scrapers by semantically rewriting outputs based on confidence scores, delivering cleaner LLM context that boosts reasoning. Standout CLI features like density analysis, bench for throughput, and generate for agent configs make onboarding dead simple, while local metrics track your savings. Its MCP-first design hooks devs tired of bloated agent prompts.
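
Confidence-scored rewriting can be pictured as scoring each line and keeping only those above a threshold. A minimal sketch, assuming a per-line scoring model (the heuristic below is invented for illustration and is not omni's scoring):

```python
def score(line: str) -> float:
    """Toy confidence: diagnostics are high-signal, decoration is not."""
    line = line.strip().lower()
    if not line or set(line) <= {"-", "=", "*", " "}:
        return 0.0            # blank lines and separator rules
    if any(k in line for k in ("error", "warn", "failed", "fatal")):
        return 1.0            # actionable diagnostics
    return 0.4                # ordinary progress output

def rewrite(output: str, threshold: float = 0.5) -> str:
    """Keep only lines whose confidence clears the threshold."""
    return "\n".join(ln for ln in output.splitlines()
                     if score(ln) >= threshold)

log = "=======\nnpm WARN deprecated left-pad\ncompiling...\nERROR: build failed\n======="
print(rewrite(log))           # separators and progress chatter are dropped
```

Unlike a fixed regex scraper, a score-then-threshold design degrades gracefully: unfamiliar output defaults to a middling score rather than slipping through or being silently eaten.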

Who should use this?

DevOps engineers feeding kubectl or terraform outputs to AI for triage, or full-stack devs prototyping Claude-powered code reviews where git diffs explode token counts. Ideal for Antigravity users, or anyone with agentic workflows hitting context limits on verbose builds. Skip it if you're not piping shell output to LLMs daily.

Verdict

Test it for MCP agents: solid token wins and Zig speed make it a smart proxy layer, especially with Homebrew taps and setup guides. At 10 stars, its pre-1.0 rawness shows in sparse tests, but the CLI shines for quick pilots.

