williamcr01/opencode-tps

A plugin for OpenCode that displays a live Token Per Second (TPS) meter.

100% credibility
Found Apr 07, 2026 at 19 stars
AI Summary (TypeScript)

A utility add-on for the OpenCode terminal AI interface that displays a real-time tokens-per-second meter during AI response streaming.

How It Works

1. 🔍 Discover the speed meter

While using your AI chat tool in the terminal, you learn about a simple add-on that shows how fast the AI generates responses.

2. 📥 Add the speed display

You add the speed meter to your AI chat setup with a single install command or config entry.

3. 🔄 Refresh your chat tool

You restart or update your terminal AI chat, and everything loads up smoothly.

4. 📊 See the live speed counter

A small meter pops up in the corner of your screen, ready to show real-time response speed.

5. 💬 Chat with AI as usual

You ask questions, and as the AI streams its answers, the meter starts tracking the pace.

6. Watch speed update live

The counter updates continuously as tokens arrive, showing tokens per second over a smooth rolling window.

Monitor AI speed effortlessly

Now every chat session lets you see exactly how fast your AI is working, making your experience more insightful and fun.
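The rolling counter described in the steps above can be sketched as a windowed average. This is a hypothetical illustration, not the plugin's actual source; the class name, API, and sample granularity are assumptions, and the window length follows the 5-second rolling window mentioned in the review below.

```typescript
// Hypothetical sketch of a rolling tokens-per-second meter.
// Not the plugin's real code; names and window size are assumptions.
class RollingTpsMeter {
  private samples: { time: number; tokens: number }[] = [];

  constructor(private windowMs = 5000) {}

  // Record a chunk of streamed tokens at the given timestamp (ms).
  record(tokens: number, now: number = Date.now()): void {
    this.samples.push({ time: now, tokens });
    this.prune(now);
  }

  // Tokens per second over the window, or null when idle
  // (the plugin displays "-" in that case).
  tps(now: number = Date.now()): number | null {
    this.prune(now);
    if (this.samples.length === 0) return null;
    const total = this.samples.reduce((sum, s) => sum + s.tokens, 0);
    return total / (this.windowMs / 1000);
  }

  // Drop samples older than the window.
  private prune(now: number): void {
    this.samples = this.samples.filter((s) => now - s.time <= this.windowMs);
  }
}

// Example: 100 tokens streamed within the last 5 s window → 20 tokens/s.
const meter = new RollingTpsMeter();
meter.record(50, 1000);
meter.record(50, 1500);
console.log(meter.tps(1500)); // 20
```

Averaging over a fixed window rather than the instantaneous chunk rate is what makes the displayed number roll smoothly instead of jittering with each token batch.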

AI-Generated Review

What is opencode-tps?

opencode-tps is a TypeScript plugin for OpenCode that adds a live Tokens Per Second (TPS) meter to the terminal UI, tracking AI model generation speed during streaming responses. It shows the metric in the bottom-right corner, updating in real time over a 5-second rolling window, and displays "-" when idle. Install it with the OpenCode plugin CLI (`opencode plugin @williamcr01/opencode-tps`) or by adding it to the plugin list in opencode.json, then run npm install in ~/.opencode. Requires OpenCode 1.3.14+ and works in the TUI only.
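The opencode.json route mentioned above might look like the following. This is a sketch: the `plugin` array shape is an assumption based on OpenCode's config format, so check the plugin README for the exact entry.

```json
{
  "plugin": ["@williamcr01/opencode-tps"]
}
```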

Why is it gaining traction?

Unlike heavier observability setups, this delivers instant visual feedback on model throughput without extra tools, helping users spot slow providers mid-chat. The hook is its simplicity: no marketplace setup or Langfuse-style tracing integration needed, just plug-and-play TPS monitoring in the terminal. Developers grab it for quick benchmarks, especially when comparing providers or tweaking model and auth settings.

Who should use this?

AI prompt engineers using OpenCode TUI who benchmark models like Claude or GPT variants. Terminal-heavy devs debugging slow streams in long sessions, avoiding web UI limitations. Skip if you're on OpenCode web or need GitHub Copilot-style IDE plugins for IntelliJ/Eclipse/VSCode.

Verdict

Grab it if OpenCode TUI is your daily driver: 19 stars and a 100% credibility score signal early-stage but functional MIT-licensed code with a solid README. Low maturity means test it yourself before relying on it in production.
