chu2bard

Token counter and LLM cost estimator CLI

20 stars · 100% credibility
Found Feb 11, 2026 at 20 stars
Language: TypeScript
AI Summary

Tokenmeter is a utility that counts tokens for large language models and estimates processing costs across OpenAI, Anthropic, and Google model providers.

How It Works

1
🧐 Discover the need

You have a prompt or document and want to know how many tokens it occupies in an LLM's context and what it would cost to process.

2
📥 Pick up the tool

You install the CLI on your machine and it's ready to run in moments.

3
📝 Choose your text

Pass in a string directly, or point the tool at a file containing the text you want to measure.

4
🔢 Count the pieces

Run the count command to see how many tokens your text breaks into for a given model.

5
💰 Estimate the bill

Feed those token counts to the cost command for instant USD estimates across popular AI providers.

6
📋 Browse choices

List every supported model to compare providers and pricing side by side.

🎉 Budget like a pro

Now you can plan your AI projects with token counts and estimated costs known up front.
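The counting step above often relies on an approximation when no exact tokenizer is available. As a minimal sketch of that fallback (an illustration of the common ~4-characters-per-token heuristic, not tokenmeter's actual code):

```typescript
// Rough token estimate: English text averages roughly 4 characters per
// token, so many tools fall back to ceil(chars / 4) when no exact
// tokenizer is available. Illustrative sketch, not tokenmeter's source.
const CHARS_PER_TOKEN = 4;

function roughTokenCount(text: string): number {
  return Math.ceil(text.length / CHARS_PER_TOKEN);
}

// Example: this 27-character string estimates to 7 tokens.
console.log(roughTokenCount("hello world, this is a test")); // 7
```

Exact counts require the model's own tokenizer (e.g. tiktoken for OpenAI models); the heuristic is only good for ballpark budgeting.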

AI-Generated Review

What is tokenmeter?

Tokenmeter is a TypeScript CLI and library that counts tokens in text or files for LLM models such as GPT-4o, Claude, and Gemini. It removes the hassle of estimating prompt sizes and API bills by hand: feed it a PDF, README, or raw string via `node dist/cli.js count --file doc.pdf --model gpt-4o` and get tokens, words, characters, and lines back. A companion `cost` command produces USD estimates across providers using real pricing.

Why is it gaining traction?

Unlike browser-based online token counters, this runs locally as a CLI with JSON output and a simple JS API for scripts: `countTokens(text, 'gpt-4o')` or `estimateCost(10000, 2000, 'claude-3-sonnet')`. It lists all supported models instantly, filters by provider or model, and falls back to rough character-based estimates when no exact tokenizer is available. Developers reach for it to batch-check repos or prompts instead of doing manual chars/4 math.
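The cost math behind an `estimateCost`-style call is simple per-million-token arithmetic. A self-contained sketch below mirrors the function shape described above; the pricing figures and the `PRICING` table are illustrative assumptions, not tokenmeter's bundled data:

```typescript
// Hypothetical per-million-token pricing table. The figures are
// illustrative assumptions for this sketch, not tokenmeter's data.
interface ModelPricing {
  inputPerMillion: number;  // USD per 1M input tokens
  outputPerMillion: number; // USD per 1M output tokens
}

const PRICING: Record<string, ModelPricing> = {
  "gpt-4o": { inputPerMillion: 2.5, outputPerMillion: 10 },
  "claude-3-sonnet": { inputPerMillion: 3, outputPerMillion: 15 },
};

// Mirrors the estimateCost(input, output, model) shape described above.
function estimateCost(inputTokens: number, outputTokens: number, model: string): number {
  const p = PRICING[model];
  if (!p) throw new Error(`unknown model: ${model}`);
  return (
    (inputTokens / 1_000_000) * p.inputPerMillion +
    (outputTokens / 1_000_000) * p.outputPerMillion
  );
}

// 10k input + 2k output tokens at the assumed gpt-4o rates:
// 0.01 * $2.50 + 0.002 * $10.00 = $0.045
console.log(estimateCost(10_000, 2_000, "gpt-4o").toFixed(3)); // "0.045"
```

Because prices change frequently, any real tool needs its pricing table refreshed from the providers' published rate cards.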

Who should use this?

AI engineers tuning long-context prompts in production apps, indie developers prototyping chatbots while tracking their OpenAI spend, and anyone batch-counting tokens across PDF datasets before making LLM calls. It also fits CI pipelines that estimate costs before deploying.

Verdict

At 19 stars and 1.0% credibility, tokenmeter is raw: solid for basic CLI token counting, but it lacks polish such as tests and full model coverage; the docs are minimal and installation needs tweaks. Try it for lightweight LLM budgeting if you work in TypeScript, but watch its maturity before trusting it in production.


