jillesme

Save 50% on tokens 😱

Found Apr 24, 2026 at 12 stars.
AI Analysis
TypeScript
AI Summary

A demo web app for submitting user feedback that an AI classifies by category and priority and summarizes, with a choice between fast individual processing and cheaper batched processing.

How It Works

1
🌐 Discover the demo

You find a simple web page that lets you test classifying customer feedback with Anthropic's Claude.

2
📝 Fill out the form

Choose where the feedback came from, type the message, and pick fast or money-saving mode.

3
Pick your processing style
🚀
Quick mode

Get results back in seconds, at full token price.

💰
Group mode

Save 50% on tokens by bundling your request into a batch; results arrive in minutes.

4
📤 Submit and track

Send it off and watch the status update live on the page.

5
🔍 View the analysis

See the AI's breakdown: a category such as bug or praise, a priority, and a short summary.

🎉 Feedback organized

Your customer messages are now neatly sorted and actionable for your team.
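The form flow above can be sketched as a small payload builder. The field names here are assumptions for illustration; only the source values (like web_app or app_store) and the two processing modes come from this page.

```typescript
// Hypothetical shape of the payload the demo form would send.
// Field names are assumptions, not the repo's actual schema.
type FeedbackMode = "sync" | "batch";

interface FeedbackPayload {
  source: "web_app" | "app_store" | "sales_call";
  message: string;
  mode: FeedbackMode;
}

function buildFeedbackPayload(
  source: FeedbackPayload["source"],
  message: string,
  saveMoney: boolean,
): FeedbackPayload {
  // "Group mode" in the walkthrough maps to batch processing;
  // "Quick mode" maps to synchronous processing.
  return { source, message, mode: saveMoney ? "batch" : "sync" };
}

const payload = buildFeedbackPayload("web_app", "Dark mode please!", true);
console.log(JSON.stringify(payload));
```

Flipping the third argument is all it takes to switch between the quick and money-saving paths.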


AI-Generated Review

What is cloudflare-queue-batch-api?

This TypeScript Cloudflare Worker receives user feedback via POST /feedback, queues it, and uses Anthropic's Claude to classify it into categories like bug or praise, along with a priority and a short summary. Batch mode submits jobs to Anthropic's discounted Message Batches API for 50% off token costs, while sync mode delivers results in seconds at full price. Either way, results land in D1 for querying via GET /api/feedback/:id. Toggle modes in the built-in demo form to test both paths instantly.
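The classification step described above amounts to turning Claude's response into a typed record before it lands in D1. A minimal sketch, assuming the model returns JSON; the field names and the extra category values are assumptions, not the repo's actual schema:

```typescript
// Assumed result shape; "feature_request" and "other" are guesses
// beyond the bug/praise categories mentioned in the review.
type Category = "bug" | "praise" | "feature_request" | "other";
type Priority = "low" | "medium" | "high";

interface Classification {
  category: Category;
  priority: Priority;
  summary: string;
}

function parseClassification(raw: string): Classification {
  const data = JSON.parse(raw) as Partial<Classification>;
  if (!data.category || !data.priority || !data.summary) {
    throw new Error("model response missing required fields");
  }
  return { category: data.category, priority: data.priority, summary: data.summary };
}

const result = parseClassification(
  '{"category":"bug","priority":"high","summary":"Checkout button crashes the app"}',
);
console.log(result.category); // logs "bug"
```

Validating the model output before writing to D1 keeps a malformed response from poisoning the stored records.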

Why is it gaining traction?

It bundles Cloudflare Queues for retries and dead-letter handling with Anthropic batching, letting you save 50% on tokens without managing polling or errors yourself, a cut that compounds quickly for high-volume inference. The cron-polled async flow keeps Workers lean, and the sync fallback avoids latency woes for demos or real-time needs. Developers grab it for the plug-and-play cost cut on feedback or support ticket routing.
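The 50% discount translates directly into API spend. A back-of-the-envelope sketch; the prices and volumes below are illustrative placeholders, only the 50% batch discount comes from the repo:

```typescript
// Assumed prices for illustration, not Anthropic's actual rates.
const SYNC_PRICE_PER_MTOK = 3.0; // assumed $ per million input tokens
const BATCH_PRICE_PER_MTOK = SYNC_PRICE_PER_MTOK / 2; // Message Batches: 50% off

function monthlyCost(items: number, tokensPerItem: number, pricePerMTok: number): number {
  return ((items * tokensPerItem) / 1_000_000) * pricePerMTok;
}

// Hypothetical workload: 200k feedback items/month at ~500 tokens each.
const sync = monthlyCost(200_000, 500, SYNC_PRICE_PER_MTOK); // 300
const batch = monthlyCost(200_000, 500, BATCH_PRICE_PER_MTOK); // 150
console.log(sync - batch); // 150: half the spend saved
```

Whatever the real rates, batching halves the token bill as long as minutes-scale latency is acceptable.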

Who should use this?

Cloudflare devs building SaaS apps that classify user inputs, like support teams auto-tagging tickets from web_app or app_store sources. Product managers analyzing feedback streams without burning LLM budgets. Early adopters tweaking it for custom prompts on sales_call transcripts or billing queries.

Verdict

Grab it if you're on Cloudflare and batching Anthropic calls: solid docs and local dev make setup fast, though 12 stars signal an early-stage project. Fork and scale for production; test coverage is light, but the demo proves it works out of the box.


