chu2bard

Server-sent events streaming library for LLM responses

16 · 0 · 100% credibility
Found Feb 11, 2026 at 16 stars
AI Analysis
TypeScript
AI Summary

Eventpipe is a library that processes and normalizes streaming responses from AI services like OpenAI, Anthropic, and Google into a unified format for handling text tokens, tool calls, and completion events.
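The "unified format" can be pictured as a discriminated union over the three event kinds the summary mentions. The type and field names below are illustrative assumptions, not eventpipe's actual API:

```typescript
// Hypothetical unified event shape; names are illustrative, not eventpipe's real API.
type StreamEvent =
  | { kind: "token"; text: string }                   // incremental text chunk
  | { kind: "tool_call"; name: string; args: string } // tool/function invocation
  | { kind: "done"; fullText: string };               // completion signal

// Consumers switch on `kind` regardless of which provider produced the event.
function render(event: StreamEvent): string {
  switch (event.kind) {
    case "token":
      return event.text;
    case "tool_call":
      return `[calling ${event.name}(${event.args})]`;
    case "done":
      return `<<done: ${event.fullText.length} chars>>`;
  }
}
```

The payoff of such a union is that UI code written once against it does not change when the backend swaps one provider for another.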

How It Works

1
🔍 Discover Eventpipe

You learn about a simple tool that makes AI responses stream in like real-time typing for smoother chats.

2
📦 Bring it home

You easily add this tool to your own project to handle live AI updates.

3
🔗 Connect your AI friends

You link it up with popular AI services so they can send replies as they think.

4
Watch the magic unfold

You send a question and see words appear one by one, feeling natural and alive.

5
🔄 Switch AI services effortlessly

It works seamlessly with different AI providers without any extra setup.

6
Catch the finish

You get a clear signal when the AI is done, wrapping up the full response.

🎉 Enjoy flowing conversations

Your creation now delivers engaging, real-time AI chats that feel human.
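Step 4 above (words appearing one by one) boils down to reading the response body incrementally instead of waiting for the whole payload. Here is a minimal sketch using only the standard Web Streams API; nothing in it is eventpipe-specific:

```typescript
// Reads a streamed response body chunk by chunk, the way a live
// token-by-token UI would. Uses only standard Web Streams APIs.
async function readBody(res: { body: ReadableStream<Uint8Array> | null }): Promise<string[]> {
  const chunks: string[] = [];
  if (!res.body) return chunks;
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // `stream: true` keeps multi-byte characters intact across chunk boundaries.
    chunks.push(decoder.decode(value, { stream: true }));
  }
  return chunks;
}
```

Each element of the returned array corresponds to one network chunk, which is what lets the UI render text as it arrives.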


AI-Generated Review

What is eventpipe?

Eventpipe is a TypeScript library for streaming server-sent events (SSE) from LLM APIs like OpenAI, Anthropic, and Google. It takes a fetch response, parses the raw SSE stream, and normalizes tokens, tool calls, or completion signals into simple callbacks—onToken for live text, onDone for full output, onError for issues. No more manual SSE parsing headaches in your Node.js or browser apps.
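To see what "manual SSE parsing headaches" means, here is a self-contained sketch of the kind of parsing such a library absorbs. The function name and callback signature are hypothetical (only the callback names onToken/onDone/onError come from the description above), and it assumes OpenAI-style `data:` frames ending in a `[DONE]` sentinel:

```typescript
// Hypothetical, self-contained sketch of a processStream-style helper.
// Parses OpenAI-style SSE frames ("data: {...}", ending with "data: [DONE]").
type Callbacks = {
  onToken?: (t: string) => void;
  onDone?: (full: string) => void;
  onError?: (e: unknown) => void;
};

function processSSEText(raw: string, cb: Callbacks): void {
  let full = "";
  try {
    for (const line of raw.split("\n")) {
      if (!line.startsWith("data: ")) continue; // skip blank lines and comments
      const payload = line.slice("data: ".length).trim();
      if (payload === "[DONE]") {               // OpenAI's end-of-stream sentinel
        cb.onDone?.(full);
        return;
      }
      const token = JSON.parse(payload).choices?.[0]?.delta?.content ?? "";
      if (token) {
        full += token;
        cb.onToken?.(token);
      }
    }
  } catch (e) {
    cb.onError?.(e);
  }
}
```

This version only handles a fully buffered body; a real stream delivers frames split across arbitrary chunk boundaries, which is exactly the buffering work the review credits processStream with doing.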

Why is it gaining traction?

Auto-detects the provider from the response format, so one API handles them all with no config tweaks and none of the custom SSE boilerplate you would otherwise write for a Next.js or React app. It fits just as well behind FastAPI, Spring Boot, or C# backends piping LLM responses, and for one-way streaming it sidesteps the SSE-vs-WebSockets question entirely. Devs love the drop-in processStream that buffers and yields clean events on the fly.
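One plausible way such auto-detection could work is to inspect the shape of each parsed event. The heuristics below reflect the public streaming formats of the three providers and are an assumption, not eventpipe's actual logic:

```typescript
// Illustrative provider detection from a parsed streaming event's shape.
// These heuristics mirror the public APIs; eventpipe's real logic may differ.
type Provider = "openai" | "anthropic" | "google" | "unknown";

function detectProvider(event: Record<string, unknown>): Provider {
  // OpenAI chat.completion.chunk events carry a `choices` array.
  if (Array.isArray(event.choices)) return "openai";
  // Gemini streaming responses carry a `candidates` array.
  if (Array.isArray(event.candidates)) return "google";
  // Anthropic emits typed events: message_start, content_block_delta, message_delta...
  const t = event.type;
  if (typeof t === "string" && (t.startsWith("content_block") || t.startsWith("message_"))) {
    return "anthropic";
  }
  return "unknown";
}
```

Because detection happens per stream rather than per call site, swapping providers needs no configuration change in the consuming code.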

Who should use this?

TypeScript devs building real-time AI UIs in Next.js or React who need live LLM token streaming without provider-specific hacks, and backend engineers in Node.js wiring SSE endpoints or Python/FastAPI proxies for chat apps. Skip it if your streaming needs are non-LLM SSE.

Verdict

Eventpipe delivers quick wins for LLM SSE normalization, but 16 stars and a 1.0% credibility score signal early immaturity: basic docs, no tests. Prototype with it now; wait for reviews before trusting it in production.


