ColinLu50

A modified OpenClaw Feishu/Lark channel plugin: the model streams its output and reports its current status so it can be monitored.

100% credibility
Found Mar 28, 2026 at 18 stars.
AI Analysis
TypeScript
AI Summary

Enhances OpenClaw AI agents with real-time streaming cards, execution traces, and metrics in Lark/Feishu group chats.

How It Works

1. 💬 Chat with your team in Feishu: you and your team use Feishu for work talk and want an AI helper that responds live.

2. 📦 Add the streaming helper: run a simple install command to bring real-time AI replies to your Feishu chats.

3. ⚙️ Turn on live streaming: flip a switch in settings so AI thoughts and answers appear as they form.

4. See the magic in action: type a question in group chat and watch the AI think step by step, call tools, and stream the full answer right there.

5. 📊 Check progress at a glance: the footer shows time taken, token usage, and status, so you know exactly what's happening.

🎉 Team AI chats feel alive: your Feishu group now has a super-smart assistant that responds instantly with full transparency.
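Concretely, the install and streaming-toggle steps map to the two commands quoted in the review below:

```shell
# Install the streaming-enabled Lark/Feishu channel plugin
npx @colinlu50/openclaw-lark-stream install

# Turn on live streaming for the Feishu channel
openclaw config set channels.feishu.streaming true
```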

AI-Generated Review

What is openclaw-lark-stream?

This TypeScript fork of the official OpenClaw Lark/Feishu channel plugin adds real-time streaming to interactive cards, so model outputs appear progressively instead of dumping all at once. It works in group chats too, displaying live agent reasoning, tool call indicators, and a footer with token counts, context usage, and elapsed time—all toggleable via config. Install via `npx @colinlu50/openclaw-lark-stream install` and tweak with `openclaw config set channels.feishu.streaming true`.
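As a rough illustration of the streaming mechanics (a sketch, not the plugin's actual code; the updater callback and throttle interval are assumptions), progressive rendering amounts to accumulating model chunks and flushing throttled edits to a single card:

```typescript
// Hypothetical card-update callback; in the real plugin this would call
// the Lark/Feishu interactive-card update API.
type CardUpdater = (text: string) => void;

// Accumulate streamed chunks and flush to the card at most once per
// interval, so the card edits progressively instead of once at the end.
function createStreamRenderer(update: CardUpdater, intervalMs = 500) {
  let buffer = "";
  let lastFlush = 0;
  return {
    push(chunk: string, now: number = Date.now()) {
      buffer += chunk;
      if (now - lastFlush >= intervalMs) {
        update(buffer);
        lastFlush = now;
      }
    },
    finish() {
      update(buffer); // final flush with the complete answer
      return buffer;
    },
  };
}
```

Calling `finish()` after the stream ends guarantees the complete answer lands in the card even when the last chunks arrive inside the throttle window.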

Why is it gaining traction?

Unlike the base openclaw-lark plugin, it brings group streaming and execution traces (reasoning blocks, tool timelines) that let you monitor agents without waiting for completion. The compact footer stats (tokens, cache hits) give instant observability, and it auto-detects OpenClaw versions for compatibility, so it drops into existing OpenClaw setups for seamless integration into Lark channels.
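A compact footer like the one described could be assembled along these lines (field names and layout are my assumptions, not the plugin's real format):

```typescript
// Per-run metrics the footer summarizes (hypothetical shape).
interface RunStats {
  elapsedMs: number;
  inputTokens: number;
  outputTokens: number;
  cachedTokens: number; // input tokens served from prompt cache
}

// Render one compact status line: elapsed time, token flow, cache-hit rate.
function formatFooter(s: RunStats): string {
  const secs = (s.elapsedMs / 1000).toFixed(1);
  const cachePct =
    s.inputTokens > 0 ? Math.round((s.cachedTokens / s.inputTokens) * 100) : 0;
  return `⏱ ${secs}s · 🔤 ${s.inputTokens}→${s.outputTokens} tokens · 💾 ${cachePct}% cached`;
}
```

Keeping the footer to one short line is what makes it readable inside a chat card, where vertical space is scarce.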

Who should use this?

OpenClaw users on Feishu/Lark teams needing live LLM responses in group workflows, like AI agents handling docs or calendars. Ideal for ops folks monitoring token burn or devs debugging tool chains in shared chats—skip if you stick to static replies.

Verdict

Grab this from the GitHub releases if streaming visibility hooks you; 18 stars signal early maturity with thin docs, but a 100% credibility score, solid TypeScript, and an MIT license make it a low-risk download for Lark streaming experiments. Test in a side project first.
