
haeli05 / seaclaw

Public

OpenClaw rewritten in C. Single static binary, <2MB, <50ms cold start.

17 stars · 0 forks · 100% credibility
Found Feb 17, 2026 at 16 stars.
AI Analysis · C

AI Summary

SeaClaw is a lightweight command-line and Telegram AI agent that converses with large language models, executes file and shell tools, and personalizes from workspace documents.

How It Works

1. 🐾 Discover SeaClaw

You find SeaClaw, a tiny super-fast AI buddy that helps with chats, files, and tasks right on your computer.

2. 📥 Place in your folder

Put the small program in your work folder alongside your identity notes, files like SOUL.md or AGENTS.md that describe your agent's personality.

3. 🔗 Link AI thinking

Connect it to a smart AI service so your buddy can understand and reply to you.

4. 💬 Start chatting

Open a conversation and ask questions – see instant, streaming replies that feel alive!

5. Pick your chat style

⌨️ Direct chat

Talk face-to-face in your terminal for quick back-and-forth.

📱 Message bot

Set up a bot to chat anytime via your messaging app.

6. 🛠️ Let it help with tasks

Your buddy reads notes, handles files, runs simple commands, and remembers your chats.

🎉 Productivity boost

Now you have a smart, always-ready helper that makes your work faster and easier!


Star Growth

This repo grew from 16 to 17 stars.
AI-Generated Review

What is seaclaw?

SeaClaw is OpenClaw rewritten in C—a lightweight AI agent that hooks into Anthropic Claude or OpenAI APIs for multi-turn chats with tool support like shell execution and file read/write. Drop it in a workspace with identity files (SOUL.md, AGENTS.md) to build a custom agent persona, then run via interactive CLI, one-shot queries, Telegram bot, or WebSocket gateway. Users get a single static binary under 2MB with cold starts below 50ms, perfect for quick local or server deploys without runtime bloat.

Why is it gaining traction?

It crushes the original Node.js OpenClaw on size (709KB vs 80MB), memory (2-5MB idle), and speed, slashing Docker images to 3MB while keeping streaming SSE responses and session persistence. Vendored TLS, JSON, and SQLite (used for memory search) mean zero external dependencies and instant starts, appealing to devs tired of npm dependency sprawl for edge or embedded AI agents. The static C binary runs anywhere, from servers to constrained environments.

Who should use this?

Backend devs building Telegram bots or cron-scheduled AI tasks that need low-latency tool loops without container overhead. Solo makers prototyping Claude-powered CLI assistants in workspaces for code review or automation. Ops folks replacing bloated Node.js AI stacks in production with a single static binary.

Verdict

Grab it if you need a featherweight C-based OpenClaw clone for fast, dep-free agents—18 stars and 1.0% credibility score signal early days with basic docs and no tests yet, but the <2MB binary and <50ms starts make it worth forking for real projects.


