Yuan-lab-LLM

A Kubernetes-first control plane for managing OpenClaw and Linux desktop runtimes at team and cluster scale.

381 stars · 54 · 100% credibility · Found Apr 01, 2026
AI Analysis
TypeScript
AI Summary

ClawManager is a browser-based platform for teams to create, access, and govern shared virtual desktops, including OpenClaw and Linux environments, with usage tracking and AI controls.

How It Works

1. 👥 Join a shared workspace team

You hear about ClawManager from your work team as an easy way to access personal virtual desktops without complicated setups.

2. 🔐 Admin welcomes you

The team lead creates your account, sets your resource limits, and prepares ready-to-use desktop options.

3. 🚀 Launch your desktop

Pick a desktop like OpenClaw or Ubuntu, choose your size, and click create – it spins up instantly in your browser.

4. 🖥️ Work securely anywhere

Connect through a safe web portal to use your full desktop, chat with AI helpers, and save your work automatically.

5. 📊 Team tracks everything

Admins watch resource use, AI chats, and costs while you focus on your tasks without interruptions.

🎉 Teamwork made simple

Your team collaborates on desktops with full control, backups, and smart AI – no servers to manage.

AI-Generated Review

What is ClawManager?

ClawManager is a Kubernetes-first control plane for managing OpenClaw and Linux desktop runtimes at team and cluster scale. It lets admins deploy virtual desktops—like OpenClaw agents, Ubuntu, or Webtop—via a web dashboard, handling user quotas, lifecycle ops, and secure browser access without exposing pods. Built with a TypeScript/React frontend and a Go backend, it proxies traffic internally while adding an AI Gateway for governed LLM calls with audit trails and risk controls.
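The token-based access model described above isn't spelled out in this summary, but the general shape is familiar: mint a short-lived token binding a user to a desktop, and have the proxy verify it before upgrading the WebSocket connection. A minimal sketch, assuming an HMAC scheme (all function names and the token format are illustrative, not ClawManager's actual code):

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Hypothetical: issue a short-lived token binding a user to a desktop pod.
// Token format (assumed): "user:desktop:expiresUnixSecs:hexSignature".
function issueToken(secret: string, user: string, desktop: string, expires: number): string {
  const payload = `${user}:${desktop}:${expires}`;
  const sig = createHmac("sha256", secret).update(payload).digest("hex");
  return `${payload}:${sig}`;
}

// The proxy would check this before upgrading a WebSocket to the pod.
function verifyToken(secret: string, token: string, now: number): boolean {
  const parts = token.split(":");
  if (parts.length !== 4) return false;
  const payload = parts.slice(0, 3).join(":");
  const sig = parts[3];
  const expected = createHmac("sha256", secret).update(payload).digest("hex");
  // Length must match before the constant-time comparison.
  if (sig.length !== expected.length) return false;
  const ok = timingSafeEqual(Buffer.from(sig), Buffer.from(expected));
  return ok && now < Number(parts[2]);
}
```

Because the token carries its own expiry and signature, the proxy can validate it statelessly, which keeps desktop pods cluster-internal with no VPN in the path.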

Why is it gaining traction?

It stands out by keeping desktops cluster-internal with token-based WebSocket proxying, avoiding VPN headaches, and bundling LLM governance—model routing, cost tracking, risk blocking—right into the desktop runtime. Quick K8s deploys via YAML manifests, CSV user imports, and multi-language docs (English, Chinese, Japanese, Korean, German) lower barriers for global teams. At 381 stars, it's pulling devs who want self-service VDI without vendor lock-in.
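The CSV user import mentioned above could look something like this on the parsing side. The column layout (`username,cpuLimit,memLimit`) is a guess for illustration; the repo's documented format may differ:

```typescript
// Assumed CSV shape: a header row followed by username,cpuLimit,memLimit rows.
interface ImportedUser {
  username: string;
  cpuLimit: string; // e.g. "2" (cores)
  memLimit: string; // e.g. "4Gi"
}

function parseUserCsv(csv: string): ImportedUser[] {
  const lines = csv.trim().split(/\r?\n/);
  const rows = lines.slice(1); // skip the header row
  return rows
    .filter((line) => line.trim().length > 0)
    .map((line) => {
      const [username, cpuLimit, memLimit] = line.split(",").map((f) => f.trim());
      if (!username) throw new Error(`missing username in row: ${line}`);
      return { username, cpuLimit, memLimit };
    });
}
```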

Who should use this?

DevOps teams running GPU-accelerated desktops for ML groups on K8s who need quota enforcement and safe OpenClaw scaling. AI engineers who want controlled LLM access inside virtual Linux environments, with audit logs for compliance. Teams with small clusters evaluating team-scale remote dev workspaces before committing to enterprise tooling.
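Quota enforcement of the kind described usually reduces to a simple admission check before a desktop launches. This is an illustrative sketch under assumed names, not ClawManager's actual logic:

```typescript
// Hypothetical per-user quota and current usage shapes.
interface Quota { maxCpu: number; maxMemGi: number; maxDesktops: number; }
interface Usage { cpu: number; memGi: number; desktops: number; }

// Admit a launch only if the new desktop keeps the user within every limit.
function canLaunch(quota: Quota, used: Usage, reqCpu: number, reqMemGi: number): boolean {
  return (
    used.desktops + 1 <= quota.maxDesktops &&
    used.cpu + reqCpu <= quota.maxCpu &&
    used.memGi + reqMemGi <= quota.maxMemGi
  );
}
```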

Verdict

Grab it if you're on K8s and need lightweight VDI with LLM smarts: the docs are solid, it's MIT-licensed, and the one-command deploy works. The modest star count signals early maturity; test in dev clusters first and watch for production hardening.


