kandada / fastmind (Public)

A lightweight, event-driven multi-agent framework for embodied AI systems.

19 stars · 6 forks · 100% credibility
Found Mar 29, 2026 at 19 stars.
Language: Python
AI Summary

FastMind is a lightweight Python framework for building event-driven multi-agent AI systems with graph-based workflows, supporting streaming outputs, human approvals, tool calls, and sensor perceptions.

How It Works

1. 🔍 Discover FastMind

You hear about FastMind, a simple way to build smart helpers like chat companions or robot brains that react to the world.

2. 📥 Set it up

You install it on your computer in one step, and everything is ready to go.

3. 🧠 Build your first helper

You describe a chatting agent in a short definition, like writing a recipe, and connect it to an AI model to do the thinking.

4. 🔗 Add smart actions

You link everyday tools like weather checks or decision approvals so your helper can do more on its own.

5. ⏱️ Handle real-world inputs

You set up watchers for timers or sensors, letting your helper notice changes and respond instantly.

6. 💬 Chat and watch it work

You start a conversation or send a signal, and responses stream in real time, feeling alive and smooth.

7. 🔄 Pause for your input

When needed, it asks for your approval on big steps, and you guide it safely forward.

🎉 Your smart system shines

Now you have a reliable AI companion handling chats, decisions, or even guiding robots.
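The steps above follow a decorator-registers-handler pattern. Here is a minimal sketch of that pattern in plain asyncio; the `EventApp`, `agent`, and `emit` names are illustrative stand-ins, not fastmind's actual API:

```python
import asyncio


class EventApp:
    """Tiny illustrative event bus: decorators register handlers per event type."""

    def __init__(self):
        self.handlers = {}

    def agent(self, event_type):
        # Used as @app.agent("chat") to register an async handler.
        def register(fn):
            self.handlers[event_type] = fn
            return fn
        return register

    async def emit(self, event_type, payload):
        # Dispatch an event to its registered handler and await the reply.
        return await self.handlers[event_type](payload)


app = EventApp()


@app.agent("chat")
async def chat_agent(message):
    # Stand-in for an LLM call: return a canned reply.
    return f"You said: {message}"


async def main():
    return await app.emit("chat", "hello")


print(asyncio.run(main()))  # You said: hello
```

The point of the pattern is that agents never poll: they sit idle until an event is dispatched to them, which is what makes the framework cheap to run.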


AI-Generated Review

What is fastmind?

Fastmind is a Python framework for building event-driven multi-agent systems aimed at embodied AI such as robots and drones. It lets you wire agents, tools, and sensor perceptions into flowchart-style graphs using FastAPI-like decorators, handling async execution without polling or nested loops. Developers get session-isolated workflows with built-in streaming output, human-in-the-loop interrupts, and ReAct-style tool calls, a good fit for real-time applications that need a lightweight footprint.
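The "sensor perceptions without polling" idea can be sketched with a push-based timer source in plain asyncio; the `timer_perception` name and event shape here are hypothetical, not fastmind's API:

```python
import asyncio


async def timer_perception(interval: float, ticks: int, on_event):
    # A perception source: fires an event every `interval` seconds,
    # pushing into the agent instead of the agent polling a clock.
    for i in range(ticks):
        await asyncio.sleep(interval)
        await on_event({"tick": i})


async def main():
    seen = []

    async def agent(event):
        # The agent reacts to each perception event as it arrives.
        seen.append(event["tick"])

    await timer_perception(0.01, 3, agent)
    return seen


print(asyncio.run(main()))  # [0, 1, 2]
```

A real sensor (camera, IMU, MQTT topic) would replace the sleep loop, but the inversion is the same: the source awaits and pushes, the agent only runs when there is something to react to.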

Why is it gaining traction?

Its hook is the dead-simple API: define agents with `@app.agent`, perceptions with timers or sensors via `@app.perception`, and stream events through asyncio queues with minimal overhead. No heavy dependencies: roughly 8,000 lines pack in features like backpressure-controlled streaming and multi-user isolation that rival bulkier agent frameworks. Early adopters praise the runnable examples for chatbots, drones, and humanoid robots, which make prototyping fast without the LangGraph learning curve.
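Backpressure-controlled streaming over asyncio queues is a standard technique that can be shown generically: a bounded `asyncio.Queue` makes the producer await whenever the consumer lags, so neither side polls. This is a plain-asyncio sketch of the idea, not fastmind internals:

```python
import asyncio


async def produce_tokens(queue: asyncio.Queue, text: str):
    # A bounded queue blocks `put` when full, so a slow consumer
    # naturally throttles the producer (backpressure).
    for token in text.split():
        await queue.put(token)
    await queue.put(None)  # sentinel: end of stream


async def consume_tokens(queue: asyncio.Queue):
    received = []
    while (token := await queue.get()) is not None:
        received.append(token)
    return received


async def stream(text: str):
    queue = asyncio.Queue(maxsize=2)  # small buffer forces backpressure
    _, tokens = await asyncio.gather(
        produce_tokens(queue, text), consume_tokens(queue)
    )
    return tokens


print(asyncio.run(stream("events stream in real time")))
# ['events', 'stream', 'in', 'real', 'time']
```

Tuning `maxsize` trades memory for smoothness: a larger buffer absorbs bursts, a smaller one keeps the producer tightly coupled to consumer speed.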

Who should use this?

AI engineers prototyping embodied agents for drones, companion bots, or humanoid robots that need sensor loops and tool integration. LLM backend devs building ReAct loops with human approval gates, such as sleep-assessment or order-processing flows. Robotics teams seeking a lightweight alternative to verbose agent platforms for quick sensor-driven multi-agent coordination.
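The approval-gate half of those ReAct loops reduces to a simple control pattern: before executing a flagged tool call, consult a human callback and skip or proceed. A generic sketch with scripted steps (the `react_loop`, `approve`, and tool names are hypothetical, and a real loop would let an LLM choose each step):

```python
def react_loop(steps, tools, approve):
    """Run scripted (tool, arg, needs_approval) steps; gate the flagged ones."""
    trace = []
    for tool_name, arg, needs_approval in steps:
        if needs_approval and not approve(tool_name, arg):
            # Human rejected this step: record it and move on.
            trace.append((tool_name, "rejected"))
            continue
        trace.append((tool_name, tools[tool_name](arg)))
    return trace


tools = {
    "weather": lambda city: f"Sunny in {city}",
    "place_order": lambda item: f"Ordered {item}",
}

# Demo approval policy: allow read-only tools, reject anything that orders.
result = react_loop(
    [("weather", "Berlin", False), ("place_order", "drone parts", True)],
    tools,
    approve=lambda tool, arg: tool != "place_order",
)
print(result)
# [('weather', 'Sunny in Berlin'), ('place_order', 'rejected')]
```

In an interactive system the `approve` callback would suspend the workflow and resume it on the user's answer, which is what "human-in-the-loop interrupts" refers to above.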

Verdict

At 19 stars, fastmind is alpha-fresh with solid docs, examples, and tests, but expect bugs and note the GPL-3.0 copyleft. Grab it for lightweight agent experiments if you're prototyping embodied AI; skip it for production until it has more battle scars.
