oliviazzzu / minimal-embodiment
A minimal hardware-software architecture giving large language models a closed-loop physical embodiment with self-perception loops.
This project provides software to connect AI models to a simple physical device: sensors for temperature, light, motion, and sound; outputs for haptic taps, a display face, and a buzzer; and self-perception loops that let the device detect its own actions.
How It Works
You come across this exciting idea of giving an AI a simple real-world body that can sense its surroundings and even perceive its own actions.
You follow easy instructions to get the connection software running on your laptop, creating a helpful bridge for the AI to talk to the body.
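The bridge can be sketched as a small relay that decodes sensor readings from the device and encodes commands back to it. This is a minimal sketch, not the repo's actual implementation: the wire format (newline-delimited JSON) and the field names `type`, `sensor`, `value`, and `actuator` are assumptions.

```python
import json

# Assumed wire format: one JSON object per line, in both directions.
# These field names are hypothetical, not taken from the repo.

def parse_sensor_line(line: str) -> dict:
    """Decode one newline-delimited JSON sensor reading from the device."""
    msg = json.loads(line)
    if msg.get("type") != "sensor":
        raise ValueError(f"unexpected message type: {msg.get('type')}")
    return msg

def encode_command(actuator: str, **params) -> str:
    """Encode a command for the device (e.g. a haptic tap, a buzz, a face)."""
    return json.dumps({"type": "command", "actuator": actuator, **params}) + "\n"
```

A line-oriented text protocol like this keeps the ESP32 side trivial to parse and makes the traffic easy to inspect with a serial monitor or `netcat`.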
You put together a small device using an ESP32 board, adding sensors for temperature, light, motion, and sound, plus outputs that can tap, show faces on a small screen, and buzz.
You link the device to your home WiFi and pair it with your computer software so it can share what it senses and receive instructions.
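Once the device is streaming over WiFi, the laptop side needs to keep a rolling snapshot of what it last sensed. A minimal sketch of that state cache, assuming the same newline-delimited JSON readings as above (the sensor names mirror the hardware list in this README but are otherwise hypothetical):

```python
import json

class DeviceState:
    """Rolling snapshot of the device's latest sensor readings.

    Each incoming line overwrites the previous value for that sensor,
    so snapshot() always reflects the most recent state of the room.
    """

    def __init__(self):
        self.readings = {}

    def update(self, line: str) -> None:
        # Assumed message shape: {"sensor": "<name>", "value": <number>}
        msg = json.loads(line)
        self.readings[msg["sensor"]] = msg["value"]

    def snapshot(self) -> dict:
        return dict(self.readings)
```

Keeping a latest-value cache rather than a full history means the AI can be handed one compact dict per turn instead of a raw event stream.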
You start the system and watch as the device comes alive, ready to sense the room and respond with taps, faces, and sounds.
You try buzzing or tapping, then check whether the device correctly detects its own sounds and vibrations – proving that it can perceive itself.
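The self-perception test above amounts to: act, then listen for your own action. A minimal sketch, where `send_command` and `read_sound_level` stand in for the bridge's real I/O (both callables and the threshold logic are assumptions, not the repo's code):

```python
def self_perception_check(send_command, read_sound_level,
                          baseline: float, threshold: float = 1.5) -> bool:
    """Buzz, then check whether the device's own sound sensor heard it.

    send_command(actuator, **params) -- hypothetical command sender.
    read_sound_level() -- hypothetical reader returning the current level.
    baseline -- sound level measured while the room was quiet.
    Returns True if the level after buzzing clearly exceeds the baseline.
    """
    send_command("buzzer", ms=300)
    heard = read_sound_level()
    return heard > baseline * threshold
```

The same pattern works for the haptic tap and the motion sensor: trigger an actuator, then confirm the matching sensor registers the disturbance.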
Hook it up to your AI assistant, and now it experiences the real world through senses, acts back, and even feels and hears what it does.
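One sense-think-act cycle with the AI can be sketched as: format the sensor snapshot into a prompt, ask the model for an action, and decode its reply. This is a hedged sketch only; `model` is any callable taking a prompt string and returning a JSON action string, so you can plug in whichever LLM client you use, and the prompt wording and action schema are assumptions.

```python
import json

def embodiment_turn(model, snapshot: dict) -> dict:
    """Run one sense-think-act cycle.

    model -- hypothetical callable: prompt string in, JSON action string out.
    snapshot -- latest sensor readings, e.g. {"light": 900, "sound": 0.3}.
    Returns the decoded action to forward to the device.
    """
    prompt = (
        "You are embodied in a small device. Current senses: "
        + json.dumps(snapshot)
        + '\nReply with one JSON action, e.g. {"actuator": "display", "face": "smile"}.'
    )
    return json.loads(model(prompt))
```

Feeding the returned action back through the command encoder, then watching the next snapshot, closes the loop the README describes: the AI senses, acts, and then perceives its own act.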