nv-tlabs / kimodo

Official implementation of Kimodo, a kinematic motion diffusion model for high-quality human(oid) motion generation.

426 stars · 23 forks · 100% credibility · Found Mar 19, 2026
AI Analysis
Python
AI Summary

Kimodo generates high-quality 3D human and robot motions from text prompts and kinematic constraints like poses and paths, with an interactive demo, CLI tools, and exports for animation and robotics.

How It Works

1
🔍 Discover Kimodo

You hear about Kimodo, a fun tool that turns simple text descriptions into realistic human or robot movements, perfect for animations or robotics ideas.

2
🚀 Launch the demo

With one easy click, you open the interactive playground in your web browser to start creating motions right away.

3
💭 Describe the motion

Type what you want to see, like 'a person walks forward then jumps', and adjust the timing on a simple timeline.

4
🎯 Add poses or paths

Optionally drag the character to set key poses, draw ground paths, or guide hands and feet exactly where you need them.

5
✨ Generate variations

Hit generate to watch multiple motion options appear in 3D, bringing your description to life instantly.

6
▶️ Preview and refine

Play back the motions, tweak prompts or controls, and pick your favorite to perfect it.

7
💾 Export your animation

Save as standard formats ready for your animation software, robot simulator, or video project.

🎉 Your motion is ready!

Share or use your custom animation, seeing your ideas move just as imagined.
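The export step above can be sketched in Python. This is a minimal, hypothetical example of inspecting a Kimodo SMPL-X NPZ export: the key names (`poses`, `trans`) follow the common SMPL-X NPZ convention and are an assumption, not confirmed by the repo, and a stand-in file is generated so the sketch runs without Kimodo installed.

```python
import numpy as np

def summarize_motion(npz_path):
    # Hypothetical inspector for an SMPL-X-style NPZ motion export.
    # Key names ("poses", "trans") are assumed, not documented by Kimodo.
    data = np.load(npz_path)
    return {
        "frames": data["poses"].shape[0],        # number of motion frames
        "pose_dims": data["poses"].shape[1],     # per-frame joint rotation dims
        "has_translation": "trans" in data.files # root translation track present?
    }

# Build a stand-in file so the sketch runs without Kimodo installed.
np.savez("demo_motion.npz",
         poses=np.zeros((120, 165)),   # 120 frames of a full-body pose vector
         trans=np.zeros((120, 3)))     # root translation per frame
print(summarize_motion("demo_motion.npz"))
# → {'frames': 120, 'pose_dims': 165, 'has_translation': True}
```

Any downstream tool (a Blender importer, a retargeting script) can use the same pattern to sanity-check frame counts before playback.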


AI-Generated Review

What is kimodo?

Kimodo generates high-fidelity 3D human and humanoid motions from text prompts like "a person walks forward" combined with kinematic constraints such as full-body poses, hand/foot targets, or 2D ground paths. This Python project from NVIDIA delivers CLI tools (`kimodo_gen` for batch generation, `kimodo_demo` for interactive timeline editing) and exports to BVH, MuJoCo CSV, or SMPL-X NPZ formats. It handles robots like the Unitree G1 alongside human skeletons, drawing on large mocap datasets for commercial-grade output.
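The MuJoCo CSV export mentioned above can be sketched as follows. The assumed layout (a header row, then one row per frame with a timestamp and one column per joint coordinate) is illustrative only, not the repo's documented format.

```python
import csv
import io

def write_motion_csv(frames, joint_names):
    # Hypothetical writer for a MuJoCo-style motion CSV: one row per
    # frame, one column per joint coordinate. The layout is an
    # assumption for illustration, not Kimodo's documented schema.
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["time"] + joint_names)  # header row
    for t, qpos in frames:
        writer.writerow([f"{t:.3f}"] + [f"{q:.5f}" for q in qpos])
    return buf.getvalue()

# Two frames of a toy 3-joint motion at ~30 fps.
csv_text = write_motion_csv(
    [(0.0, [0.0, 0.1, 0.2]), (0.033, [0.01, 0.12, 0.21])],
    ["hip", "knee", "ankle"],
)
print(csv_text.splitlines()[0])  # → time,hip,knee,ankle
```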

Why is it gaining traction?

Unlike basic text-to-motion tools, Kimodo excels at precise control: you can pin exact poses or paths without retraining, via an intuitive Gradio demo with real-time 3D preview and multi-sample comparison. Docker Compose simplifies setup (17 GB of VRAM needed), and exports feed directly into ProtoMotions simulations or GMR retargeting. As NVIDIA's official implementation, with pretrained models released on Hugging Face, it is drawing robotics developers who need high-quality motion generation.

Who should use this?

Humanoid robotics engineers generating trajectories for Unitree G1 or SOMA sims; animators prototyping constrained sequences in Blender via BVH; researchers benchmarking text-conditioned motion on BONES-SEED. Ideal for teams building physical AI policies or video-to-motion pipelines like GEM.

Verdict

Grab it if you need controllable motion generation: the docs are thorough, the license is Apache 2.0, and the CLI and demo work out of the box. 426 stars and a 100% credibility score signal early maturity, but the integrations and output quality punch above that weight.


