
hku-mars / UMI-3D

Part of the UMI-3D project: https://umi-3d.github.io/

AI Summary

A complete pipeline that transforms raw robot sensor recordings into ready-to-use datasets for training AI manipulation policies.

How It Works

1
🔍 Discover UMI-3D

You find the UMI-3D project online and see it helps turn robot videos into smart training lessons.

2
🔧 Build your robot

Follow simple guides to assemble affordable hardware with cameras and sensors for data capture.

3
📹 Record robot demos

Guide your robot through tasks like picking objects, saving short video clips and motion logs.
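
Under the hood, each demo typically lands in a rosbag. Here is a minimal sketch of inspecting one with ROS Noetic's rosbag Python API; the bag and topic names are assumptions, not the project's actual ones:

```python
import rosbag

# Hypothetical bag and topic names -- the real UMI-3D topics may differ.
with rosbag.Bag("demo_pick_cup.bag") as bag:
    # Summarize what was captured: message counts per topic.
    for topic, meta in bag.get_type_and_topic_info().topics.items():
        print(f"{topic}: {meta.message_count} msgs of {meta.msg_type}")

    # Stream camera frames and IMU packets in timestamp order.
    for topic, msg, t in bag.read_messages(
            topics=["/camera/image_raw", "/livox/imu"]):
        pass  # hand each message to the pipeline
```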

4
⚙️ Tune the sensors

Quickly align cameras and scanners using printed patterns so everything sees the world correctly.
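
UMI-3D's fisheye lenses would call for OpenCV's fisheye model (cv2.fisheye.calibrate), but a standard checkerboard calibration sketch shows the shape of this step; the pattern size, square size, and folder are assumptions:

```python
import glob
import cv2
import numpy as np

pattern, square = (9, 6), 0.025  # assumed 9x6 inner corners, 25 mm squares
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_pts, img_pts, size = [], [], None
for path in glob.glob("calib_frames/*.png"):  # hypothetical folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)
        size = gray.shape[::-1]

# Recover intrinsics K and distortion coefficients from all detections.
rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
print("reprojection RMS (px):", rms)
```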

5
🗺️ Map the play area

Run a mapping tool to track your robot's path and build a 3D view of the workspace.
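
The SLAM stage emits a timestamped 6-DoF trajectory, and camera frames then get poses by resampling it. A sketch of that alignment, assuming a TUM-style trajectory file (both filenames are hypothetical):

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

# Assumed row layout: t, x, y, z, qx, qy, qz, qw (TUM-style).
traj = np.loadtxt("slam_trajectory.txt")
t_slam, pos, quat = traj[:, 0], traj[:, 1:4], traj[:, 4:8]

# Camera frames tick at their own rate, so resample the 3D path:
# linear interpolation for position, spherical (Slerp) for rotation.
t_cam = np.clip(np.loadtxt("camera_timestamps.txt"), t_slam[0], t_slam[-1])
cam_rot = Slerp(t_slam, Rotation.from_quat(quat))(t_cam)
cam_pos = np.column_stack(
    [np.interp(t_cam, t_slam, pos[:, i]) for i in range(3)])
```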

6
📦 Prepare training data

Blend videos, paths, and motions into neat packages ready for teaching robot skills.
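
The packaged output is a Zarr store of synchronized arrays. A minimal sketch of one possible layout using zarr-python v2's group API; the array names and shapes are assumptions modeled on common UMI-style datasets, not the project's documented schema:

```python
import numpy as np
import zarr

n, h, w = 300, 224, 224  # assumed frame count and image size
root = zarr.open_group("demo_dataset.zarr", mode="w")
root.create_dataset("data/camera0_rgb", shape=(n, h, w, 3),
                    chunks=(32, h, w, 3), dtype="uint8")
root.create_dataset("data/robot0_eef_pos", shape=(n, 3), dtype="float32")
root.create_dataset("data/robot0_gripper_width", shape=(n, 1), dtype="float32")
# Episode boundaries let a trainer sample whole demonstrations.
root.create_dataset("meta/episode_ends", data=np.array([150, 300], dtype="int64"))
```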

7

🤖 Teach robot new tricks

Feed the data into learning tools and watch your robot master tasks smoothly.
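
From there, the Zarr store plugs straight into a trainer. A hedged PyTorch sketch that reads the hypothetical layout from step 6 and batches it for a policy learner:

```python
import torch
import zarr
from torch.utils.data import DataLoader, Dataset

class DemoDataset(Dataset):
    """Reads the hypothetical Zarr layout sketched in step 6."""
    def __init__(self, path):
        root = zarr.open_group(path, mode="r")
        self.rgb = root["data/camera0_rgb"]
        self.eef = root["data/robot0_eef_pos"]

    def __len__(self):
        return self.rgb.shape[0]

    def __getitem__(self, i):
        # Zarr indexing yields numpy; convert to CHW float tensors.
        img = torch.from_numpy(self.rgb[i]).permute(2, 0, 1).float() / 255.0
        return img, torch.from_numpy(self.eef[i])

loader = DataLoader(DemoDataset("demo_dataset.zarr"), batch_size=64, shuffle=True)
for imgs, targets in loader:
    pass  # feed each batch to your policy learner, e.g. a diffusion policy
```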

AI-Generated Review

What is UMI-3D?

UMI-3D turns raw rosbag recordings from LiDAR, IMU, and fisheye cameras into training-ready Zarr datasets for robot policy learning, such as diffusion policies. It handles sensor calibration, LiDAR-inertial SLAM for 3D trajectories, multi-modal alignment, and ArUco-based gripper calibration in a Python pipeline. See https://umi-3d.github.io/ for the full UMI-3D ecosystem, including hardware and policies.
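For the ArUco-based gripper calibration, the typical recipe is marker detection plus solvePnP. A sketch with OpenCV's ArUco module (4.7+ detector API); the dictionary, marker size, intrinsics, and frame path are placeholders, not the project's actual values:

```python
import cv2
import numpy as np

# Placeholder intrinsics -- in practice these come from the calibration step.
K = np.array([[600.0, 0, 320], [0, 600.0, 240], [0, 0, 1]])
dist = np.zeros(5)

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters())
gray = cv2.imread("gripper_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical frame
corners, ids, _ = detector.detectMarkers(gray)

if ids is not None:
    s = 0.03  # assumed 30 mm marker side length
    obj = np.array([[-s/2,  s/2, 0], [ s/2,  s/2, 0],
                    [ s/2, -s/2, 0], [-s/2, -s/2, 0]], np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj, corners[0].reshape(4, 2), K, dist)
    print("gripper tag pose in camera frame:", rvec.ravel(), tvec.ravel())
```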

Why is it gaining traction?

Unlike basic data loaders, it delivers end-to-end 3D spatial perception for manipulation tasks, syncing noisy real-world sensors into precise, policy-trainable formats without manual tweaking. Developers get SLAM-estimated camera paths and time-aligned videos out of the box, slashing preprocessing time for UMI-style data-collection workflows. The conda setup and scripted pipeline make it dead simple to run on Ubuntu with ROS Noetic.

Who should use this?

Robotics engineers collecting demos on UMI hardware for tasks like lifting or tool hanging, needing 3D-aware datasets fast. Ideal for researchers extending universal manipulation interfaces with SLAM, or teams training part-aware transformers on real trajectories.

Verdict

Solid for UMI-3D users despite just 19 stars; docs are thorough, but low adoption signals early maturity, so test on small rosbags first. Worth forking if you're in embodied AI.
