BAAI-Humanoid

MOSAIC: Bridging the Sim-to-Real Gap in Generalist Humanoid Motion Tracking and Teleoperation with Rapid Residual Adaptation

Found Feb 20, 2026 at 46 stars.
Language: Python

AI Summary

MOSAIC is an open-source system for training humanoid robots to accurately track and imitate diverse human motions in simulation for teleoperation.

How It Works

1. 👀 Discover MOSAIC

You find this project that teaches robots to copy human movements like dancing or walking.

2. 🛠️ Set up a robot playground

Download the free simulator to create a safe virtual world for your robot to practice in.

3. 📹 Gather movement examples

Collect videos or motion-capture data of people moving naturally to give your robot real-life examples to follow.

4. 🚀 Train your robot

Hit start and watch as your robot learns, step by step, to mimic those smooth human motions.

5. 🎮 Test and tweak

Play motions back in the simulator, watch your robot follow along, and fine-tune for better results.

6. 🔄 Adapt for real robots

Transfer the skills to actual hardware like the G1 or H1 for real-world teleoperation.

🎉 The robot moves like magic

Your humanoid robot now smoothly tracks and copies human motions across different styles and interfaces!
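The steps above can be sketched as an ordered command plan. Everything here is an illustrative assumption, not MOSAIC's verified CLI: the script names (`batch_csv_to_npz.py`, `train.py`, `play.py`), flags, and paths are stand-ins, so check the repo docs for the real entry points.

```python
# Hypothetical command plan mirroring the workflow steps above.
# Script names and flags are illustrative assumptions only.

def build_workflow(task: str = "General-Tracking-Flat-G1-v0") -> list[list[str]]:
    """Return ordered commands: convert data, train, then evaluate."""
    return [
        # Step 3: convert gathered motion data (e.g. CSV clips) to NPZ.
        ["python", "batch_csv_to_npz.py", "--input", "motions/csv", "--output", "motions/npz"],
        # Step 4: train a PPO-based tracking policy on the chosen task.
        ["python", "train.py", "--task", task],
        # Step 5: replay motions and inspect tracking quality.
        ["python", "play.py", "--task", task, "--checkpoint", "logs/latest.pt"],
    ]

plan = build_workflow()
```

A real run would execute each command in sequence (for example with `subprocess.run`); building the plan as data first makes it easy to log or dry-run.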


Star Growth

The repo grew from 46 to 65 stars.
AI-Generated Review

What is MOSAIC?

MOSAIC trains generalist humanoid robot policies for motion tracking and whole-body teleoperation in Python using Isaac Sim and Isaac Lab. It processes motion datasets from sources like AMASS or custom NPZ files, converting them via scripts like batch_csv_to_npz.py, then runs PPO-based training pipelines for robots like Unitree G1, H1_2, or Adam. Developers get stable, long-horizon behaviors that bridge sim-to-real gaps through rapid residual adaptation, supporting multi-GPU runs and shell-scripted workflows for GMT policies, adaptors, and distillation.
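To make the NPZ side of that pipeline concrete, here is a minimal sketch of writing and reading a motion clip as an NPZ archive. The key names (`qpos`, `fps`), frame count, and joint count are assumptions for illustration; MOSAIC's actual NPZ schema may differ.

```python
import os
import tempfile

import numpy as np

# Hypothetical motion-clip schema; MOSAIC's real NPZ layout may differ.
T, DOF = 120, 29  # 120 frames, 29 actuated joints (G1-like humanoid)
clip = {
    "qpos": np.zeros((T, DOF), dtype=np.float32),  # joint positions per frame
    "fps": np.float32(30.0),                       # playback frame rate
}

# np.savez stores each keyword argument as a named array in one file.
path = os.path.join(tempfile.gettempdir(), "demo_clip.npz")
np.savez(path, **clip)

# np.load on an .npz returns a dict-like NpzFile keyed by those names.
loaded = np.load(path)
frames, dof = loaded["qpos"].shape
```

Batch converters like `batch_csv_to_npz.py` presumably emit one such archive per source clip, which keeps each motion self-describing and cheap to load during training.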

Why is it gaining traction?

It stands out by enabling quick sim-to-real transfer for teleoperation across interfaces like VR or MoCap, using public datasets and modular tasks like General-Tracking-Flat-G1-v0. Scripts for motion replay, expert collection, and evaluation make prototyping fast, while residual adaptation preserves generality without full retraining. Among humanoid-robotics repos, it hits a sweet spot: adapting policies and bridging the sim-to-real gap without starting from scratch.
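The residual-adaptation idea described here, keeping a generalist base policy frozen and learning only a small correction on top, can be shown with a toy NumPy sketch. The linear stand-ins, dimensions, and zero-initialization are illustrative assumptions, not MOSAIC's actual architecture.

```python
import numpy as np

# Toy residual adaptation: the base policy stays frozen while a small
# residual term is learned on top. All shapes are assumptions.
rng = np.random.default_rng(0)
obs_dim, act_dim = 48, 12

W_base = rng.standard_normal((act_dim, obs_dim)) * 0.1  # frozen generalist policy
W_res = np.zeros((act_dim, obs_dim))                    # trainable residual, zero-init

def act(obs: np.ndarray) -> np.ndarray:
    # Combined action = base action + residual correction. Because the
    # residual starts at zero, adaptation begins exactly at the base policy.
    return W_base @ obs + W_res @ obs

obs = rng.standard_normal(obs_dim)
baseline = W_base @ obs  # what the unadapted policy would do
```

Only `W_res` would receive gradient updates during adaptation, which is why generality is preserved: zeroing the residual recovers the original generalist behavior.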

Who should use this?

Humanoid robotics engineers at labs like BAAI tuning Unitree G1 for real-world teleop, or researchers adapting MoCap data to sim policies in Isaac Lab. Ideal for teams needing multi-motion training on G1/H1_2 hardware, evaluating sim-to-real via play.py, or distilling expert trajectories for off-policy fine-tuning.

Verdict

Worth forking for Isaac Sim humanoid work: solid pipelines despite its early stage (44 stars at the time of review). Docs cover installation and training well, but expect tweaks for custom robots; test on small motion sets first.

