NVIDIA / soma-retargeter

SOMA BVH to humanoid robot motion retargeting library built with Newton and NVIDIA Warp

Found Mar 18, 2026 at 100 stars.
AI Summary

A tool that converts human motion capture data from the SOMA body model into joint position files playable on Unitree G1 humanoid robots using GPU-accelerated inverse kinematics.

How It Works

1
🔍 Discover the motion converter

You hear about a fun tool from NVIDIA that turns human motion-capture recordings of dancing or walking into moves a robot can copy.

2
💻 Set up on your computer

You create a virtual environment for the tool and install it in a few easy steps; the only hardware requirement is a modern NVIDIA graphics card.

3
🚀 Open the 3D preview window

You start the viewer and see an empty 3D space ready for action, with handy side panels.

4
📁 Load your human motion

You pick a motion capture file of a person moving, and watch the lifelike human figure appear and dance in 3D.

5
⚙️ Transform to robot moves

With one click, the tool matches the human actions to a robot body right beside it, smoothly adjusting arms, legs, and torso in real-time.

6
💾 Save or process more

You save the robot-ready motion file, or point it at a folder to convert lots of clips at once without watching.

🤖 Robot dances your motion!

Your robot now mirrors the human moves you captured, bringing animations to life on real hardware.
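The batch step above amounts to a folder walk over BVH clips. A minimal sketch, assuming a hypothetical `retarget_bvh_to_csv` stand-in for the library's actual converter (which is not shown here):

```python
from pathlib import Path

def retarget_bvh_to_csv(bvh_path: Path, out_dir: Path) -> Path:
    """Hypothetical stand-in for the library's BVH-to-CSV retargeting call.
    Here it only writes a CSV header so the batch loop is runnable."""
    out_path = out_dir / bvh_path.with_suffix(".csv").name
    out_path.write_text("frame,joint,angle\n")
    return out_path

def batch_convert(bvh_dir: str, out_dir: str) -> list:
    """Convert every .bvh clip in a folder headless, no viewer needed."""
    src, dst = Path(bvh_dir), Path(out_dir)
    dst.mkdir(parents=True, exist_ok=True)
    return [retarget_bvh_to_csv(p, dst) for p in sorted(src.glob("*.bvh"))]
```

Sorting the glob keeps output order deterministic across runs, which matters when diffing batch results.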

AI-Generated Review

What is soma-retargeter?

Soma-retargeter is a Python library built with Newton and NVIDIA Warp that converts BVH human motion captures from the SOMA body model into CSV joint animations playable on humanoid robots like Unitree G1. It solves the pain of manually adapting mocap data to robot kinematics by applying proportional scaling, multi-objective GPU IK, feet stabilization, and joint limits. Users get an interactive OpenGL viewer for scrubbing and tweaking, or headless batch conversion of BVH folders to mirrored CSV outputs.
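The BVH input side is plain text: a HIERARCHY section declaring the skeleton, then a MOTION section of per-frame channel values. A minimal illustration of reading joint names from that format (my own sketch, independent of this library's parser):

```python
def bvh_joint_names(bvh_text: str) -> list:
    """Collect ROOT/JOINT names from the HIERARCHY section of a BVH file.
    Real parsers also read each joint's OFFSET, CHANNELS, and the MOTION
    frames; this only illustrates the skeleton layout."""
    names = []
    for line in bvh_text.splitlines():
        parts = line.split()
        if len(parts) == 2 and parts[0] in ("ROOT", "JOINT"):
            names.append(parts[1])
    return names
```

The retargeter's job is then to map each such source joint onto the robot's joint set and emit one CSV row per frame.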

Why is it gaining traction?

It stands out in the motion retargeting space with NVIDIA Warp's high-performance compute for fast IK on large batches, plus built-in handling for SOMA's uniform skeleton, so no fiddly custom scripting is needed. The quickstart viewer lets you load, retarget, and export in seconds, while configs for scaling and stabilization keep the robot's feet planted realistically. It also ties into NVIDIA's wider ecosystem, including SOMA-X and the SEED dataset, for ready-made human motions.
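The actual solver runs as Warp GPU kernels, but in spirit the IK step resembles damped least-squares: iteratively nudge joint angles so the end effector tracks a target. A CPU sketch on a planar two-link arm (my own illustration, not the library's solver):

```python
import numpy as np

def fk(thetas, lengths):
    """Forward kinematics: end-effector position of a planar joint chain."""
    angles = np.cumsum(thetas)
    return np.array([np.sum(lengths * np.cos(angles)),
                     np.sum(lengths * np.sin(angles))])

def jacobian(thetas, lengths):
    """d(position)/d(theta): joint i moves every link from i onward."""
    angles = np.cumsum(thetas)
    J = np.zeros((2, len(thetas)))
    for i in range(len(thetas)):
        J[0, i] = -np.sum(lengths[i:] * np.sin(angles[i:]))
        J[1, i] = np.sum(lengths[i:] * np.cos(angles[i:]))
    return J

def ik_dls(target, lengths, iters=200, damping=1e-2):
    """Damped least-squares IK: theta += J^T (J J^T + lambda*I)^-1 * error.
    The damping term keeps steps stable near kinematic singularities."""
    thetas = np.array([0.3, 0.3])
    for _ in range(iters):
        err = target - fk(thetas, lengths)
        J = jacobian(thetas, lengths)
        JJt = J @ J.T + damping * np.eye(2)
        thetas += J.T @ np.linalg.solve(JJt, err)
    return thetas
```

The library's multi-objective variant would additionally weigh terms like foot placement and joint limits in the same solve; this sketch tracks only one end-effector target.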

Who should use this?

Robotics engineers porting human mocap to Unitree G1 or similar bipeds for locomotion testing. Animation pipelines devs using SOMA data who need robot-compatible outputs without Blender roundtrips. NVIDIA Warp/Newton users prototyping humanoid motion retargeting.

Verdict

Grab it if you're in humanoid robotics and need fast BVH-to-CSV retargeting: solid for prototypes, even if its modest 84 stars signal early days. Actively developed under Apache 2.0, but expect API tweaks; test with the included samples first.
