unidex-ai / UniDex

[CVPR 2026] UniDex: A Robot Foundation Suite for Universal Dexterous Hand Control from Egocentric Human Videos

AI Summary

UniDex is a research codebase for processing human hand videos, retargeting motions to various robot hands, and training AI foundation models for dexterous robotic manipulation.

How It Works

1. 🔍 Discover UniDex

You hear about UniDex, a tool that teaches robots to use their hands the way humans do by learning from everyday first-person videos.

2. 🛠️ Prepare your workspace

You install the dependencies and download the ready-made assets to get everything set up on your computer.
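
For a sense of what fetching those assets might look like, here is a minimal sketch using huggingface_hub (the review notes checkpoints live on Hugging Face). The Hub repo id and target directory are assumptions for illustration, not paths documented by UniDex.

```python
# Hypothetical asset download via the Hugging Face Hub client.
# "unidex-ai/UniDex" mirrors the GitHub name; the real Hub repo id
# and file layout are not confirmed by the repo docs.
from huggingface_hub import snapshot_download

assets_dir = snapshot_download(
    repo_id="unidex-ai/UniDex",  # assumed Hub location
    local_dir="assets",          # assumed local target directory
)
print(f"Assets downloaded to {assets_dir}")
```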

3. 📹 Collect human videos

You download videos of people using their hands for tasks like grabbing objects or pouring drinks, from datasets such as H2O, HOI4D, Hot3D, and Taco.

4. Translate to robot moves

UniDex retargets those human hand movements into joint trajectories that robot hands like Inspire, Leap, or Oymotion can follow.
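
To make "translate to robot moves" concrete, here is a toy, self-contained sketch of the core idea: solve for robot joint angles whose forward kinematics match a human fingertip keypoint. UniDex's actual retargeter handles full multi-finger hands with real kinematic models; the 2-link finger, link lengths, and joint limits below are invented for illustration.

```python
# Toy retargeting: fit joint angles of a planar 2-joint finger so its
# tip tracks a human fingertip keypoint from one video frame.
import numpy as np
from scipy.optimize import least_squares

LINK_LENGTHS = np.array([0.04, 0.03])  # made-up phalanx lengths (meters)

def fingertip(joint_angles: np.ndarray) -> np.ndarray:
    """Planar forward kinematics for a 2-joint finger."""
    a1, a2 = joint_angles
    x = LINK_LENGTHS[0] * np.cos(a1) + LINK_LENGTHS[1] * np.cos(a1 + a2)
    y = LINK_LENGTHS[0] * np.sin(a1) + LINK_LENGTHS[1] * np.sin(a1 + a2)
    return np.array([x, y])

def retarget(human_tip: np.ndarray) -> np.ndarray:
    """Solve for joint angles whose fingertip matches the human keypoint."""
    result = least_squares(
        lambda q: fingertip(q) - human_tip,  # position residual
        x0=np.array([0.3, 0.3]),             # neutral initial pose
        bounds=([0.0, 0.0], [1.6, 1.6]),     # assumed joint limits (rad)
    )
    return result.x

# One frame's fingertip position (hypothetical data):
print(retarget(np.array([0.05, 0.03])))
```

Per-frame solves like this, run over a whole video, yield the joint trajectories the next step trains on.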

5. 🧠 Train the robot brain

You run a training session in which a policy learns precise hand control from the retargeted trajectories.
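
A hedged sketch of what that training session could look like, assuming the PyTorch Lightning stack the review below describes. DexPolicy, the feature/target dimensions, and the random stand-in data are all placeholders, not UniDex's real classes.

```python
# Placeholder policy and data; only the PyTorch Lightning scaffolding
# reflects what the repo is described as using.
import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset

class DexPolicy(pl.LightningModule):
    """Toy policy: regress robot joint targets from pose features."""
    def __init__(self, in_dim: int = 63, out_dim: int = 16, lr: float = 1e-4):
        super().__init__()
        self.save_hyperparameters()  # lets finetuning override lr later
        self.net = torch.nn.Sequential(
            torch.nn.Linear(in_dim, 256), torch.nn.ReLU(),
            torch.nn.Linear(256, out_dim),
        )

    def training_step(self, batch, batch_idx):
        features, joint_targets = batch
        loss = torch.nn.functional.mse_loss(self.net(features), joint_targets)
        self.log("train/loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=self.hparams.lr)

# Random tensors stand in for (pose features, retargeted joint targets).
data = DataLoader(
    TensorDataset(torch.randn(256, 63), torch.randn(256, 16)), batch_size=32
)
trainer = pl.Trainer(max_epochs=1, accelerator="auto", devices="auto")
trainer.fit(DexPolicy(), train_dataloaders=data)
```

The multi-GPU training the review mentions is a Trainer flag away in Lightning (e.g. devices=4).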

6. ⚙️ Customize for real use

You finetune the model on your own robot data so it transfers to your specific hardware.
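
Continuing the sketch above, finetuning could amount to loading the pretrained checkpoint and training again at a lower learning rate on your own demonstrations. The checkpoint path and hyperparameters are illustrative, not repo defaults.

```python
# Continues the DexPolicy sketch from the training step.
import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset

# Override the saved lr with a gentler one to avoid clobbering pretraining.
policy = DexPolicy.load_from_checkpoint("checkpoints/pretrained.ckpt", lr=1e-5)

# Random tensors stand in for your own robot's demonstration data.
own_data = DataLoader(
    TensorDataset(torch.randn(64, 63), torch.randn(64, 16)), batch_size=16
)
pl.Trainer(max_epochs=5, accelerator="auto").fit(policy, train_dataloaders=own_data)
```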

🤖 Robot masters dexterous tasks

Your robot now grabs, pours, and manipulates objects with human-like skill and precision!

AI-Generated Review

What is UniDex?

UniDex converts egocentric human videos into actionable trajectories for dexterous robot hands, enabling universal control across datasets like H2O, HOI4D, Hot3D, and Taco. You download raw videos, run retargeting to map human poses to robot kinematics for hands like Inspire, Leap, or Oymotion, then pre-train or finetune vision-language models via simple Python scripts. Built on PyTorch Lightning and Hydra, it outputs checkpoints deployable on Hugging Face for real-world robot manipulation.
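
Given the PyTorch Lightning plus Hydra stack, the entry points presumably look something like the sketch below. The config path and the override names (hand=leap, dataset=hoi4d) are guesses at a plausible layout, not UniDex's actual config schema.

```python
# Hypothetical Hydra entry point; assumes a configs/train.yaml exists.
import hydra
from omegaconf import DictConfig, OmegaConf

@hydra.main(config_path="configs", config_name="train", version_base=None)
def main(cfg: DictConfig) -> None:
    # Overrides compose from the CLI, e.g.:
    #   python train.py hand=leap dataset=hoi4d trainer.devices=4
    print(OmegaConf.to_yaml(cfg))  # inspect the resolved config

if __name__ == "__main__":
    main()
```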

Why is it gaining traction?

Tied to a CVPR 2026 paper, it unifies preprocessing and training across multiple datasets and robot hands, something that was scattered across earlier CVPR 2024/2025 codebases. Developers like the end-to-end pipeline: asset downloads, one-command retargeting, multi-GPU training. It beats cobbling together custom IK solvers and dataset loaders.

Who should use this?

Robotics researchers training dexterous hands from human demos, such as pouring coffee with an Inspire hand or grasping in HOI4D setups. Suited for labs building on CVPR 2026 work, and for engineers adapting egocentric-video policies to new hardware without rebuilding from scratch.

Verdict

Grab it for CVPR 2026-era experiments, but at 46 stars it's raw: training is marked "in development" and the docs focus on setup. A solid research starter; expect to debug before any production use.
