raechao93/OpenGestureXR

Open-source SDK for AI-powered hand gesture interaction in XR (AR/VR). It provides real-time hand tracking and gesture recognition across XR platforms (Meta Quest, Pico, HoloLens, ARCore) through a single API built on OpenXR.

18 stars · 0 forks · 100% credibility
Found Mar 17, 2026 at 18 stars
AI Analysis (Python)
AI Summary

OpenGestureXR is an open-source toolkit for recognizing hand gestures from a webcam and using them to interact with objects in Unity-based XR applications.

How It Works

1. 🔍 Discover hand magic for VR

You find OpenGestureXR, a fun tool that lets you control virtual worlds with simple hand waves and poses.

2. 💻 Get the sensing software

Download the hand-sensing program to your computer and prepare it to watch your hand movements.

3. 🤲 Teach it your gestures

Point your webcam at your hand, hold poses like grab or point, and record examples so it learns your style.

4. 📹 Start watching your hands

Launch the hand tracker – it uses your computer's camera to spot and name your gestures in real time (see the sketch after this list).

5. 🎮 Add to your VR project

Place the special connectors into your Unity game to bring hand controls to life.

6. 🔗 Link your VR world

Connect your game to the hand tracker, and suddenly your real hands reach into virtual space.

Play with hand powers

Grab objects, point to select, thumbs up to confirm – your VR adventure responds to every natural gesture!
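Steps 3 and 4 boil down to a webcam loop that detects hand landmarks and maps them to a named pose. The snippet below is not OpenGestureXR's actual pipeline, just a minimal sketch of that kind of loop using MediaPipe Hands (which the review below says powers the tracker); the open-palm heuristic and the label names are illustrative assumptions.

```python
import cv2              # pip install opencv-python
import mediapipe as mp  # pip install mediapipe

mp_hands = mp.solutions.hands

def looks_like_open_palm(lm):
    # Crude heuristic: the four fingertips (landmarks 8, 12, 16, 20) sit
    # above their PIP joints (6, 10, 14, 18) in image space (smaller y = higher).
    return all(lm[tip].y < lm[pip].y
               for tip, pip in zip((8, 12, 16, 20), (6, 10, 14, 18)))

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV delivers BGR.
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            lm = result.multi_hand_landmarks[0].landmark
            print("open_palm" if looks_like_open_palm(lm) else "other")
        cv2.imshow("preview", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
            break
cap.release()
cv2.destroyAllWindows()
```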

AI-Generated Review

What is OpenGestureXR?

OpenGestureXR is a Python-based, open-source SDK that delivers real-time hand tracking and gesture recognition for XR apps, unifying input across platforms like Meta Quest, Pico, HoloLens, and ARCore via a single OpenXR API. Developers run a local FastAPI server with WebSocket or REST endpoints to stream gestures like grab, pinch, point, thumbs-up, and peace from any webcam, powered by MediaPipe. A Unity plugin hooks it into your scenes for instant object interaction, grab/release, and selection – perfect for prototyping without platform-specific hand tracking.
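To give a feel for the streaming side, here is a hypothetical consumer of that gesture WebSocket. The endpoint path, port, and message fields (gesture, confidence) are assumptions for illustration, not the SDK's documented schema.

```python
# Hypothetical consumer of the SDK's gesture WebSocket stream.
# Endpoint path and message schema are illustrative assumptions.
import asyncio
import json

import websockets  # pip install websockets

async def consume(url="ws://localhost:8000/ws/gestures"):
    async with websockets.connect(url) as ws:
        async for raw in ws:
            event = json.loads(raw)
            # Assumed fields: a gesture label plus a confidence score.
            if event.get("confidence", 0.0) >= 0.8:
                print(f"{event['gesture']} ({event['confidence']:.2f})")

if __name__ == "__main__":
    asyncio.run(consume())
```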

Why is it gaining traction?

It stands out as a self-hosted, open-source alternative to proprietary XR hand tracking, letting you train custom ONNX models from webcam data via simple CLI tools and benchmark latency on your own hardware. The gesture API streams at 30fps with confidence scores, and the Unity integration maps gestures to actions like grabbing objects, smoothing out flickers for reliable UX. Developers dig the cross-platform promise without vendor lock-in, akin to open-source SDKs for maps or PDFs but tuned for XR gestures.
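That flicker smoothing is easy to picture with a small sketch. The class below is an illustrative assumption, not the SDK's implementation: it takes the per-frame label stream and only reports a gesture change once the label wins a majority vote over recent frames, so a single misread frame can't flip the active gesture.

```python
from collections import Counter, deque

class GestureSmoother:
    """Majority vote over a sliding window of per-frame labels."""

    def __init__(self, window=8, min_votes=5):
        self.history = deque(maxlen=window)
        self.min_votes = min_votes
        self.current = None

    def update(self, gesture):
        self.history.append(gesture)
        label, votes = Counter(self.history).most_common(1)[0]
        if votes >= self.min_votes and label != self.current:
            self.current = label
            return label   # confirmed change of gesture
        return None        # nothing new to report this frame

smoother = GestureSmoother()
for frame_label in ["grab", "point", "grab", "grab", "grab", "grab"]:
    change = smoother.update(frame_label)
    if change:
        print("stable gesture:", change)  # fires once, on "grab"
```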

Who should use this?

Unity XR developers prototyping gesture UIs on Quest or Pico, indie AR/VR teams skipping native OpenXR hand-tracking boilerplate, or HoloLens builders needing a quick webcam-based fallback. Ideal for self-hosted setups where you control the AI pipeline, such as training custom gestures like peace or thumbs-up for your own interactions.

Verdict

Grab it for alpha-stage experiments: 18 stars signal early days, but the 100% credibility score and MIT-licensed bones are solid, so expect to tweak for production rather than write it off. Strong docs and tools make it a low-risk open-source SDK to fork across your XR stack.

