hossaamwer/gesture-controlled-whiteboard

A real-time, gesture-controlled digital whiteboard built with Python, OpenCV, and MediaPipe. Features dual-layer smoothing (deadzones + Bezier curves) and low-latency WebSocket communication for professional-grade contactless drawing.

Found Feb 22, 2026 at 15 stars.
AI Analysis
AI Summary

A webcam-based digital whiteboard that interprets hand gestures for contactless drawing, erasing, shape creation, and document annotation in real-time via a web interface.

How It Works

1
🔍 Discover the Tool

You find a cool project online that lets you draw on a digital whiteboard using just hand gestures from your webcam.

2
💻 Set It Up

You download the files to your computer and follow easy steps to prepare everything for use.

3
🚀 Launch the Whiteboard

You start the program, open your web browser, and see your webcam feed with a blank canvas ready to go.

4
✋ Master the Gestures

You practice simple hand moves like pinching to draw or opening your palm to erase, and it tracks smoothly without touching anything.

5
🎨 Draw and Create

You wave your hand to move the cursor and draw colorful lines, shapes, and emojis, and the cursor follows your movements smoothly.

6
📄 Annotate Documents

You load a PDF or image onto the board and add notes or drawings right on top with your gestures.

7
💾 Save and Share

You export your artwork as an image or save the project to keep or share your contactless creations with others.
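The pinch-to-draw and palm-to-erase gestures in the steps above are typically classified by measuring distances between hand landmarks. A minimal sketch of that idea in pure Python, operating on plain (x, y) tuples so it runs without a webcam: the landmark indices follow MediaPipe's 21-point hand model (4 = thumb tip, 8 = index tip, 0 = wrist), while the function names and thresholds are illustrative assumptions, not the repo's actual code.

```python
# Hypothetical gesture classifier over 2D hand landmarks.
# Indices follow MediaPipe's 21-point hand model; the thresholds
# below are illustrative guesses, not values from this repo.
import math

WRIST, THUMB_TIP, INDEX_TIP = 0, 4, 8
MIDDLE_TIP, RING_TIP, PINKY_TIP = 12, 16, 20

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def classify(landmarks, pinch_thresh=0.05, open_thresh=0.25):
    """Return 'draw' on a pinch, 'erase' on an open palm, else 'idle'.

    `landmarks` is a list of 21 (x, y) tuples in normalized [0, 1]
    image coordinates, as MediaPipe emits them.
    """
    # Pinch: thumb tip and index tip nearly touching.
    if dist(landmarks[THUMB_TIP], landmarks[INDEX_TIP]) < pinch_thresh:
        return "draw"
    # Open palm: all four fingertips far from the wrist.
    tips = (INDEX_TIP, MIDDLE_TIP, RING_TIP, PINKY_TIP)
    if all(dist(landmarks[t], landmarks[WRIST]) > open_thresh for t in tips):
        return "erase"
    return "idle"
```

In a real pipeline the landmark list would come from MediaPipe's hand tracker per frame; hysteresis or a short voting window over recent frames is a common way to keep the mode from flickering at the thresholds.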


Star Growth

This repo grew from 15 to 25 stars.
AI-Generated Review

What is gesture-controlled-whiteboard?

This Python project delivers a contactless digital whiteboard where you draw via webcam hand gestures—pinch to sketch, palm to erase—streamed in real time over WebSockets to a browser canvas. Built with OpenCV and MediaPipe for gesture tracking, it tackles jittery input with dual-layer smoothing using deadzones and Bezier curves, plus tools for PDF annotation, shapes, grids, and PNG/JSON exports. Developers get a ready-to-run server that turns any laptop into a smooth, gesture-controlled drawing board.
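The dual-layer smoothing described above can be sketched in pure Python. This is an illustrative reconstruction under stated assumptions, not the repo's actual code: a deadzone pass (server side, per the description) drops sub-threshold jitter, and a quadratic Bezier pass (client side in the repo, on the browser canvas) rounds the surviving polyline.

```python
# Illustrative sketch of the two smoothing layers; function names,
# radii, and step counts are assumptions, not this repo's API.
import math

def deadzone(points, radius=3.0):
    """Layer 1: discard moves smaller than `radius` pixels,
    so hand tremor never reaches the canvas."""
    if not points:
        return []
    kept = [points[0]]
    for p in points[1:]:
        last = kept[-1]
        if math.hypot(p[0] - last[0], p[1] - last[1]) >= radius:
            kept.append(p)
    return kept

def bezier_midpoints(points, steps=8):
    """Layer 2: quadratic Beziers between consecutive segment
    midpoints, using each raw point as the control point --
    a common trick for smooth ink on an HTML canvas."""
    if len(points) < 3:
        return list(points)
    out = []
    for i in range(1, len(points) - 1):
        p0 = ((points[i-1][0] + points[i][0]) / 2,
              (points[i-1][1] + points[i][1]) / 2)
        p2 = ((points[i][0] + points[i+1][0]) / 2,
              (points[i][1] + points[i+1][1]) / 2)
        c = points[i]  # raw point steers the curve
        for s in range(steps):
            t = s / steps
            u = 1 - t
            out.append((u*u*p0[0] + 2*u*t*c[0] + t*t*p2[0],
                        u*u*p0[1] + 2*u*t*c[1] + t*t*p2[1]))
    out.append(points[-1])
    return out
```

Chaining the two (`bezier_midpoints(deadzone(raw_points))`) mirrors the described split: the deadzone kills noise before it crosses the wire, and the Bezier pass makes what remains look like ink rather than a connect-the-dots polyline.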

Why is it gaining traction?

It stands out with pro-grade ink rendering via client-side Bezier curves and server-side deadzone filtering, delivering fluid lines in real-time detection scenarios that beat basic OpenCV sketches. Low-latency WebSocket communication handles multi-layer canvases without lag, and extras like adjustable brushes, emojis, and multi-page workspaces make it a standout among real-time projects on GitHub. Developers also credit the hybrid approach with robust gesture classification under varying lighting.
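The low-latency WebSocket path reduces, in essence, to a compact per-update message protocol: the server pushes one small JSON frame per tracked cursor update and the browser applies it to the canvas. A hedged sketch of such a frame encoder using only the standard library; the field names (`t`, `x`, `y`, `mode`, `page`) are invented for illustration and may differ from the repo's actual wire format.

```python
# Hypothetical wire format for cursor updates; field names are
# illustrative assumptions, not this repo's actual protocol.
import json
import time

def encode_cursor_frame(x, y, mode, page=0):
    """Pack one tracked-cursor update as a compact JSON frame."""
    return json.dumps({
        "t": round(time.time(), 3),  # timestamp lets the client drop stale frames
        "x": round(x, 2),            # rounding keeps frames small on the wire
        "y": round(y, 2),
        "mode": mode,                # e.g. "draw", "erase", "idle"
        "page": page,                # multi-page workspace index
    }, separators=(",", ":"))        # no whitespace -> fewer bytes per frame

def decode_cursor_frame(raw):
    """Parse a frame back into a dict (what the browser client would do)."""
    return json.loads(raw)
```

Keeping frames this small, and timestamping them so late arrivals can be discarded, is what makes a 30 fps gesture stream feel responsive over a single WebSocket connection.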

Who should use this?

Computer vision tinkerers prototyping gesture UIs for kiosks or VR sketches. Presenters and teachers needing quick contactless whiteboards during hybrid meetings. Frontend devs experimenting with real-time Canvas APIs and WebSocket dashboards for interactive tools.

Verdict

Fun proof-of-concept for real-time gesture drawing, with solid docs and easy setup via pip install and python server.py -- but at 15 stars and 1.0% credibility, it's early-stage; expect to tweak the model for production use. Worth forking if you're into contactless drawing demos.


