RipeMangoBox

The official implementation of ReactDance: Hierarchical Representation for High-Fidelity and Coherent Long-Form Reactive Dance Generation (ICLR 2026)

Found Mar 10, 2026 at 10 stars.
AI Summary

ReactDance is a research tool for generating long, coherent dance sequences where one dancer realistically reacts to music and a partner's movements.

How It Works

1. 🌐 Discover ReactDance

Find the project and what it promises: partner dances that react to the music and to each other.

2. 💻 Prepare your workspace

Set up a Python environment on your machine to run the code.

3. 📥 Gather dance examples

Download the paired dance clips and music used to teach the system real moves.

4. 🧠 Teach basic dance blocks

The first training stage learns core body movements from the examples, building a vocabulary of motion.

5. 🎓 Train full dance partner

The second stage learns to produce smooth, reactive dances that follow the music and match a partner's style.

6. Generate new duets

Pick music and a leader's moves, then generate the reacting partner's dance.

7. 💃🕺 Enjoy your dance videos

Watch the high-quality, lifelike duet videos come to life, ready to share.
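The "dance blocks" stage above amounts to learning a discrete vocabulary of motion codes. A toy residual-quantization sketch of that idea, in pure NumPy — all names and shapes are illustrative, not the repo's actual API:

```python
import numpy as np

def quantize_residual(x, codebooks):
    """Toy hierarchical (residual) quantization: each level encodes
    whatever the coarser levels left unexplained."""
    codes, recon = [], np.zeros_like(x)
    for cb in codebooks:                              # coarse -> fine
        residual = x - recon                          # unexplained part
        # index of the nearest codeword for each frame
        dist = ((residual[:, None, :] - cb[None, :, :]) ** 2).sum(-1)
        idx = dist.argmin(axis=1)
        codes.append(idx)
        recon = recon + cb[idx]                       # accumulate levels
    return codes, recon

rng = np.random.default_rng(0)
motion = rng.normal(size=(8, 6))                      # 8 frames, 6-D features
books = [rng.normal(size=(16, 6)) for _ in range(3)]  # 3 levels, 16 codes each
codes, recon = quantize_residual(motion, books)
print(len(codes), recon.shape)                        # → 3 (8, 6)
```

Each frame is then described by one code per level, with finer levels refining what the coarse ones capture.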

Star Growth

The repo grew from 10 stars at discovery to 12.
AI-Generated Review

What is ReactDance?

ReactDance generates high-fidelity, coherent long-form dance motions in which a follower's movements react smoothly to music and a leader's poses. This official Python implementation (ICLR 2026) uses PyTorch and Lightning to train a two-stage pipeline: first a hierarchical quantizer that learns motion representations, then a diffusion model for generation. Users run simple CLI commands such as `python hfsq.py --mode train` or `python reactdance.py --mode sample` to produce duet videos and 3D poses from datasets like DD100LF.
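At a very high level, the second stage resembles a conditional denoising loop: start from noise and iteratively refine the follower's motion under the guidance of the music and the leader's poses. A minimal sketch with a stand-in denoiser — this is not the repo's network or API, just the shape of the idea:

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_denoiser(x_t, music, leader):
    """Stand-in for the learned network: nudge the noisy follower
    motion toward a 'reaction' defined by the conditioning signals."""
    target = 0.5 * music + 0.5 * leader     # toy reaction to conditions
    return x_t + 0.1 * (target - x_t)       # one small denoising step

def sample(music, leader, steps=50):
    x = rng.normal(size=music.shape)        # start from pure noise
    for t in reversed(range(steps)):
        x = toy_denoiser(x, music, leader)
        if t > 0:                           # re-inject a little noise
            x = x + 0.01 * rng.normal(size=x.shape)
    return x

music = rng.normal(size=(8, 6))             # per-frame audio features
leader = rng.normal(size=(8, 6))            # leader's pose features
follower = sample(music, leader)            # generated reactive motion
```

The real model replaces the hand-written `toy_denoiser` with a trained network operating on the quantized motion representations.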

Why is it gaining traction?

It stands out by tackling coherence in extended reactive dances: alternatives often jitter or lose fidelity over long sequences. The official repository ships ready-made configs, evaluation metrics (FID, MPJPE), and visualization tools, letting developers prototype duet generation quickly without building custom pipelines. Early buzz comes from its hierarchical representations, which enable stable outputs synced to both the music and the partner.
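Of the listed metrics, MPJPE (mean per-joint position error) is straightforward to compute: the average Euclidean distance between predicted and ground-truth joints. A minimal sketch with hypothetical shapes, assuming positions in meters:

```python
import numpy as np

def mpjpe(pred, gt):
    """Mean Per-Joint Position Error: average Euclidean distance
    between predicted and ground-truth joint positions."""
    return np.linalg.norm(pred - gt, axis=-1).mean()

gt = np.zeros((4, 22, 3))                   # 4 frames, 22 joints, xyz (m)
pred = gt + np.array([0.03, 0.0, 0.0])      # constant 3 cm error on x
print(round(float(mpjpe(pred, gt)), 4))     # → 0.03
```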

Who should use this?

Motion-generation researchers testing diffusion models on dance data, AI artists building interactive duet simulations, and multimodal ML developers exploring audio-pose conditioning. It is ideal for anyone with SMPL-X datasets who wants a quick baseline for long-horizon coherence.

Verdict

Grab it if you're in dance AI research: solid docs and a working CLI make it usable now, though its low star count (10 at discovery) signals early maturity. Adding tests and pretrained weights would broaden its appeal.

