luoxyhappy / CoInteract
Official Implementation of CoInteract: Spatially-Structured Co-Generation for Interactive Human-Object Video Synthesis
CoInteract is a research project for speech-driven synthesis of realistic videos of humans interacting with objects, with spatial control for natural movements; code and models are forthcoming.
How It Works
CoInteract generates lifelike videos of people talking while handling everyday objects, driven by speech input. Its spatially-structured co-generation coordinates hands, faces, and objects so that interactions look natural without extra manual effort. The linked project page and paper explain how the method keeps the generated videos both realistic and controllable. Code and pretrained models have not yet been released; star the repository to be notified when they are. Once available, you will be able to generate your own custom videos of humans and objects interacting.