F-R-L / forge-film

Multi-model, DAG-driven parallel AI film generation — parallel speedup scales with scene independence. Generate film scenes simultaneously instead of one by one: it turns the film-generation execution graph from a plain topological order into critical-path-optimal (CPM) scheduling, and claims to be the only film-generation engine that models scene narrative dependencies as a DAG and drives parallel scheduling with the CPM algorithm.
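The CPM claim can be illustrated with a minimal sketch: given per-scene durations and narrative dependencies, critical-path scheduling computes the earliest start of each scene, so independent scenes overlap and the total runtime shrinks to the length of the longest dependency chain. The scene names, durations, and the `critical_path_schedule` helper below are hypothetical, not taken from the repo.

```python
from collections import defaultdict

def critical_path_schedule(durations, deps):
    """Earliest-start times via CPM: a scene may start only after
    every scene it depends on has finished."""
    # Kahn's algorithm for topological order
    indegree = {s: 0 for s in durations}
    children = defaultdict(list)
    for scene, parents in deps.items():
        for p in parents:
            children[p].append(scene)
            indegree[scene] += 1
    ready = [s for s, d in indegree.items() if d == 0]
    earliest = {s: 0 for s in durations}
    order = []
    while ready:
        s = ready.pop()
        order.append(s)
        finish = earliest[s] + durations[s]
        for c in children[s]:
            earliest[c] = max(earliest[c], finish)  # wait for slowest parent
            indegree[c] -= 1
            if indegree[c] == 0:
                ready.append(c)
    makespan = max(earliest[s] + durations[s] for s in order)
    return earliest, makespan

# Hypothetical 4-scene film: s1 and s2 both follow s0 and run in parallel;
# s3 needs both. Serial time would be 2 + 5 + 3 + 4 = 14 units.
durations = {"s0": 2, "s1": 5, "s2": 3, "s3": 4}
deps = {"s0": [], "s1": ["s0"], "s2": ["s0"], "s3": ["s1", "s2"]}
earliest, makespan = critical_path_schedule(durations, deps)  # makespan: 11
```

With unlimited workers the makespan is the critical path s0 → s1 → s3 (2 + 5 + 4 = 11), not the serial sum of 14 — which is why the speedup scales with how independent the scenes are.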

Found Mar 27, 2026 at 20 stars.
Language: Python

AI Summary

Forge automates converting text stories into polished video films by breaking them into scenes, generating clips with various AI video tools in parallel while maintaining visual continuity, and assembling everything into one seamless MP4.

How It Works

1
🔍 Discover Forge

You stumble upon Forge, a fun tool that magically turns your written stories into complete movies without any hassle.

2
📥 Set it up

Download and prepare it on your computer – it's quick and easy, like installing a simple app.

3
✏️ Write your story

Type in a short tale about a detective adventure or whatever sparks your imagination.

4
🎬 Plan the scenes

Choose the number of scenes, and watch it smartly outline your story into a flowing movie blueprint.

5
🚀 Generate the film

Hit start, and it creates video clips for each scene at the same time, blending colors smoothly for perfect flow.

6

Watch your movie

Get a ready-to-play final video file that feels like a real short film you can share with friends.
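The steps above amount to dependency-aware parallel rendering: a scene's clip is generated as soon as every scene it depends on is done. A minimal sketch with Python's `concurrent.futures`; the `render_scenes` helper, the `render` callback, and the scene names are all illustrative, not the repo's actual API.

```python
from concurrent.futures import FIRST_COMPLETED, ThreadPoolExecutor, wait

def render_scenes(scenes, deps, render, workers=4):
    """Run render(scene) for every scene, starting a scene only
    once all of its dependencies have finished."""
    done, futures = {}, {}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while len(done) < len(scenes):
            # submit every scene whose dependencies are all complete
            for s in scenes:
                if s not in done and s not in futures and all(
                        p in done for p in deps.get(s, ())):
                    futures[s] = pool.submit(render, s)
            # block until at least one in-flight scene finishes
            finished, _ = wait(futures.values(), return_when=FIRST_COMPLETED)
            for s in [k for k, f in futures.items() if f in finished]:
                done[s] = futures.pop(s).result()
    return done

order = []  # records the order scenes actually start rendering

def render(scene):
    order.append(scene)
    return f"{scene}.mp4"

clips = render_scenes(
    ["s0", "s1", "s2", "s3"],
    {"s1": ["s0"], "s2": ["s0"], "s3": ["s1", "s2"]},
    render,
)
```

Independent scenes (here s1 and s2) end up in flight at the same time, while the dependency order is still respected.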

AI-Generated Review

What is forge-film?

Forge-film turns text stories into AI-generated short films using Python. You feed it a script; it compiles the scenes into a DAG of dependencies, routes dialogue scenes to Kling and landscape scenes to a local CogVideoX, generates clips in parallel under CPM scheduling, matches colors across models, and outputs `final.mp4`, driven either from the CLI (`forge run story.txt --workers 4`) or a web UI. It addresses the pain of serial logins to multiple video APIs, manual stitching, and jarring cuts between clips from different models.
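The cross-model color matching could be as simple as per-channel mean/std transfer; whether forge-film uses this exact method is not stated, so treat `match_color` and the pixel data below as an illustrative sketch only.

```python
import statistics

def match_color(clip, reference):
    """Shift and scale each RGB channel of `clip` so its mean and
    standard deviation match `reference` -- a crude color grade that
    reduces jarring tone shifts between clips from different models."""
    matched_channels = []
    for ch in range(3):
        src = [px[ch] for px in clip]
        ref = [px[ch] for px in reference]
        s_mu, s_sd = statistics.fmean(src), statistics.pstdev(src) or 1.0
        r_mu, r_sd = statistics.fmean(ref), statistics.pstdev(ref) or 1.0
        matched_channels.append(
            [(v - s_mu) * r_sd / s_sd + r_mu for v in src])
    return list(zip(*matched_channels))  # back to per-pixel RGB tuples

clip = [(10, 20, 30), (30, 40, 50)]            # flat list of RGB pixels
reference = [(100, 110, 120), (120, 130, 140)]  # frame from another model
graded = match_color(clip, reference)
```

Real pipelines would apply this per frame (or use histogram matching in a perceptual color space), but the principle is the same: normalize each clip's statistics toward a shared reference.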

Why is it gaining traction?

Multi-model mixing with pluggable backends beats single-model tools like Seedance, while CPM-driven parallelism delivers real speedup on independent scenes: a 30-minute serial job drops to about 20 minutes. MIT-licensed open source, local GPU options, and a mock mode appeal to developers prototyping multimodal or multi-model video pipelines. Configuration lives in `forge.yaml`, keeping things flexible without lock-in.
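The `forge.yaml` mentioned above might look something like the following sketch; every key name here is an assumption, since the repo's actual schema isn't shown.

```yaml
# Hypothetical forge.yaml -- keys are illustrative, not the repo's schema
workers: 4                 # parallel render slots
backends:
  dialogue: kling          # cloud API for character/dialogue scenes
  landscape: cogvideox     # free local GPU model for landscapes
color_match: true          # blend tones across backends
output: final.mp4
```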

Who should use this?

AI creators generating short films and trailers from prompts. Indie developers automating multi-model video generation for prototypes. Filmmakers blending cloud APIs like Kling with free local models instead of managing scene dependencies by hand.

Verdict

Promising alpha for DAG-driven multi-model video generation (20 stars, 1.0% credibility score), with a strong README, benchmarks, and 20 tests, but incomplete backends limit production use. Fork it for custom CPM orchestration; track it as it matures through 2025.

