AlexMasson

Let an LLM choose your Radarr/Sonarr releases instead of tweaking Custom Formats forever. Webhook proxy that intercepts grabs, queries available releases, and asks an AI to pick the best one based on your natural language criteria.

Found Feb 11, 2026 at 13 stars; 17 stars at time of review.
Python
AI Summary

This tool uses AI to pick the best movie and TV releases for automatic downloading in custom media organizers, based on simple plain-language instructions you provide.

How It Works

1. 🔍 Find the smart media picker

While managing your movie and TV collections with your favorite organizers, you discover a helpful tool that lets AI choose the best downloads for you.

2. 📦 Get it running easily

Grab the ready-to-go package and follow the short setup guide to start the background helper.

3. 🔗 Link your collection managers

Connect it to your movie organizer and TV show organizer so they can share download options.

4. 🤖 Add an AI thinking partner

Hook up a smart AI service to make decisions based on what you like.

5. ✏️ Write your wish list

Jot down everyday instructions like 'choose high quality with lots of sharers, skip tiny files' to describe the video quality you prefer.

6. ⚙️ Turn on AI assistance

In your collection apps, flip the switch to let the AI suggest picks before grabbing files.

Perfect downloads on autopilot

Now your movies and shows download the best versions automatically, with notes when the AI makes a smarter choice.
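The "wish list" in step 5 is free-form English. A hypothetical per-profile prompt (the wording and rules below are illustrative guesses, not the repo's actual examples) might read:

```text
Prefer 2160p HDR releases under 30 GB with at least 10 seeders.
Fall back to 1080p x265 if no good 4K release exists.
Never pick YIFY or any release under 1 GB.
For TV, prefer full-season packs over single episodes.
```

Because the criteria are plain language rather than numeric scores, edge cases ("fall back to...", "never pick...") read the way you would explain them to a person.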

AI-Generated Review

What is arr-llm-release-picker?

This Python webhook proxy lets an LLM pick the best Radarr or Sonarr releases based on your natural-language criteria, like "prefer 4K HDR with good seeders under 25GB, skip YIFY." It intercepts pre-grab decisions from customized Radarr/Sonarr forks via Download Decision Override webhooks, queries an OpenAI-compatible LLM (local or cloud), and overrides the default pick when the LLM finds a better release. Ditch endless Custom Format tweaks and let the LLM handle selection, with fail-safes that fall back to the arr defaults on timeout or error.
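The override flow described above can be sketched as one small function: build a prompt from the candidate releases, ask the LLM for an index, and fall back to the arr's own pick on any error or timeout. This is a minimal sketch of the idea; every name below is illustrative, not the repo's actual API.

```python
def pick_release(releases, criteria, ask_llm, default_index=0):
    """Ask an LLM to choose among candidate releases.

    On any failure (timeout, non-numeric reply, out-of-range index) the
    function returns default_index, mirroring the fail-safe behavior of
    falling back to the arr's default decision.
    """
    prompt = (
        f"Criteria: {criteria}\n"
        "Candidates:\n"
        + "\n".join(
            f"{i}: {r['title']} ({r['size_gb']} GB, {r['seeders']} seeders)"
            for i, r in enumerate(releases)
        )
        + "\nReply with only the index of the best release."
    )
    try:
        reply = ask_llm(prompt)  # e.g. an OpenAI-compatible chat call
        index = int(reply.strip())
        if 0 <= index < len(releases):
            return index
    except Exception:
        pass  # timeout, malformed reply, connection error...
    return default_index  # fail-safe: keep the arr's own pick

releases = [
    {"title": "Movie.2160p.HDR.x265", "size_gb": 18, "seeders": 40},
    {"title": "Movie.1080p.YIFY", "size_gb": 2, "seeders": 300},
]
# A stub "LLM" that always answers "0" stands in for the real API call.
choice = pick_release(releases, "prefer 4K HDR, skip YIFY", lambda p: "0")
```

The real proxy wraps logic like this in a webhook handler, but the fallback-on-error shape is the part that makes it safe to put in the grab path.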

Why is it gaining traction?

No more rigid scoring rules: per-profile prompts give TV packs and movies tailored logic written in plain language instead of stacked Custom Format scores. It's Docker-ready, with /test and /simulate endpoints for dry runs, ntfy notifications on overrides, and skip tags for exceptions, plus hot-reloaded prompts that apply without a restart. It works with any OpenAI-compatible LLM API, so you can pair it with a LiteLLM proxy or local Ollama for cheap automation.

Who should use this?

Self-hosted media hoarders running Radarr/Sonarr who hate fiddling with quality profiles and custom formats; automation devs building *arr stacks who want LLM smarts in release selection; or tinkerers curious to watch an LLM make real grab decisions. Ideal if you're already running the required forks and comfortable scripting your automation flows.

Verdict

Early alpha with 13 stars and 1.0% credibility: solid README and Docker setup, but unproven at scale, with no tests visible. Grab it for *arr experiments if you're comfortable running the forks; otherwise, wait for more traction.


