shorwood / slopc

A proc macro that uses a hallucination machine to write your function bodies at compile time.

201 stars · 5 · 69% credibility
Found Apr 05, 2026 at 163 stars
AI Summary

A Rust tool that uses AI to automatically generate and refine function code during the build process based on signatures, comments, and tests.

How It Works

1. 📰 Discover slopc: You hear about a playful tool that fills in the missing parts of your Rust code with AI-generated implementations.

2. 📦 Add to your project: You add the crate to your workspace so it's ready to use.

3. 🧠 Connect a smart helper: You link an LLM service, the "AI brain", so it can generate ideas for your code.

4. ✨ Tag empty functions: You mark unfinished functions with a special attribute asking the helper to fill them in.

5. 🔮 Build and generate: When you build the project, the magic happens: the AI writes working code for each tagged function.

6. 🔄 Auto-fix if needed: If the first attempt isn't perfect, the tool learns from the errors and retries until it works.

7. ✅ Enjoy working code: Your project builds with every function complete, saving you plenty of writing time.
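The workflow above can be sketched in code. The `#[slop]` attribute and `todo!()` stub follow the crate's description; the generated body below is a hand-written guess at what the LLM might produce for a Levenshtein stub (so the example runs standalone), not actual slopc output.

```rust
// What you'd write, per the steps above (requires the slopc crate
// and an API key, so it is shown here only as a comment):
//
//     #[slop]
//     /// Levenshtein edit distance between two strings.
//     fn levenshtein(a: &str, b: &str) -> usize { todo!() }
//
// A plausible body the LLM might generate at build time:
fn levenshtein(a: &str, b: &str) -> usize {
    let (a, b): (Vec<char>, Vec<char>) = (a.chars().collect(), b.chars().collect());
    // Row-by-row dynamic programming over the edit-distance matrix.
    let mut prev: Vec<usize> = (0..=b.len()).collect();
    for (i, &ca) in a.iter().enumerate() {
        let mut curr = vec![i + 1];
        for (j, &cb) in b.iter().enumerate() {
            let cost = if ca == cb { 0 } else { 1 };
            // min of substitution, deletion, insertion.
            curr.push((prev[j] + cost).min(prev[j + 1] + 1).min(curr[j] + 1));
        }
        prev = curr;
    }
    prev[b.len()]
}

fn main() {
    assert_eq!(levenshtein("kitten", "sitting"), 3);
    println!("ok");
}
```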

Star Growth

The repo grew from 163 stars at discovery to 201 stars.
AI-Generated Review

What is slopc?

slopc is a Rust proc macro crate that generates function bodies at compile time using an LLM, self-described as a "hallucination machine." Slap `#[slop]` on a stub function containing `todo!()`, provide an API key for OpenRouter or OpenAI, and it crafts an implementation from your signature, doc comments, and dependencies, retrying on parse, compile, or doctest failures up to five times. Results are cached in `target/slop-cache/` to avoid burning tokens on rebuilds, and behavior is configurable via attributes, `slop.toml`, or env vars like `SLOP_MODEL=gpt-4o-mini`.
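The configuration surfaces mentioned above might look like this. Only the `SLOP_MODEL=gpt-4o-mini` env var and the `target/slop-cache/` location are confirmed by the description; the `slop.toml` key names are illustrative assumptions, not documented options.

```toml
# Hypothetical slop.toml -- key names are guesses for illustration.
model = "gpt-4o-mini"            # mirrors the confirmed SLOP_MODEL env var
max_retries = 5                  # the review mentions up to five attempts
cache_dir = "target/slop-cache"  # confirmed cache location
```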

Why is it gaining traction?

It stands out in the Rust proc macro ecosystem, beyond proc_macro2 or ordinary attribute-macro helpers, by outsourcing codegen to an LLM with built-in verification loops: generated bodies are parsed, compiled, and doctested, and failures are fed back into retries. Devs dig the zero-effort prototypes for tasks like Levenshtein distance or byte humanizers, plus hints and doctest enforcement for more reliable output. The satirical README hooks curiosity.
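The verification loop described here, generate, check, feed the error back, retry up to five times, follows a common pattern that can be sketched generically. This is an illustration of the pattern under that assumption, not slopc's actual internals:

```rust
/// Generic retry-with-feedback loop, the shape the review attributes
/// to slopc: each failed attempt's error is handed to the next try.
fn retry_with_feedback<T, E: std::fmt::Display>(
    max_attempts: usize,
    mut attempt: impl FnMut(Option<&str>) -> Result<T, E>,
) -> Result<T, String> {
    let mut last_error: Option<String> = None;
    for _ in 0..max_attempts {
        match attempt(last_error.as_deref()) {
            Ok(value) => return Ok(value),
            Err(e) => last_error = Some(e.to_string()),
        }
    }
    Err(last_error.unwrap_or_else(|| "no attempts made".into()))
}

fn main() {
    // Toy "generator" standing in for the LLM: fails twice, then succeeds.
    let mut tries = 0;
    let result = retry_with_feedback(5, |feedback| {
        tries += 1;
        if tries < 3 {
            Err(format!("compile error on try {tries} (feedback: {feedback:?})"))
        } else {
            Ok("fn body { /* ... */ }")
        }
    });
    assert_eq!(result.unwrap(), "fn body { /* ... */ }");
    println!("succeeded after {tries} tries");
}
```

The key design point is that feedback flows forward: a real implementation would splice the compiler or doctest error into the next prompt rather than retrying blindly.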

Who should use this?

Rust prototypers hacking quick algorithms like expression parsers or edit distances, where doc tests can guide the LLM. Also AI-curious backend devs testing LLM codegen in compile-time pipelines, or hobbyists who simply enjoy the absurdity. Skip it if you're shipping production code that needs audits.

Verdict

Fun novelty for Rust tinkerers (201 stars signals early buzz), but the 69% credibility score and AGPL license scream experiment-only: the docs are punchy, but no tests are exposed. Worth a spin for throwaway prototypes, but write your own code for anything that matters.


