voidmind-io / voidllm

Public

The privacy-first LLM proxy for teams. Org/team/key hierarchy, usage tracking, rate limiting — zero knowledge of your prompts.

15 stars · 2 · 100% credibility
Found Mar 25, 2026 at 15 stars.
AI Analysis (Go)
AI Summary

VoidLLM is a self-hosted service that proxies requests to AI providers for teams, enforcing access controls, rate limits, token budgets, and usage tracking while ensuring prompt privacy.
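
The controls the summary describes can be sketched in Go. The type and numbers below are hypothetical illustrations of a per-key token budget, not VoidLLM's actual implementation:

```go
package main

import "fmt"

// keyBudget is a hypothetical per-key token budget, illustrating the kind
// of enforcement the summary describes; VoidLLM's real types are not shown
// in this listing.
type keyBudget struct {
	limit int // max tokens the key may consume
	used  int // tokens consumed so far
}

// spend records usage and reports whether the request is allowed.
func (b *keyBudget) spend(tokens int) bool {
	if b.used+tokens > b.limit {
		return false // over budget: the proxy would reject the request
	}
	b.used += tokens
	return true
}

func main() {
	b := keyBudget{limit: 1000}
	fmt.Println(b.spend(800)) // true: within the 1000-token limit
	fmt.Println(b.spend(300)) // false: would exceed the limit
}
```

Because only token counts are recorded, tracking like this needs no access to prompt contents, which is the zero-knowledge claim in a nutshell.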

How It Works

1
🏢 Discover VoidLLM

You hear about a simple tool that lets teams safely share and track AI conversations without spying on private chats.

2
📥 Grab and start it

Download the ready-to-run package and launch it on your computer or server in moments.

3
🖥️ Open your control center

Visit the welcoming web dashboard to set up your organization, teams, and secure sharing rules.

4
🔗 Link AI helpers

Point it at your preferred AI providers so it can route requests and record usage.

5
👥 Share access safely

Create private API keys for team members or apps, with limits on requests and spending.

6
📈 Track everything

Watch live charts of team activity, costs, and trends, all while keeping messages private.

🎉 Team AI unlocked

Your group collaborates with powerful AI securely, with full oversight and no surprises.

AI-Generated Review

What is voidllm?

VoidLLM is a privacy-first LLM proxy built in Go that sits between your apps and providers like OpenAI, Anthropic, or Ollama. It enforces org/team/key hierarchy for access control, tracks usage and costs without ever storing prompts—true zero-knowledge architecture—and applies rate limiting and token budgets at every level. Teams get a full web UI for dashboards, playground, and key management, plus OpenAI-compatible endpoints that work with any SDK.

Why is it gaining traction?

Unlike proxies that log your data, VoidLLM keeps prompts in memory only, giving privacy-first LLM access with GDPR compliance out of the box. Model aliases let you swap providers without changing client code, while real-time tracking spots budget overruns instantly. Sub-2ms overhead and one-binary Docker/Helm deploys make it dead simple for self-hosted teams.
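
The alias idea reduces to a lookup on the proxy side. The alias names and upstream models below are hypothetical, not VoidLLM's shipped defaults:

```go
package main

import "fmt"

// aliases maps a stable client-facing name to a provider-specific model,
// which is what lets you swap providers without touching client code.
// These entries are illustrative assumptions.
var aliases = map[string]string{
	"chat-default": "openai/gpt-4o-mini",
	"chat-local":   "ollama/llama3",
}

// resolve returns the upstream model for a requested name, passing
// unknown names through unchanged.
func resolve(model string) string {
	if target, ok := aliases[model]; ok {
		return target
	}
	return model
}

func main() {
	fmt.Println(resolve("chat-default")) // openai/gpt-4o-mini
	// Re-pointing "chat-default" at another provider is a proxy-side
	// config change; clients keep sending the same model name.
}
```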

Who should use this?

Dev leads at startups sharing API keys across teams, engineering managers tracking LLM spend per project, or security-conscious orgs routing traffic through Ollama/vLLM with fine-grained limits. Ideal for mid-sized teams (10-100 devs) evaluating multi-provider setups without vendor lock-in.

Verdict

Grab it if you need a lightweight, privacy-first proxy today: docs and Docker Compose are production-ready despite the repo's 15 stars. Still early, though; watch for broader adoption before trusting it at enterprise scale.


