larsderidder/contextio

Transparent proxy between AI tools and LLM APIs. Logging, redaction, and capture.

Found Feb 23, 2026 at 14 stars.
AI Analysis
TypeScript
AI Summary

A local tool that intercepts, logs, and optionally anonymizes data sent from AI coding assistants to language model services.

How It Works

1
🔍 Identify the privacy gap

AI coding assistants send prompts, files, and project context to hosted LLM APIs. Contextio lets you see, and control, exactly what leaves your machine.

2
📦 Install the CLI

Grab the command-line tool with a single global npm install.

3
🛡️ Start the proxy

Launch it and point it at your AI tool with one command; the tool starts up with its API traffic redirected through the local proxy.

4
💬 Work as usual

Your AI tool behaves exactly as before, but every request and response now passes through the proxy first.

5
📊 Review the captures

Check the saved logs to see exactly what was shared, spot any sensitive data, and confirm your redaction rules are doing their job.
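The wrapping in step 3 boils down to launching the tool with its API base URL pointed at the local proxy. A minimal sketch of that idea in TypeScript; the function name and the Gemini variable are illustrative assumptions, not contextio's actual API (`ANTHROPIC_BASE_URL` and `OPENAI_BASE_URL` are the overrides commonly honored by those vendors' SDKs):

```typescript
// Sketch: how a wrapper like `ctxio proxy -- claude` might redirect an AI
// tool's API traffic through a local proxy via environment overrides.
// Names below are illustrative, not contextio's internals.
function proxyEnv(
  proxyUrl: string,
  base: Record<string, string>
): Record<string, string> {
  return {
    ...base,
    // Point each vendor SDK's base URL at the local intercepting proxy.
    ANTHROPIC_BASE_URL: proxyUrl,
    OPENAI_BASE_URL: proxyUrl,
    GOOGLE_GEMINI_BASE_URL: proxyUrl, // hypothetical name for Gemini
  };
}

// The wrapped tool would then be spawned with this environment, e.g.:
// spawn("claude", [], { env: proxyEnv("http://127.0.0.1:8080", process.env) })
const env = proxyEnv("http://127.0.0.1:8080", { PATH: "/usr/bin" });
```

Because only the environment changes, the wrapped tool needs no patching, which is what makes the proxy "transparent."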

Total peace of mind

You keep getting full value from your AI tools, now with a local record of everything shared and the option to redact sensitive data before it ever reaches the API.


Star Growth

From 14 stars at discovery to 16 today.
AI-Generated Review

What is contextio?

Contextio is a TypeScript transparent proxy that sits between AI coding tools like Claude CLI or Aider and LLM APIs from Anthropic, OpenAI, or Gemini. It logs full requests and responses to disk, with optional redaction of PII, emails, or secrets before data hits the API; reversible mode even restores the originals in streaming responses. Install the global CLI (`npm i -g @contextio/cli`) and run `ctxio proxy --redact -- claude` to wrap tools via env vars, or use Docker for deployment.
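The reversible redaction described above amounts to swapping sensitive spans for placeholders on the way out and mapping them back on the way in. A minimal sketch for emails only; the placeholder format and regex are illustrative assumptions, not contextio's actual policy engine:

```typescript
// Sketch of reversible redaction: replace emails with placeholders before
// the request leaves the machine, remember the mapping, and restore the
// originals in the response. Format and policy are illustrative only.
const EMAIL_RE = /[\w.+-]+@[\w-]+\.[\w.]+/g;

function redact(text: string): { redacted: string; map: Map<string, string> } {
  const map = new Map<string, string>();
  let i = 0;
  const redacted = text.replace(EMAIL_RE, (match) => {
    const placeholder = `<EMAIL_${i++}>`;
    map.set(placeholder, match);
    return placeholder;
  });
  return { redacted, map };
}

function restore(text: string, map: Map<string, string>): string {
  let out = text;
  for (const [placeholder, original] of map) {
    out = out.split(placeholder).join(original);
  }
  return out;
}

const { redacted, map } = redact("Contact alice@example.com about the bug.");
// redacted === "Contact <EMAIL_0> about the bug."
// restore(redacted, map) round-trips back to the original text.
```

Keeping the map local is the key property: the API only ever sees placeholders, while the user's view of the response is seamlessly un-redacted.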

Why is it gaining traction?

Unlike basic mitmproxy setups, it auto-handles base URL overrides and chains mitmproxy for stubborn tools, delivering live `ctxio monitor` views with token counts, latency, and costs. Reversible redaction and session replay (`ctxio replay capture.json`) let you audit prompts without breaking workflows, all in a zero-dependency proxy that's easy to self-host. Devs love the inspect command for exposing system prompts and overhead.
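A monitor view like the one described above is essentially an aggregation over captured exchanges. A minimal sketch, where the `Entry` shape is a hypothetical stand-in for contextio's actual capture format:

```typescript
// Sketch of a monitor-style summary over captured request/response pairs.
// The Entry shape is a hypothetical stand-in for contextio's capture format.
interface Entry {
  model: string;
  inputTokens: number;
  outputTokens: number;
  latencyMs: number;
}

function summarize(entries: Entry[]) {
  const totalTokens = entries.reduce(
    (n, e) => n + e.inputTokens + e.outputTokens,
    0
  );
  const avgLatencyMs = entries.length
    ? entries.reduce((n, e) => n + e.latencyMs, 0) / entries.length
    : 0;
  return { requests: entries.length, totalTokens, avgLatencyMs };
}

const stats = summarize([
  { model: "claude", inputTokens: 900, outputTokens: 100, latencyMs: 400 },
  { model: "claude", inputTokens: 500, outputTokens: 500, latencyMs: 600 },
]);
// stats => { requests: 2, totalTokens: 2000, avgLatencyMs: 500 }
```

The same log entries would also feed replay: re-reading captures from disk is what makes `ctxio replay capture.json` possible without touching the live tool.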

Who should use this?

AI-assisted coders relying on CLIs like Claude, Gemini, Aider, or even Copilot who need visibility into what's sent to LLMs; prompt debuggers chasing token waste; and security-conscious teams blocking PII leakage in local dev before production deploys.

Verdict

Worth an npm install for local AI tool observability: the CLI shines, and the docs cover Docker and custom policies. At 14 stars it's pre-mainstream; the TypeScript is solid but unproven at scale, so vet or fork before depending on it.


