parallax-labs

Local-first context ingestion and retrieval for AI tools. SQLite + embeddings + MCP server for Cursor & Claude.

28 stars · 100% credibility
Found Feb 27, 2026 at 24 stars
AI Analysis
Language: Rust
AI Summary

Context Harness is a local-first framework that ingests documents from filesystems, Git repositories, S3 buckets, and custom scripts into a searchable SQLite database with optional embeddings, exposed via CLI and an MCP-compatible HTTP server for AI tools like Cursor and Claude.

How It Works

1
🔍 Discover a smarter way for AI

You find a tool that lets AI assistants pull answers straight from your own files and repos instead of guessing.

2
📦 Get it set up fast

Download a prebuilt binary and run it locally; it works out of the box with no build step.

3
📁 Point to your knowledge

Tell it where your sources live: filesystem directories, Git repositories, or S3 buckets.

4
🔄 Pull everything together

A single sync command ingests everything into a private, searchable SQLite database on your machine.

5
🧠 Unlock deep understanding

Optional embeddings enable hybrid search that matches on both keywords and meaning.

6
🔌 Connect to your AI helper

Hook up the MCP server so tools like Cursor or Claude can query your indexed context directly.

AI gives spot-on answers

Your AI now pulls precise information from your own knowledge every time, instead of relying on vague guesses or pasted snippets.
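The ingest-then-search flow above can be sketched in a few lines. This is an illustrative toy, not context-harness's actual code or schema: the function names, the character-window chunker, and the plain `LIKE` lookup are assumptions standing in for the real Rust pipeline.

```python
import sqlite3

def chunk(text, size=200, overlap=40):
    """Split text into overlapping character windows (toy chunker)."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

def build_index(docs):
    """Ingest {path: text} into a local SQLite index (in-memory here)."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE chunks (path TEXT, body TEXT)")
    for path, text in docs.items():
        db.executemany("INSERT INTO chunks VALUES (?, ?)",
                       [(path, c) for c in chunk(text)])
    return db

def search(db, term):
    """Plain keyword lookup; the real tool layers embeddings on top."""
    cur = db.execute("SELECT path, body FROM chunks WHERE body LIKE ?",
                     (f"%{term}%",))
    return cur.fetchall()
```

The point of the sketch: everything lives in one local database file, so search never leaves your machine.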

AI-Generated Review

What is context-harness?

Context Harness is a Rust-built, local-first tool for ingesting docs from files, Git repos, S3 buckets, or Lua scripts into a SQLite database with optional embeddings. It chunks text, supports hybrid keyword+semantic retrieval via CLI (`ctx search`), and runs an MCP server so Cursor and Claude can query your context without cloud dependencies. Think offline RAG for your repos and runbooks.
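Hybrid keyword+semantic retrieval generally blends a lexical score with vector similarity. A minimal sketch of the idea, where the toy embeddings, the term-overlap score, and the `alpha` weighting are all illustrative assumptions rather than the project's actual ranking:

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def keyword_score(query, text):
    """Fraction of query terms present in the chunk."""
    terms = query.lower().split()
    hits = sum(1 for t in terms if t in text.lower())
    return hits / len(terms) if terms else 0.0

def hybrid_rank(query, query_vec, chunks, alpha=0.5):
    """chunks: list of (text, embedding). Blend lexical and vector scores."""
    scored = [
        (alpha * keyword_score(query, text)
         + (1 - alpha) * cosine(query_vec, vec), text)
        for text, vec in chunks
    ]
    return [text for score, text in sorted(scored, reverse=True)]
```

Blending the two signals is what lets a query match exact identifiers (keyword side) while still surfacing paraphrased docs (semantic side).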

Why is it gaining traction?

It stands out with zero-setup binaries across platforms, incremental syncs that checkpoint changes, and connectors extensible via simple Lua scripts with no recompilation needed. The MCP server plugs straight into Cursor and Claude for instant context retrieval, and local embeddings (fastembed or Ollama) keep everything offline and cheap. The CLI-first flow is as simple as `ctx sync all && ctx serve mcp`.
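Incremental sync of this kind typically works by storing a fingerprint per source and re-ingesting only what changed. A hand-wavy sketch of that checkpointing idea, with hypothetical names and a content hash standing in for whatever the project actually records:

```python
import hashlib

def fingerprint(text):
    """Content hash used as a change-detection checkpoint."""
    return hashlib.sha256(text.encode()).hexdigest()

def incremental_sync(docs, checkpoints):
    """Return paths whose content changed since the stored checkpoint,
    updating checkpoints in place. docs: {path: text}."""
    changed = []
    for path, text in docs.items():
        fp = fingerprint(text)
        if checkpoints.get(path) != fp:
            changed.append(path)
            checkpoints[path] = fp
    return changed
```

On a second run with unchanged inputs the changed list is empty, which is what makes repeated `sync` calls cheap.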

Who should use this?

Backend engineers managing multi-repo docs or S3 runbooks who want AI tools like Cursor and Claude grounded in private knowledge; teams replacing vendor-locked context-ingestion services with a local-first alternative that can run in production via Docker. Ideal if you're tired of copy-pasting docs into AI chats.

Verdict

Grab it if you need reliable local context for Claude and Cursor: the docs are polished, Nix flakes are ready, and Rust keeps it fast. At 28 stars it's early but stable enough for daily use; star it and contribute connectors to help it mature.
