praneybehl/llm-wiki-plugin

Andrej Karpathy's LLM Wiki pattern as a Claude Code plugin — turn accumulated sources into a self-maintaining, scalable markdown knowledge base.

10 stars · 0 forks · Python
Found Apr 16, 2026.
AI Summary

A plugin for AI coding assistants that enables building and maintaining a personal markdown-based wiki from documents, with AI handling ingestion, querying, and upkeep.

How It Works

1. 🔍 Discover the wiki helper

While looking for smart ways to organize your research notes and articles with AI, you stumble upon this plugin for building a personal knowledge collection.

2. 📥 Add it to your project

You simply tell your AI assistant to install the plugin, and it becomes available right in your workspace.

3. 🏗️ Create your wiki space

With one quick command, you set up neat folders for your original documents and the AI-managed wiki pages.

4. 📄 Add your documents

Drop PDFs, articles, transcripts, or notes into the raw folder to start building your knowledge base.

5. 🤖 AI curates everything

Ask your AI to process the new document, and it creates linked summary pages, updates cross-references, and keeps the wiki's structure consistent as it grows.

6. Ask and explore

Pose any question to your wiki, and get clear answers with direct links to the relevant pages.

7. Keep it healthy

Run simple checks now and then to spot issues and let the AI suggest tidy-ups, ensuring it stays fresh.

🎉 Your knowledge thrives

Over weeks, your personal wiki becomes a powerful, self-maintaining hub of insights that answers questions effortlessly.
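The setup in steps 2–3 can be sketched as a tiny script. The `raw` folder name comes from this page; the `wiki` folder name and the `index.md` seed file are hypothetical, since the plugin's actual layout isn't documented here.

```python
from pathlib import Path

def init_wiki(root="."):
    """Create a two-folder workspace: raw sources plus AI-managed pages.

    Folder names are illustrative; check the plugin's docs for the real layout.
    """
    base = Path(root)
    (base / "raw").mkdir(parents=True, exist_ok=True)  # drop PDFs, notes, transcripts here
    wiki = base / "wiki"                               # hypothetical name for the managed pages
    wiki.mkdir(parents=True, exist_ok=True)
    index = wiki / "index.md"                          # hypothetical entry page
    if not index.exists():
        index.write_text("# Wiki Index\n", encoding="utf-8")
    return base
```

In the real workflow the AI assistant runs this kind of setup for you via a slash command; the sketch just makes the resulting structure concrete.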

AI-Generated Review

What is llm-wiki-plugin?

This Python plugin for Claude Code brings Andrej Karpathy's LLM Wiki pattern from his blog and GitHub gist to life, transforming accumulated sources like papers, transcripts, and notes into a self-maintaining markdown knowledge base. Drop files into a raw folder, run slash commands like /wiki:ingest to compile them into structured pages for entities, concepts, and sources, then query with /wiki:query for cited answers. It auto-updates links and summaries, scaling to thousands of pages without bloating agent context, and bundles scripts for BM25 search, linting, and stats.
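The bundled BM25 search script isn't shown on this page, but classic Okapi BM25 scoring over a set of markdown pages can be sketched in a few lines. The `k1` and `b` values are the common defaults, and the whitespace/word tokenizer is a naive assumption, not the plugin's actual implementation.

```python
import math
import re
from collections import Counter

def bm25_scores(query, docs, k1=1.5, b=0.75):
    """Score each document against the query with Okapi BM25.

    Minimal sketch: lowercase word tokens, no stemming or stopwords.
    """
    tokenized = [re.findall(r"\w+", d.lower()) for d in docs]
    N = len(tokenized)
    avgdl = sum(len(t) for t in tokenized) / N
    # Document frequency: in how many docs each term appears.
    df = Counter(term for toks in tokenized for term in set(toks))
    scores = []
    for toks in tokenized:
        tf = Counter(toks)
        score = 0.0
        for term in re.findall(r"\w+", query.lower()):
            if term not in tf:
                continue
            idf = math.log((N - df[term] + 0.5) / (df[term] + 0.5) + 1)
            norm = tf[term] * (k1 + 1) / (
                tf[term] + k1 * (1 - b + b * len(toks) / avgdl)
            )
            score += idf * norm
        scores.append(score)
    return scores
```

Ranking pages this way keeps lookups cheap and local, which is what lets a plain-markdown wiki scale without re-reading every raw source per query.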

Why is it gaining traction?

Unlike basic RAG pipelines that reprocess raw documents on every query, this compounds knowledge persistently: ingest once, query fast, in the spirit of Karpathy's auto-research agents on GitHub. Slash commands and agent skills work across Claude Code, Cursor, Codex, and more, with Obsidian-friendly markdown and frontmatter for easy viewing. Devs like the low-maintenance design: lint catches issues, stats flag when the next scaling step is due, and weeks of research turn into a living wiki.
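As an illustration of what a wiki lint pass might catch, here is a hedged sketch: it flags pages without YAML frontmatter and `[[wiki-links]]` that resolve to no existing page. The actual script's checks are not documented on this page, so both rules are assumptions.

```python
import re
from pathlib import Path

# Captures the target of an Obsidian-style [[link]], stopping at ']', '|', or '#'.
WIKILINK = re.compile(r"\[\[([^\]|#]+)")

def lint_wiki(wiki_dir):
    """Return human-readable problems found in the wiki's markdown pages."""
    pages = {
        p.stem: p.read_text(encoding="utf-8")
        for p in Path(wiki_dir).glob("*.md")
    }
    problems = []
    for name, text in pages.items():
        if not text.startswith("---"):  # YAML frontmatter delimiter
            problems.append(f"{name}: missing frontmatter")
        for target in WIKILINK.findall(text):
            if target.strip() not in pages:
                problems.append(f"{name}: broken link -> {target.strip()}")
    return problems
```

A check like this is cheap enough to run after every ingest, which is what makes "zero-maintenance" plausible: problems surface immediately instead of accumulating.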

Who should use this?

AI researchers piling up papers or transcripts, in the style of Andrej Karpathy's GitHub projects like micrograd and nanoGPT. Devs doing vibe coding or building auto-research agents who want a CLAUDE.md-style knowledge base instead of scattered notes. Obsidian power users seeking LLM automation for accumulated textual domains, not relational data.

Verdict

Early days with 10 stars and a 1.0% credibility score, but solid docs and an MIT license make it worth forking for experiments. Worth trying if you follow Karpathy's agent-skill experiments and X posts; it scales well, but test script execution in non-Claude agents first.

