Pratiyush / llm-wiki

Public

LLM WIKI

AI Analysis
Python
AI Summary

llmwiki transforms transcripts from AI coding sessions into a searchable, interlinked static website knowledge base with AI-readable exports.

How It Works

1
👀 Discover llmwiki

You find this tool that turns your forgotten AI coding chats into a personal knowledge library.

2
📥 Set it up quickly

Download and run the simple setup to prepare your personal wiki space on your computer.

3
🔄 Gather your sessions

It automatically finds and pulls in your past AI assistant conversations from where they are saved.

4
🛠️ Build your wiki

With one command, it creates a beautiful, searchable website full of your organized knowledge, charts, and links.

5
🌐 Open and browse

Launch a local preview to explore your new wiki with search, heatmaps, and easy navigation.

🎉 Your knowledge unlocked

Now you have a private, AI-friendly library of all your coding insights, ready to query or share.
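The gather-and-build steps above boil down to the tool's sync command. As a rough illustration, the sync step might look something like the following sketch: the directory layout and the role/content record fields are assumptions for illustration, not llmwiki's actual schema.

```python
import json
from pathlib import Path

def sync_transcripts(source_dir: Path, wiki_dir: Path) -> int:
    """Pull JSONL session logs into the wiki as Markdown pages.

    Assumes one session per .jsonl file, one JSON record per line,
    each with 'role' and 'content' fields (illustrative only).
    """
    wiki_dir.mkdir(parents=True, exist_ok=True)
    imported = 0
    for log in sorted(source_dir.glob("*.jsonl")):
        turns = []
        for raw in log.read_text(encoding="utf-8").splitlines():
            if not raw.strip():
                continue  # skip blank lines between records
            record = json.loads(raw)
            turns.append(f"**{record.get('role', '?')}**: {record.get('content', '')}")
        # One Markdown page per session, named after the log file
        page = wiki_dir / f"{log.stem}.md"
        page.write_text(f"# Session {log.stem}\n\n" + "\n\n".join(turns) + "\n",
                        encoding="utf-8")
        imported += 1
    return imported
```

Re-running a sync written this way simply overwrites the same pages, which is one straightforward way to get the idempotent behavior the review notes below.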

AI-Generated Review

What is llm-wiki?

LLM Wiki turns dormant session transcripts from Claude Code, Cursor, Codex CLI, Gemini CLI, and Obsidian into a local, interlinked knowledge base following Andrej Karpathy's LLM Wiki pattern. It ingests JSONL logs via adapters, then builds a static, Python-generated site with search, heatmaps, token charts, and project-freshness badges, plus AI-readable exports: llms.txt, JSON-LD graphs, and per-page .txt/.json siblings. Two CLI commands (sync, then build/serve) yield a browsable wiki at localhost:8765.
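The llms.txt export mentioned above follows a simple convention: a Markdown title and a link list at the site root so agents can discover the site's contents. A minimal generator might look like this sketch (the page names and URLs are made up):

```python
from pathlib import Path

def build_llms_txt(site_dir: Path, title: str, pages: list[tuple[str, str]]) -> Path:
    """Write an llms.txt index at the site root: an H1 title followed
    by one Markdown link per page, per the llms.txt convention."""
    lines = [f"# {title}", ""]
    lines.extend(f"- [{name}]({url})" for name, url in pages)
    out = site_dir / "llms.txt"
    out.write_text("\n".join(lines) + "\n", encoding="utf-8")
    return out
```

An agent can then fetch this single file instead of crawling the whole site to learn what pages exist.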

Why is it gaining traction?

Minimal dependencies (stdlib plus markdown), no servers or databases, and MCP tools that let agents query your wiki directly set it apart from heavier LLM knowledge-base setups. A live demo on GitHub Pages shows the real features on dummy data, and E2E tests guard against UI regressions. The dual human/AI output formats can feed RAG pipelines or local models, which makes it stand out among LLM knowledge-management projects.
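The dual human/AI format idea can be sketched as writing machine-readable siblings next to each HTML page; the field names below are assumptions for illustration, not the project's actual export schema.

```python
import json
from pathlib import Path

def write_ai_siblings(html_page: Path, title: str, body_md: str) -> None:
    """Next to an HTML page, write a .txt sibling (plain Markdown) and
    a .json sibling (structured fields) so agents and RAG pipelines
    can read the same content without parsing HTML."""
    html_page.with_suffix(".txt").write_text(f"{title}\n\n{body_md}", encoding="utf-8")
    html_page.with_suffix(".json").write_text(
        json.dumps({"title": title, "markdown": body_md}, indent=2) + "\n",
        encoding="utf-8",
    )
```

Keeping the siblings at the same path with only the extension changed means an agent that knows a page's URL can derive the machine-readable URL without any lookup table.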

Who should use this?

Claude Code or Cursor users sitting on hundreds of unused transcripts, and AI developers tracking coding-assistant sessions across projects. It is well suited to personal knowledge bases on coding patterns, models, or law/tech topics without cloud lock-in.

Verdict

Worth cloning for Claude-heavy workflows: v0.9 delivers, with 420 passing tests and polished docs, even though the 18-star count signals an early-stage project. The MIT license and idempotent CLI make local integration experiments low-risk.
