kytmanov

Karpathy’s LLM Wiki, 100% local with Ollama. Drop Markdown notes → AI extracts concepts → your Obsidian wiki auto-links and grows. Zero cloud. Zero sharing. Your notes stay yours.

Found Apr 08, 2026 at 26 stars.
Python
AI Summary

This tool uses local AI to transform raw markdown notes into an interconnected, self-updating wiki that integrates with Obsidian for visualization and querying.

How It Works

1. 🔍 Hear about smart notes

You discover a way to turn your scattered notes into a growing, connected wiki that gets smarter with every addition.

2. 📦 Set it up simply

Install the helper tool on your computer in moments, no fuss.

3. 🧠 Add a local thinker

Download a free AI brain that runs right on your machine to understand your notes.

4. 📁 Make your wiki space

Create a special folder where your wiki will live and grow.

5. Drop in notes

Toss your raw notes into the folder and let the AI read them, pull out key ideas, and draft connected pages.

6. 👀 Check and approve

Review the AI's draft pages, tweak if needed, and say yes to publish them.

7. 🌐 See your knowledge web

Open your wiki in Obsidian to explore a beautiful graph of linked ideas, ask questions, and watch it expand forever.
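The ingest-and-compile flow above can be sketched in Python. This is a minimal illustration, not the repo's actual code: `extract_concepts` is a stand-in for the local LLM call (here just a capitalized-phrase heuristic), and `compile_draft`/`ingest` are hypothetical names that follow the raw/ and wiki/ layout described below.

```python
from pathlib import Path
import re
import tempfile

def extract_concepts(text: str) -> list[str]:
    """Stand-in for the local LLM: pick out capitalized multi-word phrases."""
    return sorted(set(re.findall(r"\b[A-Z][a-z]+(?: [A-Z][a-z]+)+\b", text)))

def compile_draft(raw_note: str) -> str:
    """Wrap each extracted concept in an Obsidian [[wikilink]]."""
    draft = raw_note
    for concept in extract_concepts(raw_note):
        draft = draft.replace(concept, f"[[{concept}]]")
    return draft

def ingest(vault: Path, note_name: str, text: str) -> Path:
    """Keep the raw note immutable in raw/, write a linked draft to wiki/."""
    (vault / "raw").mkdir(parents=True, exist_ok=True)
    (vault / "wiki").mkdir(parents=True, exist_ok=True)
    (vault / "raw" / note_name).write_text(text)
    draft_path = vault / "wiki" / note_name
    draft_path.write_text(compile_draft(text))
    return draft_path
```

For example, ingesting the note `"Notes on Neural Networks and Gradient Descent."` into a temporary vault produces a draft reading `Notes on [[Neural Networks]] and [[Gradient Descent]].`, with the untouched original preserved under raw/.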

AI-Generated Review

What is obsidian-llm-wiki-local?

This Python tool builds Karpathy's LLM Wiki vision 100% locally with Ollama: drop Markdown notes into a folder, and it extracts concepts, auto-links them, and grows your Obsidian vault into a self-maintaining knowledge base. Raw notes stay immutable in raw/, while AI compiles structured wiki articles with wikilinks, backlinks, and traceability—no cloud, no sharing required. Open the vault in Obsidian for graph views and queries out of the box.
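The backlink side of that pipeline can be illustrated in a few lines of Python. This is a hypothetical sketch of how wikilinks might be indexed into backlinks; the regex covers Obsidian's `[[Target]]` and `[[Target|alias]]` forms, and none of the names here are from the repo.

```python
import re
from collections import defaultdict

# Matches [[Target]] and [[Target|alias]]; captures only the target page name.
WIKILINK = re.compile(r"\[\[([^\]|]+)(?:\|[^\]]+)?\]\]")

def backlink_index(pages: dict[str, str]) -> dict[str, list[str]]:
    """Map each linked page name to the sorted list of pages that link to it."""
    index: dict[str, list[str]] = defaultdict(list)
    for name, body in pages.items():
        for target in WIKILINK.findall(body):
            index[target.strip()].append(name)
    return {target: sorted(set(sources)) for target, sources in index.items()}
```

Given `{"A": "See [[B]]", "C": "Also [[B|bee]] and [[A]]"}`, this returns `{"B": ["A", "C"], "A": ["C"]}`, which is the information Obsidian's backlinks pane and graph view are built from.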

Why is it gaining traction?

It stands out by keeping everything local via Ollama, skipping vector DBs or APIs for a simple pipeline: ingest notes, compile drafts for review, approve to publish, with olw watch auto-triggering on drops. Git commits every step with olw undo for safety, and olw lint/query add wiki health checks plus Q&A—all tuned for 7-14B models on consumer hardware. Developers dig the zero-telemetry purity and Obsidian-native output that compounds smarter over time.
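The auto-triggering behavior of `olw watch` can be approximated with a plain polling loop. This is a generic sketch under my own assumptions, not the repo's implementation: `poll_new_notes` and `watch` are hypothetical names, and a real tool might use filesystem events instead of polling.

```python
import time
from pathlib import Path
from typing import Callable

def poll_new_notes(raw_dir: Path, seen: set[str]) -> list[Path]:
    """One polling pass: return Markdown files not seen on earlier passes."""
    new = [p for p in sorted(raw_dir.glob("*.md")) if p.name not in seen]
    seen.update(p.name for p in new)
    return new

def watch(raw_dir: Path, handle: Callable[[Path], None],
          passes: int, interval: float = 0.5) -> None:
    """Minimal watch loop: call handle(path) for each newly dropped note."""
    seen: set[str] = set()
    for _ in range(passes):
        for note in poll_new_notes(raw_dir, seen):
            handle(note)  # e.g. kick off the ingest/compile step
        time.sleep(interval)
```

In the real tool, `handle` would presumably run the compile step and commit the result to Git, so every drop is both processed and undoable.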

Who should use this?

Obsidian power-users drowning in unlinked Markdown notes from web clips, meetings, or research. Python devs or researchers building private PKMs without cloud lock-in, especially those running Ollama who want AI to handle concept extraction and linking instead of manual backlinks.

Verdict

Worth a spin for local LLM wiki fans: excellent docs, a clean CLI (olw setup/init/ingest/compile), and offline tests make setup painless, though 26 stars signals early maturity. Fork or watch if you need persistent, editable knowledge that grows without forgetting.
