alisorcorp / ask-local
Delegate grunt work from Claude Code to a local LLM via LM Studio. File contents stay on your machine; only the final answer enters your Claude session.
ask-local lets you delegate file reading, directory listing, and pattern-searching tasks from a cloud AI coding session to a local language model running in LM Studio, so routine work doesn't consume cloud tokens.
How It Works
ask-local offloads tedious file-inspection work to a local model so your cloud coding session stays focused, saving context and tokens.
Download the files and run the setup script to install the helper commands where Claude Code can discover them.
Start LM Studio on your machine and load a capable model so the local server is ready to handle requests.
Open a Claude Code session and point it at your project directory.
Issue a slash command, for example asking it to find all TODO comments or list the key parts of a directory, and the local model reads and searches the files without cluttering your main conversation.
File lists, summaries, and search results come back quickly, with notes on what was inspected and how many tokens the local model used.
You can then analyze large projects, hunt bugs, or inventory code at low cost, keeping the cloud model free for higher-level reasoning.
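The delegation step above can be sketched as a call to LM Studio's OpenAI-compatible local endpoint. This is a minimal illustration, not ask-local's actual implementation: the default URL and payload shape follow LM Studio's local server conventions, and the helper names (`build_payload`, `ask_local`) are hypothetical.

```python
import json
import urllib.request

# LM Studio's local server speaks the OpenAI chat-completions protocol.
# The port is LM Studio's default; adjust if you changed it.
LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_payload(question: str, file_text: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat payload. The file contents stay in this
    payload, which only ever travels to localhost."""
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "You inspect files. Reply with only the final answer."},
            {"role": "user",
             "content": f"{question}\n\n---\n{file_text}"},
        ],
        "temperature": 0.1,  # keep answers deterministic-ish for grunt work
    }

def ask_local(question: str, file_text: str) -> str:
    """POST the question plus file contents to the local model and return
    just the answer string, which is all that reaches the cloud session."""
    req = urllib.request.Request(
        LM_STUDIO_URL,
        data=json.dumps(build_payload(question, file_text)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

The key property this sketch illustrates is the privacy boundary: file contents go only to `localhost`, and only the model's answer text is returned to the caller.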