Local AI runtime for training & running small LLMs directly on Apple Neural Engine (ANE). No CoreML. No Metal. Offline, on-device fine-tuning & inference on M-series silicon.
Orion enables running and training small language models locally on Apple Silicon Macs using the device's neural engine for fast, private, offline AI experiences.
How It Works
Orion lets everyday Macs run capable AI chat models, and even fine-tune them, without sending any data online.
A short build process on your Mac compiles Orion and sets up the runtime.
Download a small pretrained language model and prepare it in a format Orion can load.
Type a prompt, and the model generates a response quickly, entirely on your Mac.
Feed it your own text or data to fine-tune the model and improve its responses over time.
Inference and training run on the Apple Neural Engine, so generation is fast and fully private: nothing leaves your device.
The result is a personal, offline AI for chat, writing help, or custom fine-tuning, running smoothly on your Mac.
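The generate-then-fine-tune loop described above can be illustrated with a toy sketch. This is not Orion's implementation (Orion runs real transformer models on the Apple Neural Engine); it is only a conceptual Python illustration of how generating from a local model and then updating it on new data changes its behavior. The `ToyModel` bigram class here is entirely hypothetical.

```python
from collections import defaultdict

# Toy bigram "language model": learns next-word counts from text.
# Purely illustrative -- NOT Orion's actual runtime or training code.
class ToyModel:
    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, text):
        # "Training" here is just accumulating bigram counts.
        words = text.split()
        for a, b in zip(words, words[1:]):
            self.counts[a][b] += 1

    def generate(self, start, length=5):
        # Greedy decoding: repeatedly pick the most frequent continuation.
        out = [start]
        for _ in range(length):
            nxt = self.counts.get(out[-1])
            if not nxt:
                break
            out.append(max(nxt, key=nxt.get))
        return " ".join(out)

model = ToyModel()
model.train("the cat sat on the mat")
print(model.generate("the"))      # behaviour from the base data
model.train("the dog ran fast")   # "fine-tune" on new data...
model.train("the dog ran fast")
print(model.generate("the"))      # ...and the output shifts toward it
```

The same shape applies at full scale: a pretrained model gives the base behavior, and local fine-tuning on your own data shifts its outputs, all without the data leaving the machine.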