TheTom / vllm-swift
vLLM Metal plugin powered by mlx-swift: high-performance LLM inference on Apple Silicon
A native backend for serving large language models at high speed on Apple Silicon Macs, compatible with standard AI serving tools via an OpenAI-like interface.
How It Works
vllm-swift serves large language models locally on Apple Silicon, using the GPU through mlx-swift so inference stays fast without taxing the rest of the machine.
Install the package and its dependencies on your Mac with a single command.
Pick a model, and the tool downloads its weights into a local cache directory.
Start the server with one command; it exposes an OpenAI-like HTTP interface.
Point your existing clients and tools at that endpoint and send chat or completion requests.
The result is a fast, private LLM assistant running entirely on your own hardware.
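Since the server speaks an OpenAI-like interface, any OpenAI-style client can talk to it. A minimal sketch in Python of what a chat request body looks like; the base URL, port, and model id here are illustrative assumptions, not values taken from this repo — check the server's startup output for the real ones.

```python
import json

# Hypothetical default endpoint; vllm-swift's actual host/port may differ.
BASE_URL = "http://localhost:8000/v1"

def build_chat_request(model: str, user_message: str) -> dict:
    # JSON body for POST {BASE_URL}/chat/completions, in the
    # standard OpenAI chat-completions shape.
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,
    }

body = build_chat_request("example-model", "Hello from my Mac!")
print(json.dumps(body, indent=2))
```

To actually send the request, POST this body to `{BASE_URL}/chat/completions` with any HTTP client; because the wire format matches OpenAI's, off-the-shelf SDKs work once their base URL is pointed at the local server.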