Arthur-Ficial / apfel

Apple Intelligence from the command line. On-device LLM via FoundationModels framework. No API keys, no cloud, no dependencies.

Found Mar 25, 2026 at 20 stars.
Swift

AI Summary

apfel is a macOS app offering simple ways to chat with Apple's built-in on-device AI model through text prompts, a graphical window, or connections to other tools.

How It Works

1
🔍 Discover apfel

You find a free tool that lets you chat with the smart AI already built into your Mac.

2
📥 Get it on your Mac

Download it and install with a single `make install` on an Apple Silicon Mac running macOS 26 or later.

3
🚀 Pick your favorite way

Choose how to start chatting: quick text, full window, or connect to other apps you use.

4
💬 Ask and get replies

Type a question and the on-device model answers right away. Nothing leaves your Mac, so replies stay fast and private.

5
🔊 Talk with your voice

Speak your thoughts into the mic and hear the answers read back in a natural voice.

6
🔍 See inside the magic

Peek at the exact prompt you sent and the raw response the model produced, so you can understand or tweak its behavior.

Your personal AI buddy

Enjoy free, on-Mac smarts for questions, ideas, or fun anytime without accounts or costs.

AI-Generated Review

What is apfel?

Apfel is a Swift tool that brings Apple Intelligence to your macOS command line via the FoundationModels framework, running fully on-device with no API keys, cloud, or dependencies. Fire up a Unix-style CLI for quick prompts like `apfel "Summarize this log"`, spin up a local OpenAI-compatible server at `http://127.0.0.1:11434/v1/chat/completions`, or launch a native GUI debugger to inspect raw prompts, SSE streams, and refusals. On Apple Silicon Macs running macOS 26+, it turns the built-in LLM into a developer harness with none of the usual setup hassle.
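To see what "OpenAI-compatible" buys you in practice, here is a minimal stdlib-only Python sketch of a client for the local endpoint quoted above. The endpoint URL comes from the review; the payload shape and the `apple-on-device` model id are assumptions based on the standard chat-completions format, so check the server's `GET /v1/models` listing for the real identifier.

```python
import json
import urllib.request

# Local apfel server endpoint quoted in the review; no API key is
# needed because everything runs on-device.
APFEL_URL = "http://127.0.0.1:11434/v1/chat/completions"


def build_chat_request(prompt: str, stream: bool = False) -> dict:
    """Build an OpenAI-style chat-completions payload for the local server.

    The model name below is a placeholder -- query /v1/models on the
    running server for the actual identifier."""
    return {
        "model": "apple-on-device",  # placeholder model id
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }


def ask(prompt: str) -> str:
    """Send a single non-streaming prompt and return the reply text."""
    req = urllib.request.Request(
        APFEL_URL,
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Example (requires the apfel server to be running):
#   print(ask("Summarize this log: disk almost full, 3 failed logins"))
```

Because the server speaks the same wire format as hosted chat APIs, existing OpenAI client libraries should also work by pointing their base URL at `127.0.0.1:11434/v1`.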

Why is it gaining traction?

Unlike cloud LLMs, apfel drops in as a local OpenAI-compatible server: point your Python scripts or curl at it for streaming chat completions, a models list, and logs/stats endpoints. The GUI shines for transparency: dissect request JSON, copy curl repros, test voice I/O, and run self-discussions comparing prompts, all while pruning safety refusals from history. With 20 stars, it is hooking developers tired of opaque APIs, especially those who iterate heavily on prompts.
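The streaming endpoint mentioned above delivers tokens as server-sent events, which is also what the GUI debugger lets you inspect. As a sketch of what such a stream looks like on the wire, here is a tiny parser; the event shape (`choices[0].delta.content`, `[DONE]` sentinel) follows the standard chat-completions streaming format rather than anything apfel-specific, and the captured lines are invented sample data.

```python
import json


def sse_chunks(lines):
    """Yield content deltas from OpenAI-style SSE lines.

    Each event line looks like 'data: {...json...}'; the stream ends
    with the sentinel 'data: [DONE]'."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alives and comments
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break  # end-of-stream sentinel
        event = json.loads(data)
        delta = event["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]


# Example: reassemble a reply from a captured (made-up) stream.
captured = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hallo"}}]}',
    'data: {"choices": [{"delta": {"content": ", Welt!"}}]}',
    "data: [DONE]",
]
reply = "".join(sse_chunks(captured))  # -> "Hallo, Welt!"
```

The debugger's "copy curl repro" feature pairs naturally with this: replay the request with `curl -N`, capture the raw `data:` lines, and feed them through a parser like the one above.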

Who should use this?

macOS shell scripters piping text into on-device summaries, Swift devs prototyping Apple Intelligence features without cloud latency, and prompt engineers debugging safety refusals via live logs. It is also well suited to embedding in local tools, or to comparing prompts head-to-head in self-discussion mode.

Verdict

Grab it if you're on macOS 26+ Apple Silicon: install via `make install`, and it fills Apple's CLI/server gap neatly with solid docs. At 20 stars it's early but stable for niche use; watch for broader model support.
