jasonkneen

Pucks is a little helper for your cursor

14 stars · 0 forks · 100% credibility
Found Apr 07, 2026 at 14 stars
AI Analysis
Swift
AI Summary

Pucks is a macOS menu bar app that serves as a push-to-talk AI companion, capturing screen visuals alongside voice input to deliver spoken, context-aware responses.

How It Works

Step 1: 🔍 Find Pucks

You discover a friendly menu bar buddy for your Mac that chats with you by voice and sees your screen so it can help.

Step 2: 📱 Set it up on your Mac

Follow a few easy steps to get the app running right in your menu bar, with no Dock clutter.
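The "no Dock clutter" part is typically achieved on macOS by marking the app as an agent. A minimal Info.plist fragment, assuming the standard `LSUIElement` approach (this is the usual pattern, not confirmed from PucksApp's actual project settings):

```xml
<!-- Info.plist fragment: run as a menu-bar-only "agent" app, so no Dock icon appears -->
<key>LSUIElement</key>
<true/>
```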

Step 3: 🔓 Allow basic access

Grant microphone access so it can hear you, and screen recording access so it can see what you're looking at.
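For the microphone, macOS requires a usage-description string in Info.plist before it will show the permission prompt; screen recording access is granted per app under System Settings > Privacy & Security rather than via a plist key. A sketch with illustrative strings (not PucksApp's actual entries):

```xml
<!-- Info.plist fragment: usage descriptions shown in the macOS permission prompts -->
<key>NSMicrophoneUsageDescription</key>
<string>Pucks listens while you hold the push-to-talk key.</string>
<key>NSSpeechRecognitionUsageDescription</key>
<string>Used for the on-device transcription fallback.</string>
```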

Step 4: 🤝 Link smart helpers

Connect the services it uses to listen (speech-to-text), think (Claude), and speak (text-to-speech) so it can chat naturally.
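A hypothetical sketch of how such a setup might be held together in Swift; the struct and field names are illustrative assumptions, not PucksApp's actual code:

```swift
import Foundation

// Hypothetical configuration for the three kinds of service named above:
// transcription (AssemblyAI / OpenAI / Apple Speech), the Claude LLM,
// and ElevenLabs TTS. Field names are illustrative, not the repo's code.
struct ServiceConfig {
    var assemblyAIKey: String?
    var openAIKey: String?
    var anthropicKey: String
    var elevenLabsKey: String

    // The app can only run end-to-end when the "thinking" and "speaking"
    // services are configured; transcription can fall back to Apple Speech.
    var isComplete: Bool {
        !anthropicKey.isEmpty && !elevenLabsKey.isEmpty
    }
}
```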

Step 5: 🗣️ Start talking

Click the menu bar icon, hold your special key combo, speak about your screen, and feel the magic.
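The hold-to-talk lifecycle can be sketched as a small state machine: holding the hotkey starts listening, releasing it sends the audio off for a reply, and the spoken answer plays before returning to idle. This is a hypothetical sketch, not PucksApp's actual implementation:

```swift
import Foundation

// Hypothetical push-to-talk state machine (illustrative, not the repo's code).
enum PuckState: Equatable {
    case idle
    case listening       // hotkey held, mic recording
    case thinking        // transcript sent to the LLM
    case speaking        // TTS reply playing back
}

enum PuckEvent {
    case hotkeyDown, hotkeyUp, replyReady, playbackFinished
}

func transition(_ state: PuckState, _ event: PuckEvent) -> PuckState {
    switch (state, event) {
    case (.idle, .hotkeyDown):           return .listening
    case (.listening, .hotkeyUp):        return .thinking
    case (.thinking, .replyReady):       return .speaking
    case (.speaking, .playbackFinished): return .idle
    default:                             return state  // ignore out-of-order events
    }
}
```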

Step 6: 💬 Get helpful replies

It thinks, moves a blue cursor to point things out, and speaks back warm, useful answers.
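Moving the pointing cursor presumably means animating an overlay from its current position toward a target over several frames. A minimal sketch using plain linear interpolation (the real app may use easing; names here are illustrative):

```swift
import Foundation

// Hypothetical sketch of the "blue cursor" pointing animation:
// step the overlay cursor along a straight line toward the target.
struct Point: Equatable {
    var x: Double
    var y: Double
}

func interpolate(from a: Point, to b: Point, steps: Int) -> [Point] {
    guard steps > 0 else { return [b] }
    return (1...steps).map { i in
        let t = Double(i) / Double(steps)   // progress from 0 (exclusive) to 1
        return Point(x: a.x + (b.x - a.x) * t,
                     y: a.y + (b.y - a.y) * t)
    }
}
```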

Your screen buddy

Now an always-ready voice companion lives in your menu bar, helping you learn, think, and create effortlessly.


AI-Generated Review

What is PucksApp?

PucksApp is a Swift-based macOS menu bar app that acts as a voice-activated AI companion for your cursor—hold a global hotkey to talk, and it grabs a screenshot of your screen (highlighting the cursor position), transcribes your speech, queries Claude AI with visual context, then speaks the response via ElevenLabs TTS. No dock icon, just a subtle tray presence with optional floating mic button and screen overlays for states like listening or navigating. It's a hands-free helper for quick explanations of UI elements, code snippets, or selected text without tabbing out of your editor.
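The flow this paragraph describes (screenshot + speech in, spoken answer out) can be sketched as a small pipeline. The protocols and types here are illustrative assumptions, not the repo's actual structure:

```swift
import Foundation

// Hypothetical pipeline sketch: transcription -> LLM with visual context -> TTS.
protocol Transcriber { func transcribe(_ audio: Data) -> String }
protocol Assistant   { func reply(prompt: String, screenshot: Data) -> String }
protocol Speaker     { func speak(_ text: String) }

struct Pipeline {
    let transcriber: Transcriber
    let assistant: Assistant
    let speaker: Speaker

    // One push-to-talk turn: audio and a screenshot in, spoken reply out.
    func handleTurn(audio: Data, screenshot: Data) -> String {
        let prompt = transcriber.transcribe(audio)
        let answer = assistant.reply(prompt: prompt, screenshot: screenshot)
        speaker.speak(answer)
        return answer
    }
}
```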

Why is it gaining traction?

It stands out by tying voice queries directly to your cursor and screen—say "what's this?" and Claude analyzes exactly what's under your pointer, even animating a blue cursor to point at relevant spots during responses. Global push-to-talk works anywhere, with real-time transcription fallbacks (AssemblyAI, OpenAI, or Apple Speech) and conversation history for context. Developers dig the seamless macOS integration: no app switching, just talk and get spoken feedback while coding.
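The transcription fallback the paragraph mentions (AssemblyAI, OpenAI, or Apple Speech) amounts to trying providers in order and taking the first that succeeds. A hypothetical sketch, with illustrative names rather than the repo's actual code:

```swift
import Foundation

// Hypothetical transcription fallback chain: try each provider in turn.
struct Provider {
    let name: String
    let transcribe: (Data) -> String?   // nil signals failure
}

func transcribeWithFallback(_ audio: Data, providers: [Provider]) -> (String, String)? {
    for p in providers {
        if let text = p.transcribe(audio) {
            return (p.name, text)       // which provider won, and its output
        }
    }
    return nil                          // every provider failed
}
```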

Who should use this?

macOS developers grinding in Xcode, VS Code, or Figma who hate context-switching for AI help—perfect for frontend devs debugging layouts ("explain this button"), backend folks refactoring selected code ("suggest improvements"), or designers querying UI flows hands-free. Ideal if you're on Apple Silicon and need screen-aware assistance without cloud proxies eating your API quota.

Verdict

Worth a spin for macOS power users seeking a little Pucks cursor helper, but with only 14 stars and a 1.0% credibility score, it's early-stage: docs are solid via the README, but expect tweaks for edge cases like multi-monitor setups. Fork and build on it if Swift/macOS overlays excite you; otherwise, monitor for maturity.


