NoEdgeAI

NoEdgeAI / iphoneclaw

Public

iPhone + AI: open-source Apple Intelligence that lets an agent take over your iPhone

44 stars · 2 forks · 100% credibility
Found Feb 09, 2026 at 26 stars
AI Analysis
Python
AI Summary

An open-source macOS tool that enables AI agents to automate iPhone interactions by controlling the iPhone Mirroring window.

How It Works

1
📱 Discover iPhoneClaw

You hear about a tool that lets an AI agent take control of your iPhone screen right from your Mac using the iPhone Mirroring feature.

2
💻 Set up on your Mac

Download the tool, install it, and grant it permission to capture your screen and control the mouse.

3
Pick your AI helper
🏠
Run AI at home

Set up a screen-smart AI on your Mac so it works offline and stays private.

☁️
Use cloud AI

Sign up for a ready-made AI service online and connect it with an API key.

4
🔍 Test the connection

Snap a picture of your mirrored iPhone screen to make sure the AI can see it clearly.
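A minimal sketch of what this sanity check might look like. The names here (`check_capture`, `fake_capture`) are illustrative stand-ins, not part of iphoneclaw's actual API; the idea is simply to confirm a capture callable returns a usable PNG frame before the AI relies on it.

```python
PNG_MAGIC = b"\x89PNG\r\n\x1a\n"

def check_capture(capture_fn) -> bool:
    """Verify that a screen-capture callable returns a non-empty PNG frame
    before handing frames to the vision model."""
    data = capture_fn()
    if not data:
        raise RuntimeError("Capture returned no data; is iPhone Mirroring open?")
    if not data.startswith(PNG_MAGIC):
        raise RuntimeError("Capture did not return a PNG image.")
    return True

# Stand-in for a real window capture (illustrative only).
def fake_capture() -> bytes:
    return PNG_MAGIC + b"...pixel data..."

print(check_capture(fake_capture))  # → True
```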

5
🎯 Give a simple command

Tell it something easy like 'Open Settings and turn on Wi-Fi' and watch it tap, scroll, and type on its own.

6
⏱️ Watch it learn and speed up

Over time, it remembers common screens and acts faster without asking the AI every step.
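The memoization idea above can be sketched as a simple screen-hash cache: hash the screenshot, and if the same screen was seen before, reuse the stored action instead of calling the vision model again. This is a sketch under assumed names; iphoneclaw's real L0 memoization may differ.

```python
import hashlib

class ActionCache:
    """Remember which action was chosen for a given screen so that
    familiar screens can skip the expensive VLM call."""

    def __init__(self):
        self._cache = {}

    def _key(self, screenshot_bytes: bytes) -> str:
        # Hash the raw screenshot bytes to identify a screen state.
        return hashlib.sha256(screenshot_bytes).hexdigest()

    def lookup(self, screenshot_bytes: bytes):
        return self._cache.get(self._key(screenshot_bytes))

    def store(self, screenshot_bytes: bytes, action: dict) -> None:
        self._cache[self._key(screenshot_bytes)] = action

cache = ActionCache()
screen = b"fake-settings-screen-pixels"

if cache.lookup(screen) is None:
    # First visit: pretend the VLM chose this action, then remember it.
    action = {"type": "tap", "x": 120, "y": 300}
    cache.store(screen, action)

print(cache.lookup(screen))  # → {'type': 'tap', 'x': 120, 'y': 300}
```

A content hash like this only matches pixel-identical screens; a production version would need a fuzzier match (or to re-verify occasionally) to tolerate clocks, badges, and other minor UI changes.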

iPhone tasks done automatically

Now your AI handles everyday iPhone chores effortlessly while you relax.


Star Growth

This repo grew from 26 to 44 stars.
AI-Generated Review

What is iphoneclaw?

iphoneclaw is a Python CLI that lets AI agents take over your iPhone through macOS iPhone Mirroring, an open-source take on Apple Intelligence for iPhone + AI automation. Pair it with vision models like UI-TARS or Qwen VL to capture the mirrored screen, parse instructions, and execute taps, scrolls, drags, or typing via Quartz events. Run tasks like "Open Settings and enable Wi-Fi" with `iphoneclaw run --instruction "..."`, record real user gestures into replayable scripts, or expose a supervisor API for external control.
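The capture → model → action loop described above can be sketched generically. Here `capture_screen`, `ask_model`, and `do_action` are hypothetical stand-ins for iphoneclaw's internals, shown with stubs so the control flow is clear.

```python
def run_instruction(instruction, capture_screen, ask_model, do_action, max_steps=10):
    """Generic perceive-decide-act loop: screenshot the mirrored iPhone,
    ask a vision-language model for the next UI action, execute it, repeat."""
    history = []
    for _ in range(max_steps):
        frame = capture_screen()
        action = ask_model(instruction, frame, history)
        if action["type"] == "done":
            return history
        do_action(action)  # e.g. tap/scroll/type via Quartz events
        history.append(action)
    return history

# Tiny illustration with stubbed components.
steps = iter([{"type": "tap", "x": 50, "y": 200}, {"type": "done"}])
log = run_instruction(
    "Open Settings and enable Wi-Fi",
    capture_screen=lambda: b"frame",
    ask_model=lambda inst, frame, hist: next(steps),
    do_action=lambda a: None,
)
print(log)  # → [{'type': 'tap', 'x': 50, 'y': 200}]
```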

Why is it gaining traction?

It stands out with user recording (`script record-user`) for instant script creation, L0 memoization to skip repeated VLM calls on familiar screens, and one-command replays (`script run`) for deterministic flows like app launches or check-ins. The supervisor API (SSE events, pause/resume/inject) plugs seamlessly into Claude or Codex as a UI worker, while OCR via Apple Vision adds text awareness without extra deps. Python simplicity means quick setup with self-hosted vLLM or cloud APIs.
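Server-Sent Events are just text lines (`event:` and `data:` fields separated by blank lines), so a supervisor client needs only a small parser. This is a generic SSE sketch, not iphoneclaw's actual event schema; the `paused`/`resumed` event names below are made up for illustration.

```python
def parse_sse(text: str):
    """Parse a Server-Sent Events stream into a list of {event, data} dicts."""
    events, current = [], {"event": "message", "data": []}
    for line in text.splitlines():
        if line == "":  # blank line dispatches the accumulated event
            if current["data"]:
                events.append({"event": current["event"],
                               "data": "\n".join(current["data"])})
            current = {"event": "message", "data": []}
        elif line.startswith("event:"):
            current["event"] = line[len("event:"):].strip()
        elif line.startswith("data:"):
            current["data"].append(line[len("data:"):].strip())
    return events

stream = 'event: paused\ndata: {"step": 3}\n\nevent: resumed\ndata: {}\n\n'
for ev in parse_sse(stream):
    print(ev["event"], ev["data"])
```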

Who should use this?

AI agent builders automating iPhone routines like red packet grabs or energy collection in apps. macOS devs testing iOS flows without physical devices. Researchers prototyping Apple Intelligence agents for repetitive tasks on iOS 18+.

Verdict

Try it for iPhone + AI experiments—solid CLI, docs, and model integration make prototyping fast despite 27 stars and a 1.0% credibility score. Still early; add tests and community usage reports for production reliability.

