KarryViber

Orb Eye — Accessibility Service that gives AI five senses on Android. HTTP API for UI tree, notifications, tap, swipe, setText and more.

Java · 55 stars · 9 forks · 100% credibility
Found Feb 11, 2026 at 15 stars.

AI Analysis
AI Summary

Orb Eye is an Android app that equips on-device AI agents with the ability to observe screen contents, monitor notifications, and perform user interactions like tapping, swiping, and typing.

How It Works

1. 📱 Discover Orb Eye

Orb Eye is a tool that lets an AI on your Android phone see the screen, hear alerts, and interact by touching and typing.

2. 📥 Get the App

Download the ready-to-use APK directly from the project's page.

3. 🔧 Put It on Your Phone

Install the APK on your Android phone just like any other app (sideloading may require allowing installs from unknown sources).

4. 👁️ Turn On Super Vision

In your phone's accessibility settings, enable Orb Eye's accessibility service, granting it permission to observe and interact with the screen.

5. 🤖 Link Your AI

Your on-device AI agent connects to Orb Eye's local HTTP API, and now it can read what's on screen and respond.

6. AI Comes Alive

Your AI gains full senses: it reads text, catches notifications, taps buttons, swipes, and types just like you do.

🎉 Full Control Unlocked

With the service running, your phone's AI can navigate apps, handle alerts, and act on its own based on what it observes.
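Linking an AI (step 5) amounts to talking plain HTTP to Orb Eye's local API on localhost:7333. A minimal connectivity check in Java might look like the sketch below; the root path used here is just a guess at a cheap request (any endpoint documented in the repo's README would do):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Minimal connectivity check against Orb Eye's local HTTP API on
// localhost:7333. The root path "/" is an assumption for illustration.
public class OrbEyePing {
    static final String BASE = "http://127.0.0.1:7333";

    // Build the full URI for a given API path.
    public static URI endpoint(String path) {
        return URI.create(BASE + path);
    }

    public static void main(String[] args) throws Exception {
        HttpClient http = HttpClient.newHttpClient();
        HttpRequest req = HttpRequest.newBuilder(endpoint("/")).GET().build();
        HttpResponse<String> resp =
                http.send(req, HttpResponse.BodyHandlers.ofString());
        System.out.println("Orb Eye responded with HTTP " + resp.statusCode());
    }
}
```

Any HTTP-capable agent runtime can do the equivalent; there is no SDK to install.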

AI-Generated Review

What is orb-eye?

Orb Eye is a Java-based Android Accessibility Service that lets on-device AI agents "see" the UI tree, "hear" notifications, "touch" with taps, swipes, and long-presses, "speak" by injecting text (CJK supported), and "wait" for UI changes, all through a local HTTP API on localhost:7333. It fills the gap for AI agents running directly on phones that need full sensory access without rooting or heavy automation frameworks. Users get simple curl-friendly endpoints for screen dumps, clicks by text or bounds, notification buffers, and app info.
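A sketch of what driving those capabilities could look like from a Java agent. Only the capability names appear above, so the paths (`/tree`, `/tap`) and the JSON body shape are assumptions for illustration; consult the repo's README for the real routes.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical Orb Eye client. Endpoint paths and payload shapes are
// guesses based on the capabilities described (UI dump, tap by bounds).
public class OrbEyeClient {
    private final String base;
    private final HttpClient http = HttpClient.newHttpClient();

    public OrbEyeClient(String host, int port) {
        this.base = "http://" + host + ":" + port;
    }

    // URI for an assumed GET /tree endpoint (screen/UI dump).
    public URI treeUri() {
        return URI.create(base + "/tree");
    }

    // JSON body for an assumed tap-by-coordinates call.
    public String tapPayload(int x, int y) {
        return String.format("{\"x\":%d,\"y\":%d}", x, y);
    }

    // Fetch the current UI tree as a string (blocking HTTP call).
    public String fetchTree() throws Exception {
        HttpRequest req = HttpRequest.newBuilder(treeUri()).GET().build();
        return http.send(req, HttpResponse.BodyHandlers.ofString()).body();
    }

    // Send a tap at screen coordinates (x, y); returns the HTTP status.
    public int tap(int x, int y) throws Exception {
        HttpRequest req = HttpRequest.newBuilder(URI.create(base + "/tap"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(tapPayload(x, y)))
                .build();
        return http.send(req, HttpResponse.BodyHandlers.ofString()).statusCode();
    }
}
```

The same pattern (GET for observation, POST with a small JSON body for actions) covers swipes and setText as well, per the description above.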

Why is it gaining traction?

Unlike the computer-vision projects it shares a name with (OpenCV's ORB feature detector, ORB-SLAM), Orb Eye is not a vision algorithm: it delivers a dead-simple HTTP bridge to Android's native accessibility APIs, with event-driven waiting that avoids polling. The hook: the project bills itself as written entirely by an AI on a phone (zero human code), it is lightweight, and it plugs straight into agents via OpenClaw or BotDrop with no setup hell. Developers like the precise gestures and real-time notifications for fluid automation.
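The event-driven waiting mentioned above can be sketched as a long-poll: the client asks the server to hold the request open until the UI changes, instead of hammering the dump endpoint in a loop. The `/wait` path and its `timeoutMs` parameter are assumptions for illustration, not the project's documented API.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

// Sketch of event-driven waiting via long-polling. The /wait endpoint and
// its timeoutMs query parameter are hypothetical.
public class OrbEyeWaiter {
    private final String base;
    private final HttpClient http = HttpClient.newHttpClient();

    public OrbEyeWaiter(String base) { this.base = base; }

    // URI for an assumed long-poll endpoint: the server responds when the
    // UI changes or after timeoutMs, so the client never spins.
    public URI waitUri(long timeoutMs) {
        return URI.create(base + "/wait?timeoutMs=" + timeoutMs);
    }

    // Block until the screen changes (or the server-side timeout fires),
    // then return the fresh UI dump.
    public String awaitChange(long timeoutMs) throws Exception {
        HttpRequest req = HttpRequest.newBuilder(waitUri(timeoutMs))
                .timeout(Duration.ofMillis(timeoutMs + 1000)) // client-side margin
                .GET().build();
        return http.send(req, HttpResponse.BodyHandlers.ofString()).body();
    }
}
```

Compared with polling every few hundred milliseconds, this keeps the agent idle between UI events and reacts as soon as something changes.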

Who should use this?

Android AI builders crafting autonomous agents that navigate apps via HTTP calls; automation scripters testing UIs without ADB hacks; or developers prototyping bots that interact with phone screens. Skip it if you need mature, battle-tested tooling; it's best suited to on-device experiments.

Verdict

Early alpha with 13 stars and a 1.0% credibility score means thin tests and docs, but a solid MIT-licensed API makes it worth a spin for niche Android AI hacks. Test thoroughly; it is not production-ready yet.


