yliust / Tactile

Tactile: an accessibility-first operating layer for agents.

62 stars -- 100% credibility -- Found May 12, 2026
AI Analysis
Python
AI Summary

Tactile provides AI agents with tools to interact with desktop apps using operating system accessibility features, OCR as fallback, and verification steps for reliable control.

How It Works

1. 📰 Discover Tactile -- You hear about Tactile, a smart way for AI helpers to control apps by feeling their buttons and menus instead of guessing.

2. 💬 Tell your AI to use it -- You simply ask your AI chat to set up the Tactile skill from its shared guide.

3. 🔗 Link your AI helper -- Your AI connects to the thinking service, ready to sense app structures clearly.

4. 📱 Choose an app and task -- You pick a familiar app, like a messenger or video editor, and describe what to do.

5. 👆 AI touches the app -- Your AI explores the app's layout, taps buttons, types text, and moves smoothly without fumbling.

6. Task finishes -- Everything works as expected, with your AI confirming each step succeeded.

🎉 Apps feel alive -- Now your AI handles real app tasks reliably, making computer use feel magical and accessible.


AI-Generated Review

What is Tactile?

Tactile is an accessibility-first operating layer for AI agents, letting them interact with macOS and Windows apps by querying semantic UI elements (roles, names, hierarchy, states) before resorting to pixel-guessing from screenshots. Written in Python with native binaries, it installs as a Codex skill and offers CLI tools to list apps, open them by name or bundle ID, traverse accessibility trees, OCR screen regions, and send inputs such as clicks or keypresses. The name fits: tactile means "touch," and agents feel app structure directly via OS accessibility APIs.
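The semantic tree described above can be pictured as plain data. A minimal sketch for illustration only; `UIElement` and `find` are hypothetical names, not Tactile's actual API:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class UIElement:
    """Hypothetical record of what an accessibility query returns."""
    role: str                         # e.g. "button", "textfield", "menu"
    name: str                         # accessible label exposed by the OS
    enabled: bool = True
    children: list = field(default_factory=list)

def find(root: UIElement, role: str, name: str) -> Optional[UIElement]:
    """Depth-first search of the accessibility tree for a matching element."""
    if root.role == role and root.name == name:
        return root
    for child in root.children:
        hit = find(child, role, name)
        if hit is not None:
            return hit
    return None

# Toy tree: a window whose toolbar holds a Send button.
window = UIElement("window", "Messages", children=[
    UIElement("toolbar", "Compose", children=[UIElement("button", "Send")]),
])
send = find(window, "button", "Send")
```

An agent that resolves elements by role and name like this can act on identity rather than pixel coordinates, which is what makes the approach robust to layout changes.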

Why is it gaining traction?

It flips the script on brittle vision-only agents: start with reliable semantics for fewer retries and tokens, then fall back to OCR-grounded coordinates, with visual guessing last. Demos back it up: agents handle Lark/WeChat flows and CapCut edits cleanly. Tactile also pushes apps toward human-AI shared paths, surfacing accessibility gaps developers can fix.
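The semantics-then-OCR-then-vision ordering is just a fallback ladder. A sketch of the idea with stand-in lookup functions, not Tactile's real interface:

```python
def locate(target, semantic_lookup, ocr_lookup, vision_lookup):
    """Return (strategy, coords) from the cheapest strategy that succeeds.

    Each lookup is a callable returning coordinates or None; the order
    mirrors the ladder: semantics first, OCR second, vision last.
    """
    for strategy, lookup in (("semantic", semantic_lookup),
                             ("ocr", ocr_lookup),
                             ("vision", vision_lookup)):
        coords = lookup(target)
        if coords is not None:
            return strategy, coords
    raise LookupError(f"could not locate {target!r}")

# Stub example: the semantic layer misses, so OCR grounds the click.
strategy, coords = locate(
    "Send",
    semantic_lookup=lambda t: None,      # element not exposed by the app
    ocr_lookup=lambda t: (412, 88),      # text found on screen
    vision_lookup=lambda t: (400, 90),   # never reached here
)
```

Because most targets resolve at the semantic rung, the expensive vision path runs rarely, which is where the retry and token savings come from.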

Who should use this?

Agent builders automating desktop tasks on macOS/Windows, like RPA scripters or OpenAI Codex users prototyping workflows. It also suits QA engineers testing UIs semantically, or researchers working on tactile VLA agents. Skip it if you need cross-OS generality today.

Verdict

Worth a spin for agent work: quick Codex install, clear demos, and verification ladders that boost reliability despite only 62 stars and a 100% credibility score. v0 maturity means spotty app support, so pair it with vision tools for production.
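The "verification ladder" the verdict credits can be read as act-then-check with bounded retries. A hedged sketch; `act_and_verify` is an illustrative name, not a function from the repo:

```python
def act_and_verify(action, verify, retries=3):
    """Perform an action, then confirm the UI state changed; retry if not.

    `action` performs the input (e.g. a click); `verify` inspects the
    resulting state and returns True on success. Raises after `retries`
    failed attempts instead of silently proceeding.
    """
    for attempt in range(1, retries + 1):
        action()
        if verify():
            return attempt            # how many tries it took
    raise RuntimeError("action never verified")

# Stub example: the simulated click only "lands" on the second try.
state = {"clicks": 0}
attempts = act_and_verify(
    action=lambda: state.update(clicks=state["clicks"] + 1),
    verify=lambda: state["clicks"] >= 2,
)
```

Checking state after every step is what turns a best-effort agent into one that can report that each step actually succeeded.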


