conspol / acty

Public

Async grouped-job runtime for Python, AI agents, and LLM systems, with retries, fair scheduling, JSONL event logs, and a live terminal TUI.

Found Apr 07, 2026 at 13 stars
AI Analysis
Python
AI Summary

Acty is a Python library for managing groups of related asynchronous tasks, especially for AI agents and language model workflows, with built-in retries, resource fairness, event logging, and a real-time terminal dashboard.

How It Works

1
📖 Discover acty

You hear about acty, a helpful tool that keeps related tasks together for smoother AI workflows.

2
🛠️ Add to your project

You easily add acty to your Python setup so it's ready to use.

3
🧠 Plan your tasks

You outline one main setup task followed by related follow-up jobs that share its context.

4
🚀 Launch your workflow

You send off the group of tasks and watch them start working together automatically.

5
👀 Watch progress live

You open a colorful screen in your terminal to see tasks running, queues, and speeds in real time.

6

🎉 Tasks complete smoothly

Your setup and follow-ups finish reliably with retries and fair sharing, giving clear results and logs.
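The primer-then-followers flow described in the steps above can be sketched in plain asyncio. This is an illustrative pattern only, not acty's actual API; the `primer` and `follower` names are made up for the example:

```python
import asyncio

async def primer():
    # The primer runs first and builds shared context
    # (e.g. a warmed session or cache).
    return {"session": "warmed"}

async def follower(name, context):
    # Followers reuse the context the primer produced.
    await asyncio.sleep(0)  # stand-in for real async work
    return f"{name} used {context['session']}"

async def run_group():
    context = await primer()  # setup task completes first
    # Fan out the follow-up jobs that share its context.
    followers = [follower(f"job-{i}", context) for i in range(3)]
    return await asyncio.gather(*followers)

results = asyncio.run(run_group())
```

The key design point is ordering: followers are only created after the primer's awaited result exists, so none of them rebuild session state on their own.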


AI-Generated Review

What is acty?

Acty runs grouped async jobs in Python for AI agents and LLM systems, starting with a primer job that sets up shared context like sessions or caches, then fans out follower jobs that reuse it. You get built-in retries, fair scheduling via lanes for multi-tenant workloads, JSONL event logs, and a live terminal TUI for monitoring runs. It's perfect for workflows where setup unlocks related tasks, avoiding manual queue stitching in async Python.
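The retries-plus-JSONL-events combination can be sketched in plain Python. This is a minimal stand-in for the idea, not acty's real implementation; the `emit` and `with_retries` helpers are hypothetical:

```python
import asyncio
import io
import json

# In-memory stand-in for an events.jsonl file: one JSON object per line.
log = io.StringIO()

def emit(event, **fields):
    log.write(json.dumps({"event": event, **fields}) + "\n")

async def with_retries(job, attempts=3):
    # Retry the job, recording each attempt and failure as a JSONL event.
    for attempt in range(1, attempts + 1):
        try:
            emit("attempt", n=attempt)
            return await job()
        except Exception as exc:
            emit("error", n=attempt, error=str(exc))
            if attempt == attempts:
                raise

calls = {"n": 0}

async def flaky():
    # Fails once, then succeeds: simulates a transient error.
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient")
    return "ok"

result = asyncio.run(with_retries(flaky))
events = [json.loads(line) for line in log.getvalue().splitlines()]
```

Because each line is a self-contained JSON object, a log like this can be replayed or followed line by line, which is what makes JSONL a good fit for live monitoring.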

Why is it gaining traction?

Unlike flat task queues, acty's primer/follower model explicitly handles common agent shapes—coordinator first, then specialists—preventing bursts from starving new groups and enabling cache warmup for prefix-caching in LLMs. The TUI shines for live debugging of long agent runs, with a replay/follow CLI for JSONL logs, plus LangChain and OpenAI integrations. Devs like the lanes for prioritizing important jobs amid noisy async workloads or agent swarms, without custom hacks.
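The fairness idea behind lanes can be illustrated with a simple round-robin drain. The tenant names and jobs below are invented for the example, and this is a conceptual sketch, not acty's scheduler:

```python
from collections import deque

# Each lane holds one tenant's pending jobs. Tenant-a has a burst of work.
lanes = {
    "tenant-a": deque(["a1", "a2", "a3", "a4"]),
    "tenant-b": deque(["b1"]),
    "tenant-c": deque(["c1", "c2"]),
}

def fair_order(lanes):
    # Take at most one job per lane per round, so a burst in one lane
    # cannot starve the others.
    order = []
    while any(lanes.values()):
        for name, queue in lanes.items():
            if queue:
                order.append(queue.popleft())
    return order

order = fair_order(lanes)
```

Note how tenant-b's single job runs first in the first round rather than waiting behind tenant-a's entire burst, which is the starvation problem fair scheduling avoids.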

Who should use this?

Python devs building multi-agent LLM orchestrators, like release-readiness coordinators fanning out to test/perf/ops specialists, or support playbooks warming shared context for incremental queries. Ideal for async Python workflows, agent prototypes needing fair retries, or anyone tired of ad-hoc futures in AI pipelines—no more rebuilding session state per task.

Verdict

Grab it for agent experiments if you need grouped async with a killer TUI (pip install acty; acty-tui follow events.jsonl). Docs and examples are crisp, and a LangChain add-on is ready. At 13 stars it's alpha-fresh; test lightly until it sees more adoption.


