vakovalskii

Local Topsha 🐧 AI Agent for simple PC tasks - focused on local LLM (GPT-OSS, Qwen, GLM)

133
28
100% credibility
Found Feb 03, 2026 at 42 stars
AI Analysis
Python
AI Summary

LocalTopSH is a self-hosted AI agent framework for deploying privacy-focused Telegram bots backed by local LLMs, with per-user sandboxes and an admin dashboard.

How It Works

1
🔍 Discover LocalTopSH

You hear about a private AI helper that runs fully on your own computer, keeping all chats safe and local.

2
🧠 Prepare your AI brain

Connect a free local AI service like Ollama so your assistant can think and respond smartly.

3
🔑 Add private access

Share your Telegram details securely so the assistant can chat with you there.

4
🚀 Launch with one click

Hit start and watch your personal AI assistant come alive on your own machine.

5
📊 Customize in dashboard

Use the friendly web panel to tweak settings, view logs, and manage users easily.

6
💬 Chat in Telegram

Message your assistant like a friend – it helps with tasks while staying completely private.

Private AI ready!

Your secure, self-hosted helper runs entirely on your own hardware, saving token costs and keeping your data private.
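The launch flow above boils down to a single compose file. The sketch below is illustrative only: the image names, service names, and environment variable names are assumptions, not LocalTopSH's actual configuration; only the admin (3000) and API (4000) ports come from the project description.

```yaml
# Hypothetical docker-compose.yml sketch; adapt to the repo's real file.
services:
  agent:
    image: localtopsh/agent:latest        # assumed image name
    environment:
      - TELEGRAM_BOT_TOKEN=${TELEGRAM_BOT_TOKEN}           # bot token from @BotFather
      - LLM_BASE_URL=http://host.docker.internal:11434/v1  # e.g. a local Ollama server
    ports:
      - "4000:4000"   # API
  admin:
    image: localtopsh/admin:latest        # assumed image name
    ports:
      - "3000:3000"   # admin dashboard
```

With a file like this in place, `docker compose up` starts both services and the dashboard becomes reachable at `localhost:3000`.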

Star Growth

The repo grew from 42 to 133 stars.
AI-Generated Review

What is LocalTopSH?

LocalTopSH is a Python framework for deploying a self-hosted AI agent built around local LLMs such as GPT-OSS, Qwen, and GLM, handling simple PC tasks like shell execution, file operations, and scheduling through a Telegram bot. Users get a full stack (an admin panel at localhost:3000, an API at :4000, and per-user Docker sandboxes) launched with docker compose up, so no data leaves your network. It supports vLLM, Ollama, and llama.cpp backends.
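As a sketch of how an agent like this talks to a local backend: Ollama and vLLM both expose an OpenAI-compatible chat endpoint, so a minimal request can be assembled as below. The base URL, model name, and system prompt are illustrative assumptions, not LocalTopSH's actual code.

```python
import json
from urllib import request

# Assumption: an Ollama server on its default port, OpenAI-compatible API.
LOCAL_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, user_message: str,
                       system_prompt: str = "You are a local PC-task agent.") -> dict:
    """Build the JSON body for an OpenAI-compatible /chat/completions call."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        "stream": False,
    }

payload = build_chat_request("qwen2.5:7b", "Schedule a disk-usage report for 9am.")

# Sending it requires a running local server, so the call is commented out:
# req = request.Request(f"{LOCAL_BASE_URL}/chat/completions",
#                       data=json.dumps(payload).encode(),
#                       headers={"Content-Type": "application/json"})
# print(request.urlopen(req).read().decode())
```

Because the payload shape is standard, swapping vLLM or llama.cpp's server in for Ollama only changes the base URL and model name.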

Why is it gaining traction?

It beats cloud agents on cost (zero token fees, just your own GPU), ships security hardening that blocks 247 dangerous command patterns and guards against prompt injection, and adds MCP support for external tools plus skills for extensibility. Developers like the React admin UI for toggling tools, viewing logs, and managing users, alongside the Telegram integration for everyday tasks. Because it runs entirely on local hardware, it works anywhere, including regions cut off from cloud AI services.
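The "blocked patterns" idea can be sketched as a simple regex filter applied to every shell command before execution. The patterns below are illustrative examples in the spirit of that feature, not LocalTopSH's actual blocklist of 247 patterns.

```python
import re

# Illustrative deny-list; a real agent would ship a much larger set.
BLOCKED_PATTERNS = [
    r"\brm\s+-rf\s+/",        # recursive delete from the filesystem root
    r"\bcurl\b.*\|\s*sh\b",   # piping a remote script straight into a shell
    r"\bmkfs\.",              # reformatting a filesystem
    r"/etc/(passwd|shadow)",  # reading credential files
]

def is_blocked(command: str) -> bool:
    """Return True if the shell command matches any deny-list pattern."""
    return any(re.search(p, command) for p in BLOCKED_PATTERNS)

print(is_blocked("rm -rf /"))       # True
print(is_blocked("ls -la ~/docs"))  # False
```

A filter like this runs inside each per-user sandbox, so even a prompt-injected model cannot get a destructive command past the executor.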

Who should use this?

DevOps teams running air-gapped or compliance-bound environments, enterprises that need on-premise agents for code review without data leaks, and solo developers in restricted regions (such as Russia) who want a focused local LLM agent for daily tasks. It suits any automation where cloud services are a risk or simply unavailable.

Verdict

An early-stage project: the docs are solid and the Docker setup is simple, but low maturity means you should test the sandboxes thoroughly before production use. A strong pick if you need a privacy-first local LLM agent with Telegram access and admin control today.


