yz671 / viewllm

Single-binary HTML report viewer for LLM-generated artifacts

Found May 13, 2026 at 15 stars.
AI Summary

viewllm provides a lightweight, instant web viewer for browsing, previewing, and sharing the HTML reports that AI coding agents generate, served straight from any folder.

How It Works

1
💡 Discover viewllm

You hear about a simple way to view and share the rich HTML reports your AI helper creates, instead of struggling with plain-text previews.

2
📥 Get it ready

You grab the viewer with one quick command or a single download; no complicated setup needed.

3
🚀 Start in your folder

Open a command window in your project folder and launch the viewer—it finds your reports right away.

4
See magic happen

Open your web browser to instantly browse your AI reports with previews, thumbnails, search, and live updates as new ones appear.

5
Choose how to share
🏠 Local sharing

Everyone on your local network opens the same address to see the reports together.

🌍 Public link

Get a temporary web link that works over the internet without any extra hassle.

6
👥 Team stays in sync

Your friends or coworkers open the link, see unread markers, and everyone keeps up without confusion.

🎉 Rich reports shared easily

Now your AI's beautiful charts, diagrams, and analyses are simple to view and collaborate on, like magic.


AI-Generated Review

What is viewllm?

viewllm is a single-binary Go tool that spins up a web viewer for HTML reports and LLM-generated artifacts in any directory. Drop it into a folder of .html files from AI agents like Claude Code or Cursor, run `npx viewllm@latest`, and get a searchable UI with thumbnails, previews, and real-time updates as new files appear. It solves the pain of viewing rich HTML outputs—no more broken VS Code previews over SSH/WSL or raw directory listings.

Why is it gaining traction?

Its 9 MB binary starts in about 100 ms with zero dependencies, leaves no trace behind, and works in headless setups straight from the terminal. Developers are drawn by the instant sharing: local-network links, or public URLs over the internet via `-tunnel` (using Cloudflare), plus per-device themes and unread tracking for teams. Search, a file tree, and mobile support beat clunky alternatives like `python -m http.server`.

Who should use this?

Users of AI coding agents who generate HTML reports for architecture analysis or data visualization. Teams at Anthropic or Cursor shops sharing LLM-generated artifacts without emailing zips. Remote devs on WSL or SSH who need a reliable viewer for interactive charts.

Verdict

Try it via npx for LLM HTML workflows; solid docs and UX punch above its 15 stars and 1.0% credibility score. Still early (no tests visible), so watch for polish before relying on it in production.

