agoda-com/llm-traces

Grafana plugin for visualizing LLM traces in Tempo — inspect prompts, completions, tool calls, token usage, and costs. Supports OpenInference, OTel GenAI, and Vertex AI span conventions.

AI Summary

A Grafana app plugin for browsing, searching, and inspecting traces of large language model interactions stored in Grafana Tempo.

How It Works

1. 📖 Discover the tool

You hear about a handy Grafana add-on that lets you peek inside AI conversations stored in your traces.

2. 💾 Add it to Grafana

Download the prebuilt release package, unpack it into your Grafana plugins folder, then restart Grafana to make it live.
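
Concretely, installing from a release archive usually looks something like this sketch; the download URL, archive name, and plugin ID are assumptions, so check the repo's releases page and its plugin.json for the real values:

```sh
# Unpack the release into Grafana's plugin directory (default path shown).
cd /var/lib/grafana/plugins
curl -fsSL -o llm-traces.zip \
  https://github.com/agoda-com/llm-traces/releases/latest/download/llm-traces.zip  # assumed URL
unzip llm-traces.zip && rm llm-traces.zip

# Release builds are typically unsigned, so allow the plugin's ID (taken
# from its plugin.json) in grafana.ini before restarting:
#   [plugins]
#   allow_loading_unsigned_plugins = <plugin id from plugin.json>
sudo systemctl restart grafana-server
```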

3. 🔗 Pick your traces

Open the new LLM Traces section, select the Tempo datasource where your traces are kept, and you're ready to explore.
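
If you run Grafana from Docker or provision it from files instead of clicking through the UI, the Tempo datasource the plugin reads can be declared in a provisioning file; a minimal sketch, with the URL as a placeholder for your own Tempo query endpoint:

```yaml
# e.g. /etc/grafana/provisioning/datasources/tempo.yaml
apiVersion: 1
datasources:
  - name: Tempo             # the datasource the LLM Traces app will query
    type: tempo
    access: proxy
    url: http://tempo:3200  # placeholder: point at your Tempo instance
```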

4. 🔍 Find AI activity

Type a simple search or use the built-in filters to surface traces containing LLM activity such as prompts, completions, or tool calls.
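
Under the hood, search runs TraceQL against Tempo. Assuming your spans follow one of the supported conventions, queries along these lines surface LLM activity; the first matches OpenInference LLM spans, the second filters OTel GenAI spans by model (the model name is just an example) and latency:

```traceql
{ span.openinference.span.kind = "LLM" }

{ span.gen_ai.request.model = "gpt-4o" && duration > 2s }
```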

5. 📊 Dive into a trace

Click any trace to unfold its timeline and see every step your AI took from start to finish.

6. 💬 Read the conversation

Examine messages between user and AI, tool uses, token counts, and even cost estimates, all neatly displayed.
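
Cost estimates like these are typically derived from token-usage attributes recorded on each span. A minimal sketch of the arithmetic in TypeScript, assuming OTel GenAI attribute names and a made-up price table (not the plugin's actual rates):

```ts
// Rough sketch of per-span cost estimation from OTel GenAI token attributes.
// The price table is illustrative, not the plugin's actual rate data.
const PRICE_PER_1M_TOKENS: Record<string, { input: number; output: number }> = {
  "gpt-4o": { input: 2.5, output: 10 }, // example USD rates per million tokens
};

interface GenAiUsageAttrs {
  "gen_ai.request.model": string;
  "gen_ai.usage.input_tokens": number;
  "gen_ai.usage.output_tokens": number;
}

function estimateCostUsd(attrs: GenAiUsageAttrs): number | undefined {
  const price = PRICE_PER_1M_TOKENS[attrs["gen_ai.request.model"]];
  if (!price) return undefined; // unknown model: skip the estimate
  const inputCost = attrs["gen_ai.usage.input_tokens"] * price.input;
  const outputCost = attrs["gen_ai.usage.output_tokens"] * price.output;
  return (inputCost + outputCost) / 1_000_000;
}

// 1,200 prompt tokens + 300 completion tokens on gpt-4o -> $0.006
console.log(
  estimateCostUsd({
    "gen_ai.request.model": "gpt-4o",
    "gen_ai.usage.input_tokens": 1200,
    "gen_ai.usage.output_tokens": 300,
  }),
);
```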

🎉 Unlock AI insights

Now you can clearly see what your AI did, why it took time, and how much it cost, making debugging and optimization a breeze.

AI-Generated Review

What is llm-traces?

llm-traces is a TypeScript Grafana app plugin for visualizing LLM traces stored in Tempo, letting you browse and search via TraceQL while inspecting prompts, completions, tool calls, token usage, and costs. It handles OpenInference, OTel GenAI, and Vertex AI conventions natively, rendering messages in Markdown, showing span hierarchies on timelines, and offering resizable detail panels. Drop it into your Grafana setup to debug LLM agent traces without leaving your dashboard.

Why is it gaining traction?

It plugs straight into existing Tempo datasources, with no custom datasource setup or bespoke plugin development needed, and auto-detects span conventions for instant value. Standout perks include per-span cost estimates, tool-call JSON views, and TraceQL-powered filtering, all in a responsive UI that beats Tempo's raw trace view. Developers can grab a prebuilt package from the GitHub releases page for a quick install.
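
Convention auto-detection generally comes down to checking which marker attributes a span carries. A minimal sketch of the idea in TypeScript, assuming openinference.span.kind and the gen_ai.* namespace as markers (not the plugin's actual code):

```ts
// Minimal sketch of span-convention detection; not the plugin's actual code.
type Convention = "openinference" | "otel-genai" | "vertex-ai" | "unknown";

function detectConvention(attrs: Record<string, unknown>): Convention {
  // OpenInference marks LLM-related spans with openinference.span.kind.
  if ("openinference.span.kind" in attrs) return "openinference";
  // The OTel GenAI semantic conventions put attributes under gen_ai.*.
  if (Object.keys(attrs).some((k) => k.startsWith("gen_ai."))) return "otel-genai";
  // The marker for Vertex AI spans is an assumption; the real check may differ.
  if (Object.keys(attrs).some((k) => k.startsWith("vertex"))) return "vertex-ai";
  return "unknown";
}
```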

Who should use this?

Observability engineers on AI teams instrumenting with Phoenix, Traceloop, or OpenTelemetry who need to monitor LLM agent traces in Grafana. Backend devs debugging production LLM chains in Grafana Tempo. Suited for anyone chasing token costs and tool failures without contorting general-purpose plugins like Infinity into an LLM viewer.

Verdict

Solid pick for Tempo users: download the release package from GitHub, allow the unsigned plugin in your Grafana config, and restart. With 13 stars and a 100% credibility score, it's immature but feature-complete, with strong docs, tests, and Docker provisioning; track the GitHub issues for community momentum.
