amitrechavia

A TypeScript implementation of the LangSmith MCP server

17 stars · Found Feb 17, 2026
AI Summary

A TypeScript server implementing the Model Context Protocol to enable AI assistants to query and retrieve data from LangSmith, including conversation histories, prompts, traces, datasets, experiments, and usage metrics.

How It Works

1. 🔍 Discover the helper

You find this tool that lets your AI assistant peek into your LangSmith workspace, full of chat histories, prompts, and experiment results.

2. 📋 Gather your details

Grab your LangSmith API key so you can securely connect everything later.

3. 🔗 Link your account

Share your LangSmith API key so the server can reach your saved conversations and data.

4. 🚀 Start the service

Launch the server with a single npx command to get it up and running on your computer.

5. ⚙️ Hook it to your AI

Tell your AI coding assistant, such as Cursor or Claude, where to find the new server.

6. 🗣️ Chat and fetch

Ask your AI things like "show me the latest chat history" or "list my prompts" and watch it pull the info.

🎉 Everything flows smoothly

Now your AI manages your LangSmith projects for you, saving you time digging through histories, stats, and more.
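Steps 2 through 5 above typically reduce to a single MCP client config entry. A sketch, assuming a Cursor/Claude-style `mcp.json`; the field names follow common MCP client conventions and the package name comes from this page's review, so check the repo's README for the exact values:

```json
{
  "mcpServers": {
    "langsmith": {
      "command": "npx",
      "args": ["-y", "langsmith-mcp-server"],
      "env": {
        "LANGSMITH_API_KEY": "<your-langsmith-api-key>"
      }
    }
  }
}
```

With this in place, the client launches the server on demand and passes the key through the environment, so nothing else needs to be installed first.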


AI-Generated Review

What is langsmith-mcp-server-js?

This TypeScript implementation of the LangSmith MCP server lets AI tools like Cursor or Claude query your LangSmith workspace via the Model Context Protocol. Set LANGSMITH_API_KEY and run npx langsmith-mcp-server to expose tools for fetching conversation threads, prompts, traces, datasets, experiments, and billing usage. It solves the hassle of manual LangSmith SDK calls by giving LLMs direct, paginated access to your observability data.
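A minimal sketch of that quickstart, assuming the package name given above and a placeholder key; consult the repo's README for exact flags:

```shell
# Sketch based on the description above, not verified against the repo.
export LANGSMITH_API_KEY="<your-langsmith-api-key>"  # from your LangSmith settings
npx langsmith-mcp-server  # starts the MCP server for your client to connect to
```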

Why is it gaining traction?

As a full port of the official Python version, it delivers 100% functional parity in TypeScript, with stateless character-based pagination that keeps LLM responses under token limits. The npx quickstart and JSON config for MCP clients make it dead simple to hook into GitHub Copilot or Claude Code: no Docker or builds needed. Developers dig the tools for real workflows, like pulling the latest traces from a project or listing public prompts.
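Stateless character-based pagination can be sketched as follows. This is an illustrative TypeScript snippet, not the repo's actual API: the names `Page` and `paginate` and the default page size are assumptions, but the idea matches the description, where each response carries a slice of the text plus an offset the LLM passes back to fetch the next page, so the server keeps no session state.

```typescript
// Hypothetical sketch of stateless char-based pagination (names are illustrative).
interface Page {
  content: string;      // the slice returned to the LLM in this response
  nextOffset?: number;  // pass back to fetch the next page; absent when done
}

function paginate(full: string, offset = 0, pageSize = 2000): Page {
  // Slice a fixed character window; the caller supplies the offset each time,
  // so the server never has to remember where a client left off.
  const content = full.slice(offset, offset + pageSize);
  const next = offset + pageSize;
  return next < full.length ? { content, nextOffset: next } : { content };
}
```

Because all cursor state lives in the offset the client echoes back, any number of clients can page through the same data concurrently without server-side bookkeeping.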

Who should use this?

LangSmith-heavy teams debugging LLM apps in IDEs like Cursor, where you need on-demand trace history or dataset examples mid-conversation. TS devs building agents that analyze experiments or billing without switching tabs. Perfect for prompt engineers querying templates via natural language.

Verdict

Grab it if you're in the LangSmith ecosystem and want a TypeScript MCP bridge: docs are crisp, tests exist, and it runs out of the box. With 17 stars and a 1.0% credibility score, it's early but faithful; watch for community growth before relying on it in production.
