Geek66666 / nim4cc

Public

NIM4CC is an NVIDIA NIM-compatible gateway intended for public use. It converts NIM's chat/completions capability into upper-layer protocols that are easier to integrate, and adds model-catalog caching, call statistics, and a health dashboard.

14
0
100% credibility
Found Apr 19, 2026 at 14 stars
Language: Python

AI Summary

A gateway service that translates NVIDIA NIM model responses into OpenAI- and Anthropic-compatible formats, with built-in model health monitoring and easy deployment.

How It Works

1
📰 Discover the bridge

You hear about a simple tool that lets powerful NVIDIA AI models work with your favorite chat apps that speak the OpenAI or Claude (Anthropic) APIs.

2
☁️ Set up free hosting

You create a free online space and upload the tool, ready to go in minutes without any hassle.

3
🔗 Link your AI account

You add your personal NVIDIA login info once, so the tool can connect securely just for your requests.

4
▶️ Launch the service

You press start, and your AI bridge comes alive on the web, accessible anywhere.

5
📊 Check health dashboard

You visit a friendly page showing which models are running smoothly with success rates over time.

6
💬 Chat in familiar style

You send messages using OpenAI or Anthropic formats, and get smart replies with tool support.

🎉 AI works everywhere

Now your NVIDIA models play nice with all your apps, saving time and unlocking new possibilities.
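
Under the hood, step 6 amounts to posting a standard OpenAI-style chat payload to the gateway. A minimal sketch, assuming the gateway runs at `http://localhost:8000`; the URL, model name, and API key below are illustrative placeholders, not values from nim4cc itself:

```python
import json
import urllib.request

# Placeholder values -- substitute your own deployment URL and key.
GATEWAY_URL = "http://localhost:8000/v1/responses"
NVIDIA_API_KEY = "nvapi-..."  # your personal NVIDIA key, sent per request

payload = {
    "model": "meta/llama-3.1-8b-instruct",  # any model from the cached catalog
    "input": [{"role": "user", "content": "Summarize NIM in one sentence."}],
}

request = urllib.request.Request(
    GATEWAY_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {NVIDIA_API_KEY}",
    },
)
# Uncomment once the gateway is actually running:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response))
```

Because the key travels in the `Authorization` header of each request, the gateway never needs to store it server-side.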

AI-Generated Review

What is nim4cc?

nim4cc is a Python gateway that adapts NVIDIA NIM's chat/completions API into OpenAI-style /v1/responses and Anthropic-style /v1/messages endpoints, making NIM models drop-in compatible with existing clients. It caches the official model catalog, tracks call stats, and serves a public health dashboard at / and /model_list. Users get quick Docker or Hugging Face Spaces deployment, plus support for tool calling, streaming, and conversation continuation via previous_response_id.
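
The catalog caching mentioned above can be pictured as a small time-to-live cache in front of NVIDIA's /v1/models endpoint. A minimal sketch under stated assumptions: the `CatalogCache` class, the `fetch` callable, and the 300-second TTL are illustrative, not nim4cc's actual implementation.

```python
import time

class CatalogCache:
    """Caches a fetched model list for ttl seconds, refetching on expiry."""

    def __init__(self, fetch, ttl=300.0):
        self._fetch = fetch      # callable returning the model list
        self._ttl = ttl
        self._models = None
        self._fetched_at = 0.0

    def models(self):
        now = time.monotonic()
        if self._models is None or now - self._fetched_at > self._ttl:
            self._models = self._fetch()   # e.g. GET /v1/models upstream
            self._fetched_at = now
        return self._models

# Usage with a stubbed fetcher standing in for the NVIDIA API:
calls = []
cache = CatalogCache(lambda: calls.append(1) or ["glm5", "gemma"], ttl=300)
print(cache.models())  # first call hits the stub "API"
print(cache.models())  # second call is served from the cache
print(len(calls))      # 1
```

The same pattern lets the gateway answer client model-list queries without hammering the upstream catalog on every request.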

Why is it gaining traction?

It bridges NIM to popular protocols without custom client code, including Claude Code tool patterns like bash and text_editor where clients handle execution. The real hook: automatic model sync from NVIDIA's /v1/models, per-model success rate matrices over 10-minute buckets, and API keys passed per-request without server storage. Free HF Spaces setup lowers the barrier for testing NIM models in OpenAI/Anthropic pipelines.
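
The per-model success-rate matrix over 10-minute buckets can be sketched as grouping call outcomes by time window. The data shape here (`(model, unix_ts, ok)` tuples) is an illustrative assumption, not nim4cc's internal representation:

```python
from collections import defaultdict

BUCKET_SECONDS = 600  # 10-minute windows, as the dashboard describes

def success_rates(calls):
    """calls: iterable of (model, unix_ts, ok) -> {(model, bucket): rate}."""
    totals = defaultdict(int)
    successes = defaultdict(int)
    for model, ts, ok in calls:
        bucket = int(ts) // BUCKET_SECONDS
        totals[(model, bucket)] += 1
        if ok:
            successes[(model, bucket)] += 1
    return {key: successes[key] / totals[key] for key in totals}

calls = [
    ("glm5", 0, True),
    ("glm5", 30, False),
    ("glm5", 700, True),   # falls in the next 10-minute bucket
]
rates = success_rates(calls)
print(rates)  # {('glm5', 0): 0.5, ('glm5', 1): 1.0}
```

Rendering this mapping as a model-by-bucket grid yields the success-rate matrix the dashboard shows.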

Who should use this?

AI engineers swapping NIM backends into OpenAI-compatible agents or chat apps. Anthropic/Claude Code users experimenting with NVIDIA models via /v1/messages, especially for client-side tools. Ops teams monitoring NIM fleet health across models like glm5 or gemma without building dashboards.

Verdict

Grab it if you need NIM protocol shims and health visuals; the strong README with curl examples eases evaluation. At 14 stars it's still early alpha, so fork and watch for stability before using it in production.
