fortunto2/openai-oxide

Idiomatic Rust client for OpenAI API — 1:1 parity with openai-python

12 stars · 100% credibility. Found Mar 25, 2026 at 12 stars.
AI Summary

openai-oxide is a high-performance OpenAI API client library for Rust, Node.js, and Python, optimized for agentic workflows with features like structured outputs, low-latency streaming, WebSockets, and hardware-accelerated parsing.

How It Works

1. 💡 Discover speedy AI chats

You hear about openai-oxide, a super-fast helper that makes talking to AI lightning-quick in everyday coding languages.

2. 🗣️ Pick your favorite language

Choose Rust, Node.js, or Python – it fits right into what you already use without hassle.

3. 📦 Grab it with one click

Add this handy tool to your project super simply, like downloading a fun app.

4. 🔗 Connect your AI buddy

Share a private password (your API key) so your new helper can chat with the smart AI service.

5. 💬 Send your first message

Ask a question and watch the clever reply appear blazingly fast – it feels magical!

6. Unlock live answers & helpers

See responses stream in real time or use smart tools for bigger tasks effortlessly.

🚀 Your AI projects zoom!

Build zippy chat apps or smart helpers that wow friends with their speed.
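Step 6's live streaming can be sketched in plain Python with a faked token stream (no real client or network; the function names, tokens, and delays below are all illustrative, not part of openai-oxide). It shows what "responses stream in real time" means in terms of time-to-first-token (TTFT):

```python
import time

def fake_token_stream(first_token_delay=0.05, per_token_delay=0.01,
                      tokens=("Hello", ",", " world", "!")):
    """Simulated streaming reply: yields tokens one at a time,
    the way a streamed chat completion would arrive."""
    time.sleep(first_token_delay)   # network + model latency before the first token
    for tok in tokens:
        yield tok
        time.sleep(per_token_delay)

start = time.monotonic()
first_token_at = None
reply = []
for tok in fake_token_stream():
    if first_token_at is None:
        # time-to-first-token: the number streaming clients optimize for
        first_token_at = time.monotonic() - start
    reply.append(tok)
total = time.monotonic() - start

print("".join(reply))
print(f"TTFT: {first_token_at * 1000:.0f} ms, total: {total * 1000:.0f} ms")
```

The point of streaming is that the user sees output after `first_token_delay`, not after `total` — the reply renders while the rest is still generating.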


AI-Generated Review

What is openai-oxide?

openai-oxide is an idiomatic Rust client for the OpenAI API, delivering 1:1 parity with openai-python across chat completions, responses, embeddings, and 20+ endpoints. It ships as Rust crates, Node.js npm packages, and Python PyPI bindings, plus WASM support for edge runtimes. Users get structured outputs via parse(), zero-copy streaming, and persistent WebSockets for agent loops.
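parse() is only named above, not documented here, so as a rough stdlib-only illustration of the idea behind typed structured outputs (the dataclass, field names, and coercion logic are assumptions for this sketch, not the library's actual behavior):

```python
import json
from dataclasses import dataclass, fields

@dataclass
class WeatherReport:
    city: str
    temp_c: float
    summary: str

def parse_structured(raw_json: str, target):
    """Toy version of a typed structured-output parse: check that the
    model's JSON payload carries every field the dataclass declares,
    coerce each value to the annotated type, and return a typed
    instance instead of a bare dict."""
    data = json.loads(raw_json)
    typed = {}
    for f in fields(target):
        if f.name not in data:
            raise ValueError(f"missing field: {f.name}")
        typed[f.name] = f.type(data[f.name])  # simple coercion, e.g. "3.5" -> 3.5
    return target(**typed)

# A payload like a model's structured-output response might contain:
raw = '{"city": "Oslo", "temp_c": "3.5", "summary": "overcast"}'
report = parse_structured(raw, WeatherReport)
```

A real parse() would presumably do schema generation and validation as well; the benefit shown here is that downstream code works with `report.temp_c: float` rather than digging through untyped JSON.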

Why is it gaining traction?

Benchmarks show it beating async-openai and genai in Rust, and the official SDKs in Python/Node, by 10-37% on TTFT, multi-turn, and hedged-request workloads. Features like early function-call parsing (tool calls land ~400 ms sooner) and a WebSocket mode (37% latency drop) target real agent pain points, while auto-sync with OpenAI's OpenAPI spec ensures day-zero updates. Parameter names match openai-python exactly, easing migration.
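Hedged requests, one of the benchmarked scenarios, can be illustrated with the stdlib: send the same request more than once in parallel and keep whichever reply lands first, which trims tail latency. The "requests" below are simulated sleeps, not real API calls:

```python
import concurrent.futures as cf
import time

def fake_request(latency_s: float) -> str:
    """Stand-in for an API call with a given simulated latency."""
    time.sleep(latency_s)
    return f"reply after {latency_s:.2f}s"

def hedged_request(latencies):
    """Fire duplicate requests concurrently and return the first reply;
    the slower duplicates are abandoned (best-effort cancel)."""
    with cf.ThreadPoolExecutor(max_workers=len(latencies)) as pool:
        futures = [pool.submit(fake_request, l) for l in latencies]
        done, not_done = cf.wait(futures, return_when=cf.FIRST_COMPLETED)
        for f in not_done:
            f.cancel()  # already-running calls simply finish and are ignored
        return next(iter(done)).result()

# One copy hits tail latency (0.30 s); its hedge answers in 0.05 s:
winner = hedged_request([0.30, 0.05])
```

The pattern trades extra request volume for a latency ceiling; a client with first-class hedging support presumably manages the duplication and cancellation for you.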

Who should use this?

Rust backend devs building low-latency AI agents or researchers chaining 50+ tool calls. Node.js API teams handling rapid-fire completions in serverless. Python scripters needing typed structured outputs without extra Pydantic setup, especially in WASM/Cloudflare Workers.

Verdict

Grab it for perf-critical OpenAI work: the benchmarks and docs outshine its maturity (just 12 stars). Early but production-ready with 100% spec coverage; pair it with established idiomatic Rust patterns for robust code.


