bhope / hedge

Adaptive hedged requests for Go. Cut your p99 latency with zero configuration. Based on Google's "The Tail at Scale" paper.

29 stars · 0 forks · 100% credibility
Found Mar 27, 2026 at 29 stars
AI Analysis (Go)

AI Summary

Hedge is a library that speeds up applications by automatically sending backup requests to slow services it calls, learning patterns in real-time to reduce delays.

How It Works

1. 🔍 Discover Hedge

You're building an app that talks to many services, but some slow ones make everything drag—then you find Hedge, a smart helper to speed things up.

2. 🔌 Plug It In

Simply wrap your app's connections with Hedge in a couple of lines, no complicated setup needed.

3. 🧠 It Learns Automatically

Hedge quietly watches how long each service usually takes, figuring out normal speeds on its own without you lifting a finger.

4. ⏱️ Spot a Slow Response

When one service starts lagging beyond its usual time, Hedge notices right away.

5. 🚀 Sends a Smart Backup

It fires off a quick backup call to the same service, grabbing the faster reply and canceling the slow one.

6. 📊 Balances and Tracks

Hedge keeps a gentle limit on backups to avoid overload and shares simple stats on how often it helps.
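That "gentle limit" can be modeled as a token bucket: each completed primary request earns a fraction of a token, each hedge spends a whole one, so hedges stay a bounded fraction of traffic even when a backend melts down. A simplified sketch (the `budget` type and its rates are assumptions for illustration, not the library's internals):

```go
package main

import (
	"fmt"
	"sync"
)

// budget is a token bucket capping how many hedges may be issued:
// completed primaries earn fractional tokens, each hedge spends one.
type budget struct {
	mu     sync.Mutex
	tokens float64
	max    float64 // bucket capacity
	earn   float64 // tokens per primary (0.1 = hedge at most ~10% of traffic)
}

// onPrimary credits the bucket when a normal request completes.
func (b *budget) onPrimary() {
	b.mu.Lock()
	defer b.mu.Unlock()
	b.tokens += b.earn
	if b.tokens > b.max {
		b.tokens = b.max
	}
}

// tryHedge spends a token if one is available; a false return means
// the hedge is skipped to protect the backend from extra load.
func (b *budget) tryHedge() bool {
	b.mu.Lock()
	defer b.mu.Unlock()
	if b.tokens < 1 {
		return false
	}
	b.tokens--
	return true
}

func main() {
	b := &budget{max: 5, earn: 0.1}
	for i := 0; i < 20; i++ {
		b.onPrimary() // 20 primaries earn 2 tokens
	}
	allowed := 0
	for i := 0; i < 5; i++ {
		if b.tryHedge() {
			allowed++
		}
	}
	fmt.Println(allowed) // only 2 of 5 hedge attempts are permitted
}
```

During an outage every request looks slow, so an unbounded hedger would double the load at the worst possible moment; the bucket drains quickly in that case and hedging backs off on its own.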

7. 🎉 Enjoy a Faster App

Your app now responds quickly and reliably, even when some services misbehave, with everything tuned automatically and no effort on your part.

AI-Generated Review

What is hedge?

Hedge is a Go library for adaptive hedged requests in fan-out architectures, slashing p99 latency by firing backup HTTP or gRPC calls to slow backends. It tracks per-host latency distributions in real time and triggers hedges only when primaries exceed dynamic thresholds like p90, with a token bucket to cap load spikes during outages. Drop it in as a RoundTripper or UnaryClientInterceptor, no config required.

Why is it gaining traction?

Unlike static-threshold hedgers that underperform as loads shift, hedge auto-adapts via streaming sketches, matching hand-tuned p99 gains (17.3ms vs 17.5ms in benchmarks) without stale configs. Full observability via Stats counters (hedge rate, budget exhaustion) and gRPC support make it dead simple for polyglot services. Zero-config setup hooks devs tired of manual tuning.

Who should use this?

Go microservice owners fanning out to 10+ backends, where stragglers compound delays. gRPC teams battling tail latencies in aggregators or API gateways. Backend engineers at scale who want adaptive hedged requests without ops overhead.

Verdict

Grab it for prod pilots if your fan-out woes match: 83% test coverage and solid benchmarks inspire confidence, even if 29 stars signal early maturity. Polishing the docs with real-world examples would boost adoption.
