lispking / ferris-search

A blazing-fast MCP (Model Context Protocol) server for multi-engine web search, written in Rust.

Found Mar 31, 2026 at 16 stars.
AI Summary

ferris-search is a compact tool that enables AI coding assistants to perform reliable web searches across multiple engines, even in restricted networks.

How It Works

1. 😕 AI search fails: your coding AI buddy can't look up info on the web because of network blocks or spotty connections.

2. 🔍 Discover ferris-search: you find this speedy helper that lets your AI search many websites reliably instead.

3. 🚀 Grab and set it up: download the tiny tool and connect it to your AI coding app with a few simple steps.

4. ⚙️ Choose search options: pick your go-to search sites like Bing or DuckDuckGo, and add content fetchers if needed for special sites.

5. 🔗 Link to your AI: tell your AI app, like Claude or Cursor, to use this new search helper.

6. Test a search: ask your AI a question needing web info, and watch it pull fresh results instantly.

🎉 Supercharged coding: your AI now finds exactly what you need online, making coding faster and smarter every time.
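The "grab and set it up" step above boils down to fetching one static binary. A minimal sketch, assuming a GitHub release asset (the release URL and asset name are illustrative guesses, not taken from the repo's docs):

```shell
# Hypothetical install sketch: the release URL and asset name are assumptions;
# check the repo's README for the real download instructions.
curl -L -o ferris-search \
  https://github.com/lispking/ferris-search/releases/latest/download/ferris-search
chmod +x ferris-search   # single static binary, no Node.js or other runtime needed
```

Nothing beyond the binary itself is required, which is what keeps the footprint around 8MB.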

AI-Generated Review

What is ferris-search?

ferris-search is a blazing-fast Rust MCP (Model Context Protocol) server that routes web searches across 14 engines, including Bing, DuckDuckGo, Baidu, GitHub repo/code search, and Chinese sites (CSDN, Juejin, Zhihu). It solves unreliable built-in search in AI coding tools like Claude Code and Cursor, especially on corporate networks, behind proxies, or in restricted regions, delivering results via a tiny 8MB binary with no runtime dependencies. Users get fan-out searches, content fetchers for articles and READMEs, and proxy support via environment variables.
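The proxy support mentioned above works through environment variables. The variable names below are the conventional ones and an assumption about this tool's exact spelling; the repo's docs have the authoritative list:

```shell
# Conventional proxy env vars (assumed; the repo only says proxies are
# configured via environment variables, so the exact names may differ).
export HTTP_PROXY=http://proxy.corp.example:8080
export HTTPS_PROXY=http://proxy.corp.example:8080
./ferris-search   # outbound engine requests should then go through the proxy
```

This is what makes it usable behind enterprise firewalls or the GFW without any code changes.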

Why is it gaining traction?

It ditches Node.js overhead for Rust's async speed and low memory footprint, making multi-engine queries snappier than alternatives like open-webSearch. Proxy support shines behind the GFW or enterprise firewalls, and engines that need no API key keep it plug-and-play. Devs hook it into Claude via `claude mcp add` for reliable search in seconds.
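Per the `claude mcp add` mention above, hooking it into Claude Code is a one-liner; the server name and install path below are placeholders, not values from the repo:

```shell
# Register the binary as an MCP server for Claude Code.
# "ferris-search" and the path are placeholders; see the repo README for exact args.
claude mcp add ferris-search -- /usr/local/bin/ferris-search
# Then ask Claude a question that needs fresh web results to confirm it works.
```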

Who should use this?

Claude/Cursor users in China or locked-down corporate environments who need Baidu/CSDN results; enterprise teams building internal RAG over Confluence/GitLab; Rust fans wanting a lightweight MCP search proxy for heavily restricted network setups.

Verdict

Grab it if Claude's search flakes: solid docs and a quickstart make setup trivial, though 10 stars and a 1.0% credibility score signal early maturity. Fork it for custom engines; it's a lean base worth watching.


