Altor-lab / altor-vec
Client-side vector search powered by HNSW. 54KB gzipped WASM. Sub-millisecond latency.

22 stars · 2 forks · 100% credibility
Found Mar 11, 2026 at 16 stars.
AI Analysis
Rust
AI Summary

altor-vec provides a compact, client-side engine for fast vector similarity search in web browsers, enabling sub-millisecond semantic search without servers or API costs.
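Conceptually, the engine maps each item to an embedding vector and returns the stored vectors closest to a query vector. A minimal TypeScript sketch of that idea, using a brute-force cosine scan in place of altor-vec's HNSW index (the types and function names here are illustrative, not the library's actual API):

```typescript
// Brute-force cosine-similarity search over in-memory vectors.
// altor-vec replaces this linear scan with an HNSW graph to reach
// sub-millisecond lookups; this sketch only shows the core idea.

type Doc = { id: string; vector: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function topK(docs: Doc[], query: number[], k: number): Doc[] {
  return [...docs]
    .sort((x, y) => cosine(y.vector, query) - cosine(x.vector, query))
    .slice(0, k);
}

// Toy 3-dimensional "embeddings"; real ones come from a model
// such as all-MiniLM-L6-v2 (384 dimensions).
const docs: Doc[] = [
  { id: "intro-to-wasm", vector: [0.9, 0.1, 0.0] },
  { id: "rust-basics",   vector: [0.1, 0.9, 0.1] },
  { id: "hnsw-paper",    vector: [0.0, 0.2, 0.9] },
];

console.log(topK(docs, [1, 0, 0], 1)[0].id); // → "intro-to-wasm"
```

The linear scan is O(n) per query; HNSW brings that close to logarithmic, which is what makes sub-millisecond search over tens of thousands of vectors feasible in a browser tab.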

How It Works

1
🕵️ Discover fast browser search

You hear about a tiny tool that lets websites search through content super quickly right in the user's browser, without any servers or extra costs.

2
📦 Add it to your site

You add this search helper to your website project with a single install step.

3
📝 Prepare your content

You turn your list of items, like articles or products, into a special searchable collection using a quick preparation tool.

4
Launch your search engine

You load the prepared collection into the tool, and now your site has its own powerful search ready to go.

5
🔍 Users start searching

Visitors type natural questions on your site, and the tool finds the best matches in a blink.

6
🚀 Lightning-fast results

Everyone enjoys instant, private searches that feel magical, with no waiting or outside help needed.
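The steps above can be sketched end to end. Everything here is an assumption for illustration: the buildIndex/loadIndex/search names and the JSON index format are invented, and a linear scan stands in for altor-vec's prebuilt binary HNSW index:

```typescript
// End-to-end sketch: prepare a collection (step 3), load it (step 4),
// and answer a query (step 5). Hypothetical API, not altor-vec's.

type Item = { id: string; vector: number[] };

// Step 3: turn your content into a searchable collection.
function buildIndex(items: Item[]): string {
  return JSON.stringify(items);
}

// Step 4: load the prepared collection at runtime.
function loadIndex(serialized: string): Item[] {
  return JSON.parse(serialized) as Item[];
}

// Step 5: find the best match for a query embedding
// (dot-product scan here; the real engine walks an HNSW graph).
function search(index: Item[], query: number[]): Item {
  let best = index[0], bestScore = -Infinity;
  for (const item of index) {
    let dot = 0;
    for (let i = 0; i < query.length; i++) dot += item.vector[i] * query[i];
    if (dot > bestScore) { bestScore = dot; best = item; }
  }
  return best;
}

const serialized = buildIndex([
  { id: "shoes", vector: [0.8, 0.2] },
  { id: "hats",  vector: [0.1, 0.9] },
]);
const index = loadIndex(serialized);
console.log(search(index, [0.7, 0.3]).id); // → "shoes"
```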


Star Growth

This repo grew from 16 stars at discovery to 22 stars.
AI-Generated Review

What is altor-vec?

altor-vec delivers client-side vector search powered by HNSW, compiled from Rust to a 54KB gzipped WASM module that runs in browsers. Developers load a pre-built index via npm, embed queries with models like all-MiniLM-L6-v2, and get sub-millisecond results without servers or API keys. It solves the problem of expensive, privacy-leaky cloud search by keeping everything local—your users' data stays in-browser.
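HNSW's core trick is greedy traversal of a proximity graph instead of scanning every vector. A toy, single-layer version in TypeScript (real HNSW adds a hierarchy of layers and an ef search beam; this sketch only conveys the traversal idea and is not altor-vec's implementation):

```typescript
// Greedy nearest-neighbour walk on a proximity graph: start at an
// entry node and hop to whichever neighbour is closer to the query,
// stopping at a local minimum. This is the heart of HNSW search.

type GraphNode = { id: number; vector: number[]; neighbors: number[] };

function dist(a: number[], b: number[]): number {
  let s = 0;
  for (let i = 0; i < a.length; i++) s += (a[i] - b[i]) ** 2;
  return s; // squared Euclidean distance
}

function greedySearch(graph: GraphNode[], entry: number, query: number[]): number {
  let current = entry;
  while (true) {
    let best = current;
    let bestDist = dist(graph[current].vector, query);
    for (const n of graph[current].neighbors) {
      const d = dist(graph[n].vector, query);
      if (d < bestDist) { bestDist = d; best = n; }
    }
    if (best === current) return current; // local minimum reached
    current = best;
  }
}

// Four 1-D points chained 0-1-2-3.
const graph: GraphNode[] = [
  { id: 0, vector: [0], neighbors: [1] },
  { id: 1, vector: [1], neighbors: [0, 2] },
  { id: 2, vector: [2], neighbors: [1, 3] },
  { id: 3, vector: [3], neighbors: [2] },
];

console.log(greedySearch(graph, 0, [2.9])); // → 3 (walks 0 → 1 → 2 → 3)
```

Because each hop discards most of the graph, search cost grows roughly logarithmically with collection size, which is why HNSW keeps recall high without paying brute-force latency.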

Why is it gaining traction?

It crushes alternatives like Algolia on cost ($0 per query) and speed (0.6ms p95 latency on 10K vectors), while matching Voy/Orama's client-side focus but with superior HNSW recall over k-d trees or brute force. The tiny footprint and Web Worker integration keep UIs responsive, even on mobile, making it a no-brainer for semantic search without backend hassle. Rust's optimizations ensure reliable performance across Chrome and Node.js.

Who should use this?

Frontend devs building SPAs with local recommendation engines, like e-commerce product search or content feeds using client-side embeddings; PWAs needing offline vector-DB capabilities; or indie hackers prototyping semantic-search apps without infra costs. Avoid it if you need millions of vectors; it's best at the 10K-100K scale.

Verdict

Grab it for proofs-of-concept or small-scale client-side vector search; the npm API is dead simple and the docs are solid. With 22 stars and a 100% credibility score, it's still early-stage, so test thoroughly, but the benchmarks and MIT license make it worth watching as a Rust-WASM vector search gem.


