
otadk / nuxt-edge-ai


Nuxt module for local-first AI apps with server-side WASM inference via Transformers.js and ONNX Runtime.

33 stars · 0 forks · 100% credibility
Found Mar 19, 2026 at 33 stars
TypeScript
AI Summary

Nuxt-edge-ai is a module for Nuxt apps that adds AI text generation and chat capabilities using local WebAssembly runtimes or remote services with OpenAI-style interfaces.

How It Works

1
🔍 Discover the AI helper

You find nuxt-edge-ai while looking for a way to add AI chat features to your web app without standing up extra infrastructure.

2
📦 Add it to your app

You install the module and register it in your Nuxt config so your app can start calling AI endpoints right away.

3
Pick your inference mode
🏠
Local inference

Run WASM models on your own server, keeping responses fast and your data private.

☁️
Remote services

Proxy requests to OpenAI-compatible providers when you need larger models.

4
🤖 Wake up the AI

You pull a ready-made model, and the runtime loads it so it's prepared to serve requests.

5
💬 Chat away

From your app, you send chat messages and receive AI-generated replies through the module's API.

AI comes alive

Your web app now carries on intelligent conversations, served from your own Nuxt stack.
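The setup step above can be sketched as a `nuxt.config.ts`. This is a minimal sketch assuming the module registers the standard Nuxt way; no option keys are shown because they aren't documented here.

```typescript
// nuxt.config.ts - minimal setup sketch; 'nuxt-edge-ai' is registered
// the standard Nuxt way via the modules array.
export default defineNuxtConfig({
  modules: ['nuxt-edge-ai'],
})
```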

AI-Generated Review

What is nuxt-edge-ai?

nuxt-edge-ai is a TypeScript Nuxt module that brings local-first AI inference to your apps, running server-side WASM models via Transformers.js and ONNX Runtime without Ollama, Rust, or native dependencies. Drop it into nuxt.config.ts alongside modules like nuxt-auth-utils or @nuxt/content, and you get Nitro API routes such as /api/edge-ai/chat/completions for OpenAI-style calls, plus a useEdgeAI() composable for client-side access. It handles model pulls, health checks, and remote OpenAI fallbacks, keeping your Nuxt apps AI-ready out of the box.
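A minimal client-side sketch of calling the OpenAI-style route mentioned above: the route path comes from the review, while the request/response shape is assumed to mirror OpenAI's chat completions API, and `chatOnce` is a hypothetical helper, not part of the module.

```typescript
// Hypothetical client sketch: assumes the module's Nitro route accepts an
// OpenAI-style chat completions body. Not verified against the real API.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant'
  content: string
}

// Build an OpenAI-style request body (the default model name is illustrative).
function buildChatRequest(messages: ChatMessage[], model = 'distilgpt2') {
  return { model, messages }
}

// chatOnce is a hypothetical helper, not part of nuxt-edge-ai itself.
async function chatOnce(messages: ChatMessage[]): Promise<string> {
  const res = await fetch('/api/edge-ai/chat/completions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildChatRequest(messages)),
  })
  const data = await res.json()
  // OpenAI-style responses nest generated text under choices[0].message.content
  return data.choices[0].message.content
}
```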

Why is it gaining traction?

It stands out in the Nuxt module ecosystem (think @nuxtjs/i18n or @nuxt/ui) by bundling a full WASM runtime for text generation on Node/Nitro servers, with no extra installs needed. Developers like the mock mode for CI testing in GitHub Actions, presets like distilgpt2 for quick starts, and seamless remote proxying to OpenAI or OpenRouter. Low overhead means it slots into your modules list without bloating deploys, unlike heavier alternatives that require Python backends.
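The preset, mock mode, and remote proxying described above might be wired up like this. Every option key under `edgeAI` here is hypothetical, named only for illustration; check the module's README for the real configuration surface.

```typescript
// nuxt.config.ts - illustrative only: 'preset', 'mock', and 'remote' echo
// features named in the review, but these exact keys are assumptions.
export default defineNuxtConfig({
  modules: ['nuxt-edge-ai'],
  edgeAI: {
    preset: 'distilgpt2',                 // bundled starter preset
    mock: process.env.CI === 'true',      // mock mode for CI test runs
    remote: {
      provider: 'openai',                 // fall back to a remote OpenAI-style API
      apiKey: process.env.OPENAI_API_KEY, // never hardcode secrets
    },
  },
})
```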

Who should use this?

Nuxt module authors prototyping AI features in fullstack apps, like chat interfaces or content generators paired with @nuxt/content. Frontend teams building local-first experiences while avoiding edge-runtime limits on Vercel or Cloudflare. Check the repo's examples or playground before integrating into production alongside modules like @nuxtjs/tailwindcss or on GitHub Pages sites.

Verdict

Early days with 33 stars and a 1.0 credibility score: docs are solid and there's a playground, but expect changes via GitHub issues as local inference matures. Grab it for experiments if you're in the Nuxt ecosystem; skip it for mission-critical apps until streaming and more models land.


