slokam-ai / localgcp

The unified GCP emulator. One binary, 14 services, zero cloud bills.

Found Apr 14, 2026 at 11 stars.
AI Analysis (Go)

AI Summary

localgcp is a single executable that emulates 14 Google Cloud services locally for cost-free development and testing, including Vertex AI integration with local LLMs via Ollama.

How It Works

1. 🔍 Discover localgcp

You hear about a handy tool that lets you run Google Cloud services right on your own computer, saving time and money during development.

2. 📥 Get it set up

Download or install the single program with a simple command, no complicated setup needed.

3. 🚀 Start everything

Run one easy command and watch all the cloud services come alive on your machine instantly.

4. 🔗 Connect your apps

Point your existing programs at your local machine instead of the real cloud, often just by setting a few environment variables.

5. 🧠 Use AI features locally

Chat with AI models running entirely on your computer for quick tests and ideas, with no internet connection or API costs involved.

🎉 Build freely

Now develop, test, and experiment offline with full cloud power at your fingertips, fast and free forever.
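The "environment variables" in step 4 are the standard GCP emulator variables that the official client libraries already honor. A minimal Go sketch of the idea; the host:port values below are placeholders, since the real endpoints come from localgcp's own output:

```go
package main

import (
	"fmt"
	"os"
)

// emulatorEnv returns the standard GCP emulator variables pointed at
// localhost. The port numbers here are placeholders; the actual values
// would come from the output of `localgcp up`.
func emulatorEnv() map[string]string {
	return map[string]string{
		"STORAGE_EMULATOR_HOST":   "localhost:9023",
		"PUBSUB_EMULATOR_HOST":    "localhost:9024",
		"FIRESTORE_EMULATOR_HOST": "localhost:9025",
	}
}

func main() {
	// Official GCP client libraries check these variables at client
	// construction time and redirect traffic to the local endpoint.
	for key, addr := range emulatorEnv() {
		os.Setenv(key, addr)
	}
	fmt.Println("storage ->", os.Getenv("STORAGE_EMULATOR_HOST"))
}
```

Because the redirection happens inside the client libraries, application code that constructs a storage, Pub/Sub, or Firestore client needs no changes at all.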

AI-Generated Review

What is localgcp?

localgcp is a single Go binary that emulates 14 GCP services locally, from GCS and Pub/Sub to Firestore, Cloud Tasks, Vertex AI, KMS, Logging, and Cloud Run, plus orchestrated Spanner, Bigtable, Postgres, Redis, and BigQuery. It eliminates cloud bills for development and testing by letting official GCP SDKs connect via standard emulator env vars like STORAGE_EMULATOR_HOST, with zero code changes in most cases. Run `localgcp up`, eval its env output, and your GCP apps hit localhost instead of the cloud.
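The eval workflow above can be pictured with a small sketch: a parser for shell-style `export KEY=value` lines, which is what eval-ing an env dump amounts to. The exact output format of `localgcp up` is an assumption here:

```go
package main

import (
	"fmt"
	"strings"
)

// parseExports parses shell-style "export KEY=value" lines, the kind of
// output the review describes eval-ing from `localgcp up`. The exact
// format localgcp emits is assumed, not confirmed by this page.
func parseExports(out string) map[string]string {
	env := make(map[string]string)
	for _, line := range strings.Split(out, "\n") {
		line = strings.TrimPrefix(strings.TrimSpace(line), "export ")
		if key, val, ok := strings.Cut(line, "="); ok {
			env[key] = val
		}
	}
	return env
}

func main() {
	sample := "export STORAGE_EMULATOR_HOST=localhost:9023\n" +
		"export PUBSUB_EMULATOR_HOST=localhost:9024"
	for key, addr := range parseExports(sample) {
		fmt.Println(key, "->", addr)
	}
}
```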

Why is it gaining traction?

Unlike scattered standalone emulators, localgcp delivers a unified GCP experience in one binary: no juggling a separate Docker container per service, lazy orchestration for the heavy services, and local Vertex AI inference via Ollama for real LLM prompts without API keys or quotas. Devs love the instant setup (brew install or Docker run), persistent data dirs, and CI-friendly stubs, which slash latency and costs on every debug cycle.
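For the Ollama-backed Vertex AI path, local inference boils down to HTTP calls against Ollama's `/api/generate` endpoint (by default on `localhost:11434`). A hedged sketch of that request payload; the model name is only an example, and how localgcp itself proxies Vertex AI calls to Ollama is not shown on this page:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// generateRequest mirrors the body of Ollama's /api/generate endpoint,
// the local inference backend the summary says localgcp uses.
type generateRequest struct {
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
	Stream bool   `json:"stream"`
}

// newPrompt builds the JSON payload for a single non-streaming
// completion. The model name is whatever you have pulled locally.
func newPrompt(model, prompt string) ([]byte, error) {
	return json.Marshal(generateRequest{Model: model, Prompt: prompt})
}

func main() {
	body, err := newPrompt("llama3", "Summarize GCS object versioning.")
	if err != nil {
		panic(err)
	}
	// POST this body to http://localhost:11434/api/generate while
	// Ollama is running to get a completion with no API key or quota.
	fmt.Println(string(body))
}
```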

Who should use this?

GCP backend engineers iterating on Pub/Sub queues, Firestore queries, or Vertex AI integrations during local dev. QA teams running integration tests without cloud credentials or bills. Startups prototyping GCP unified data platforms, maintenance tools, or security services offline.

Verdict

Grab it for fast GCP local dev: the docs are crisp, the tests solid, the CLI intuitive. But with 11 stars and a 1.0% credibility score, treat it as alpha for production CI. A promising unified emulator if you're tired of cloud tab costs.


