Mcourtyard

M-Courtyard: Local AI Model Fine-tuning Assistant for Apple Silicon. Zero-code, zero-cloud, privacy-first desktop app powered by Tauri + React + mlx-lm.

52 stars · 100% credibility
Found Feb 17, 2026 at 24 stars
Language: TypeScript

AI Summary

A user-friendly desktop app for Apple Silicon Macs that guides non-technical users through importing documents, generating training data, fine-tuning language models locally with mlx-lm, testing, and exporting to Ollama.

How It Works

1
📱 Download the app

Get the free Mac app that turns your documents into custom AI models, no cloud needed.

2
🚀 Create your first project

Start a new space for your AI training adventure with one click.

3
📄 Add your documents

Drag in text files, PDFs, or Word docs – the app handles everything automatically.

4
🧠 Smartly prepare training data

Watch as the app cleans your files and uses AI to create perfect Q&A pairs or conversations.

5
⚡ Train your personal AI

Pick a starting model, hit train, and see live charts showing your model getting smarter.

6
💬 Chat with your new AI

Test it right in the app – ask questions and see how much better it understands you.

7
🎉 Export and use anywhere

One-click save as a ready-to-run model you can chat with in any AI app.
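Step 4 above presumably emits training data in a format mlx-lm's LoRA trainer can consume. A minimal sketch of writing Q&A pairs as chat-format JSONL (the `data/train.jsonl` layout and `messages` schema follow mlx-lm's conventions; the pairs themselves are invented for illustration — the app generates them from your documents):

```python
import json
from pathlib import Path

# Illustrative Q&A pairs; in M-Courtyard these would be generated
# automatically from your imported documents.
pairs = [
    ("What does M-Courtyard run on?", "Apple Silicon Macs, entirely locally."),
    ("Where is the model trained?", "On-device with mlx-lm; no cloud involved."),
]

data_dir = Path("data")
data_dir.mkdir(exist_ok=True)
with open(data_dir / "train.jsonl", "w") as f:
    for question, answer in pairs:
        # One chat example per line, in the messages format mlx-lm accepts.
        record = {"messages": [
            {"role": "user", "content": question},
            {"role": "assistant", "content": answer},
        ]}
        f.write(json.dumps(record) + "\n")
```

Tools like this typically also write a `valid.jsonl` alongside for evaluation during training.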


Star Growth

From 24 stars at discovery to 52 today.
AI-Generated Review

What is m-courtyard?

M-Courtyard is a desktop app for Apple Silicon Macs that handles end-to-end local AI model fine-tuning with zero code and no cloud dependency. Drop in .txt, .docx, or .pdf files; it cleans them, generates training datasets via Ollama or built-in rules, trains LoRA adapters using mlx-lm, lets you test the result in a built-in chat, and exports quantized models to Ollama. Built as a privacy-first Tauri + React app, it turns raw documents into deployable models entirely on your M-series Mac.
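Under the hood, that pipeline maps onto mlx-lm's LoRA tooling, which the app presumably wraps. A rough sketch of the equivalent command lines, built as argv lists (the flag names come from mlx-lm's CLI; the base model name, paths, and directory layout are placeholders, not the app's actual choices):

```python
# Sketch of the mlx-lm commands a GUI like this would automate.
# Model name and paths are illustrative placeholders.
train_cmd = [
    "python", "-m", "mlx_lm.lora",
    "--model", "mlx-community/Qwen2.5-3B-Instruct-4bit",  # placeholder base model
    "--train",
    "--data", "./data",              # expects train.jsonl / valid.jsonl here
    "--adapter-path", "./adapters",  # where the LoRA weights land
]

fuse_cmd = [
    "python", "-m", "mlx_lm.fuse",
    "--model", "mlx-community/Qwen2.5-3B-Instruct-4bit",
    "--adapter-path", "./adapters",
    "--save-path", "./fused-model",  # merged weights, ready to export
]
```

The app's one-click export would then hand the fused model off to Ollama for serving.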

Why is it gaining traction?

It skips the CLI headaches and script-juggling of using tools like mlx-lm directly, offering a guided four-step UI with live loss charts, crash recovery, and one-click exports, ideal for quick iterations without setup friction. Multi-source model discovery (Ollama, HuggingFace, ModelScope) and Quick/Standard/Thorough presets make experimentation fast, while sleep prevention and incremental saves keep long training runs reliable.
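Those Quick/Standard/Thorough presets presumably bundle training hyperparameters behind a single choice. A hypothetical TypeScript sketch of what that could look like in the app's (Tauri + React) frontend — the preset names come from the review, but every value below is invented for illustration, not the app's actual defaults:

```typescript
// Hypothetical training presets; all numbers are illustrative assumptions.
interface TrainingPreset {
  iters: number;        // training iterations
  loraRank: number;     // LoRA adapter rank
  learningRate: number;
}

const presets: Record<string, TrainingPreset> = {
  quick:    { iters: 200,  loraRank: 8,  learningRate: 1e-4 }, // fast sanity check
  standard: { iters: 600,  loraRank: 16, learningRate: 5e-5 }, // default run
  thorough: { iters: 2000, loraRank: 32, learningRate: 2e-5 }, // overnight-quality
};
```

The trade-off is the usual one: higher rank and more iterations fit the data better at the cost of time and memory.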

Who should use this?

Apple Silicon owners fine-tuning small-to-medium models (3B-14B) on personal documents: researchers building domain-specific Q&A bots, writers imitating a style, or ML-curious devs and non-experts who want to avoid cloud costs and data leaks.

Verdict

Grab the DMG if you're on an Apple Silicon Mac and want local fine-tuning: the docs are solid and the workflow shines, though the low star count signals early days. Test with toy datasets first; scale cautiously until wider adoption confirms stability.


