
allenai / molmospaces


An end-to-end open ecosystem for robot learning

145 stars · 3 forks · 100% credibility
Found Feb 12, 2026 at 64 stars.
AI Summary (Python)

MolmoSpaces is an open collection of simulated homes, objects, robots, and robot challenges designed for training AI policies to handle navigation and manipulation tasks.

How It Works

1. 🌟 Discover robot playgrounds

You find a huge collection of realistic homes, kitchens, and objects perfect for teaching robots everyday skills like picking apples or opening doors.

2. 🔧 Set up your workshop

With a few simple steps, you create a special space on your computer where everything gets ready automatically.

3. 📥 Bring in the worlds

Realistic rooms, furniture, robots, and gadgets download and organize themselves neatly in your space.

4. 👀 Peek into a home

Open a viewer to explore a cozy kitchen or living room, seeing every detail just like real life.

5. 🤖 Watch robots practice

Run tests where robots try picking objects, navigating rooms, or opening cabinets, recording their moves.

6. 📊 Check their skills

See scores on how well they succeed at tasks, like successfully grabbing a mug or finding a chair.

7. 🎉 Build smarter robots

Use these practice videos and worlds to train your own robot helpers that master household chores effortlessly.
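The evaluation loop in steps 5 and 6 boils down to recording per-episode outcomes and aggregating them into a per-task success rate. Here is a minimal, library-free sketch of that idea; the task names and the `success_rates` helper are illustrative, not the MolmoSpaces API:

```python
from collections import defaultdict

def success_rates(episodes):
    """Aggregate recorded episodes into a per-task success rate.

    Each episode is a (task_name, succeeded) pair, e.g. the result of
    one robot attempt at grabbing a mug or opening a cabinet.
    """
    totals = defaultdict(lambda: [0, 0])  # task -> [successes, attempts]
    for task, ok in episodes:
        totals[task][0] += int(ok)
        totals[task][1] += 1
    return {task: s / n for task, (s, n) in totals.items()}

# Hypothetical recorded results from one benchmark run.
episodes = [
    ("pick_mug", True), ("pick_mug", False), ("pick_mug", True),
    ("open_cabinet", True), ("find_chair", False),
]
rates = success_rates(episodes)
```

Real benchmark runners track richer signals (steps taken, trajectories, sensor logs), but a success-rate table like this is the headline score the steps above describe.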


AI-Generated Review

What is molmospaces?

MolmoSpaces delivers a massive open dataset of robot-ready assets—1.1k hand-crafted kitchen objects, 130k Objaverse models, procedurally generated scenes from iTHOR/ProcTHOR/Holodeck, plus robots like Franka and RBY1—for manipulation and navigation tasks. Built in Python, it auto-downloads and versions everything via an asset manager, making scenes plug-and-play in MuJoCo, Isaac Sim, or ManiSkill. Developers get an end-to-end ecosystem for robot learning, from grasp generation and teleop via iPhone app to benchmarks for pick/open/close tasks.
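The "auto-downloads and versions everything" behavior of an asset manager can be approximated with a content-addressed cache keyed on checksums: assets pinned by hash are re-fetched only when missing or stale. The sketch below is a standalone stdlib illustration of that pattern, under an invented manifest format — it is not MolmoSpaces' actual manager or schema:

```python
import hashlib
import pathlib
import tempfile

def sync_assets(manifest, cache_dir, fetch):
    """Ensure every asset in `manifest` ({name: sha256 hex}) exists in
    `cache_dir`; fetch, verify, and store anything missing or stale."""
    cache = pathlib.Path(cache_dir)
    cache.mkdir(parents=True, exist_ok=True)
    fetched = []
    for name, want in manifest.items():
        path = cache / name
        if path.exists() and hashlib.sha256(path.read_bytes()).hexdigest() == want:
            continue  # already cached at the pinned version
        data = fetch(name)  # stand-in for an HTTP download
        if hashlib.sha256(data).hexdigest() != want:
            raise ValueError(f"checksum mismatch for {name}")
        path.write_bytes(data)
        fetched.append(name)
    return fetched

# Demo with an in-memory "download" source (hypothetical asset names).
assets = {"kitchen_scene.json": b'{"objects": ["mug", "apple"]}'}
manifest = {n: hashlib.sha256(d).hexdigest() for n, d in assets.items()}
with tempfile.TemporaryDirectory() as d:
    first = sync_assets(manifest, d, assets.__getitem__)   # downloads
    second = sync_assets(manifest, d, assets.__getitem__)  # cache hit
```

Pinning by checksum rather than by name is what makes scenes reproducible across machines and simulator backends.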

Why is it gaining traction?

Unlike fragmented sim datasets, it unifies assets across simulators with one install command (`uv pip install -e .[dev,grasp]`), JSON benchmarks, and data-generation pipelines for custom episodes. The hook? Scripted planners produce high-success demos fast, while phone teleop and eval runners let you benchmark VLMs without rebuilding from scratch.
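Declaring benchmarks as JSON means a task is data, not code: a runner loads the spec and executes it against any supported simulator. The field names below are invented for illustration — they are not MolmoSpaces' actual benchmark schema:

```python
import json

# Hypothetical benchmark entry; every key here is illustrative only.
benchmark_json = """
{
  "task": "pick",
  "scene": "proc_kitchen_0042",
  "robot": "franka",
  "target_object": "mug",
  "success": {"metric": "grasp_and_lift", "min_height_m": 0.05},
  "max_steps": 500
}
"""

spec = json.loads(benchmark_json)
# A runner would now build spec["scene"], spawn spec["robot"], and roll
# out a policy until success or spec["max_steps"] is reached.
```

Keeping success criteria in the spec (rather than hard-coded in the runner) is what lets the same eval harness score scripted planners, teleop demos, and VLM policies alike.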

Who should use this?

Robot researchers training VLMs on manipulation (pick/place/open doors) or navigation; sim devs needing diverse, articulated scenes without manual asset hunting; teams prototyping policies in MuJoCo before Isaac hardware transfer.

Verdict

Grab it for robot sim baselines—AllenAI's polish shines through in solid docs and Apache 2.0 licensing—but the project is still early-stage; expect bugs in edge cases until it gets more community testing.


