AmSach / voxelnav

Real-time semantic voxel mapping for ROS2 robots - 100ms latency on Jetson Nano

19 stars · 1 fork · 100% credibility · C++
Found May 08, 2026 at 19 stars

AI Summary

This repository is a prototype ROS2 package for creating real-time 3D voxel maps from LiDAR or RGB-D sensors with semantic labels and integration scaffolding for robot navigation costmaps.

How It Works

1. 🔍 Discover VoxelNav

You find this handy tool on GitHub that helps robots build a 3D map of their world, labeling things like floors, walls, and objects in real time.

2. Test it quickly

Run the standalone smoke test (`python3 verify_build.py`), which builds a mini version and confirms the mapping and labeling logic works, in seconds and without a full ROS2 setup.

3. 🛠️ Add to your robot setup

Place it in your robot's ROS2 workspace and build it once so everything is ready to go.

4. 🚀 Turn on the mapper

Launch it with one command (`ros2 launch voxelnav voxelnav.launch.py`), adjusting voxel sizes or labels if you like, and it starts listening for sensor data.

5. 📡 Feed in sensor views

Connect your robot's laser scans or depth-camera images, and it automatically turns them into a labeled 3D grid.

6. 👀 See your 3D map grow

Watch colorful point clouds appear showing the space divided into labeled blocks, updating live as your robot moves.

🏆 Smarter robot paths

Your robot now navigates confidently, avoiding obstacles by understanding what's floor, wall, or furniture in the map.
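The steps above boil down to one transformation: sensor points in, labeled voxel blocks out. A minimal Python sketch of that idea follows; the voxel size, labels, and function names are illustrative, not voxelnav's actual API:

```python
import math
from collections import defaultdict

VOXEL_SIZE = 0.1  # voxel edge length in metres (illustrative default)

def voxel_key(x, y, z, size=VOXEL_SIZE):
    """Map a 3D point to the integer index of the voxel that contains it."""
    return (math.floor(x / size), math.floor(y / size), math.floor(z / size))

def build_map(points):
    """points: iterable of (x, y, z, label) tuples from a labeled scan.
    Returns {voxel_key: majority label} for the observed voxels."""
    histograms = defaultdict(lambda: defaultdict(int))
    for x, y, z, label in points:
        histograms[voxel_key(x, y, z)][label] += 1
    # Each voxel reports the label seen most often among its points.
    return {k: max(hist, key=hist.get) for k, hist in histograms.items()}

# A toy "scan": three returns near the origin, one person further out.
scan = [(0.01, 0.02, 0.0, "floor"), (0.03, 0.05, 0.0, "floor"),
        (0.02, 0.04, 0.0, "wall"), (1.5, 0.0, 0.8, "person")]
voxels = build_map(scan)  # two occupied voxels: one "floor", one "person"
```

The majority vote is what makes the grid robust to the odd mislabeled point: three returns land in the origin voxel, and the two "floor" votes outweigh the single "wall" vote.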


AI-Generated Review

What is voxelnav?

Voxelnav processes real-time 3D sensor data like LiDAR point clouds and RGB-D images in ROS2, building voxel grids with semantic labels (floor, wall, person) and confidence scores for robot navigation. It delivers 100ms latency maps on Jetson Nano hardware, with adaptive voxel resolution that coarsens based on pose uncertainty to save compute. Launch via `ros2 launch voxelnav voxelnav.launch.py`, tweak params like `voxel_size` or `enable_semantic`, and get colored `/voxel_map` and `/semantic_voxel_map` topics out.
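The adaptive-resolution behavior described above, coarsening voxels as pose uncertainty grows, could look roughly like this; the function name, linear scaling, and clamp values are assumptions for illustration, not voxelnav's actual parameters:

```python
def adaptive_voxel_size(base_size, pose_sigma, max_scale=4.0):
    """Coarsen the voxel edge length as pose uncertainty grows.

    base_size:  finest voxel edge length in metres
    pose_sigma: pose standard deviation in metres
    Rationale: once localisation error exceeds the voxel size, fine voxels
    just smear measurements across neighbours, so coarsening saves compute
    without losing usable detail.
    """
    # Scale linearly with uncertainty, clamped between 1x and max_scale x.
    scale = min(max(pose_sigma / base_size, 1.0), max_scale)
    return base_size * scale

print(adaptive_voxel_size(0.05, 0.01))  # confident pose: stays at the fine 5 cm grid
print(adaptive_voxel_size(0.05, 1.00))  # very uncertain pose: clamped to 20 cm voxels
```

The clamp matters on both ends: never finer than the base resolution the hardware budget was tuned for, and never so coarse that obstacles disappear into a single giant block.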

Why is it gaining traction?

Stands out with a standalone `verify_build.py` smoke test that compiles and runs the core logic, no ROS2 setup needed, proving the real-time C++ voxel hashing and ONNX semantic segmentation work instantly. Hooks developers with uncertainty-aware resolution scaling, stale-voxel pruning, per-voxel label histograms, and a Nav2 costmap plugin scaffold for quick integration. Focuses on real-time semantic segmentation and mapping for autonomous navigation without overpromising production readiness.
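A hedged sketch of how voxel hashing, per-voxel label histograms, and stale-voxel pruning can fit together; the class names and the five-second stale window are invented for this example and are not voxelnav's API:

```python
class VoxelEntry:
    """One hashed voxel: a per-label observation histogram plus a last-seen stamp."""
    def __init__(self):
        self.histogram = {}   # label -> observation count
        self.last_seen = 0.0  # timestamp (seconds) of the latest observation

class SparseVoxelMap:
    """Hash map from integer voxel indices to entries; sparse, so empty space costs nothing."""
    def __init__(self, stale_after=5.0):  # stale window in seconds (assumed value)
        self.voxels = {}  # (ix, iy, iz) -> VoxelEntry
        self.stale_after = stale_after

    def observe(self, key, label, stamp):
        entry = self.voxels.setdefault(key, VoxelEntry())
        entry.histogram[label] = entry.histogram.get(label, 0) + 1
        entry.last_seen = stamp

    def prune(self, now):
        """Drop voxels not re-observed within the stale window (e.g. a person who walked away)."""
        stale = [k for k, e in self.voxels.items() if now - e.last_seen > self.stale_after]
        for k in stale:
            del self.voxels[k]
        return len(stale)

m = SparseVoxelMap(stale_after=5.0)
m.observe((0, 0, 0), "wall", stamp=0.0)    # static structure, seen early...
m.observe((1, 0, 0), "person", stamp=0.0)  # ...and a person passing through
m.observe((0, 0, 0), "wall", stamp=6.0)    # the wall keeps being re-observed
removed = m.prune(now=6.0)                 # the person voxel went stale
```

Pruning by last-seen time is what keeps dynamic obstacles from leaving permanent ghosts in the map, while the histogram lets a voxel's label stabilise as evidence accumulates.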

Who should use this?

ROS2 navigation engineers prototyping semantic costmaps for indoor/outdoor robots, especially those fusing LiDAR with cameras for real-time detection of obstacles like people or furniture. Also Nav2 users experimenting with voxel layers over traditional occupancy grids, and researchers benchmarking real-time semantic segmentation on edge devices.

Verdict

Grab it for demos or research—solid prototype with clear docs and tests—but skip for production until field-validated; 19 stars and 1.0% credibility score signal early days. Run `python3 verify_build.py` first to confirm it fits your ROS2 stack.
