Kinematics Lab
Reference apps that run on Kinematics Mini, Max, or any NVIDIA Jetson. Inspired by the Jetson AI Lab community. Source code is included for every demo.
Predefined teleop dashboard with dual cameras, 3D model, map, and controls.
Drag-and-drop dashboard — every CockPit widget, your layout.
Multi-waypoint autonomous patrols with a state-machine orchestrator.
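The orchestration pattern behind the patrol demo can be sketched as a small state machine that cycles through waypoints; the class and state names below are illustrative, not the demo's actual API, and the real app would dispatch navigation goals where noted.

```python
from enum import Enum, auto

class State(Enum):
    GO_TO_WAYPOINT = auto()
    AT_WAYPOINT = auto()
    DONE = auto()

class PatrolOrchestrator:
    """Toy state machine that visits each waypoint for a fixed number of loops."""

    def __init__(self, waypoints, loops=1):
        self.waypoints = waypoints
        self.loops = loops
        self.i = 0          # index of the current waypoint
        self.loop = 0       # completed patrol loops
        self.state = State.GO_TO_WAYPOINT
        self.visited = []

    def step(self):
        if self.state == State.GO_TO_WAYPOINT:
            # In the real demo this would send a navigation goal and
            # wait for the result; here we just record the visit.
            self.visited.append(self.waypoints[self.i])
            self.state = State.AT_WAYPOINT
        elif self.state == State.AT_WAYPOINT:
            self.i += 1
            if self.i == len(self.waypoints):
                self.i = 0
                self.loop += 1
            self.state = State.DONE if self.loop == self.loops else State.GO_TO_WAYPOINT
        return self.state
```

Keeping the mission logic in an explicit state machine makes each transition inspectable, which is what lets a dashboard display "where the patrol is" at any moment.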
System monitor — CPU/GPU/RAM, thermal, power rails, ROS node status.
Themes, accent colors, gamepad bindings, speed profiles per operator.
Coordinate multiple robots from one dashboard — status, missions, formation.
Language-conditioned manipulation with NVIDIA GR00T N1.5.
Open-source VLA model for manipulation — no proprietary checkpoints.
Robust manipulation via denoising diffusion over action sequences.
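Conceptually, a diffusion policy refines a noisy action sequence by repeatedly stepping it along a learned score; this toy loop substitutes a hand-written score function for the trained network, so it shows only the shape of the denoising iteration.

```python
def denoise_actions(noisy_actions, score_fn, steps=50, step_size=0.2):
    """Reverse-diffusion-style refinement: nudge each action along the score."""
    x = list(noisy_actions)
    for _ in range(steps):
        x = [a + step_size * score_fn(i, a) for i, a in enumerate(x)]
    return x

# Stand-in score pointing toward a known "clean" trajectory; a trained
# denoising network would predict this direction from data instead.
clean = [0.0, 0.1, 0.2, 0.3]
score = lambda i, a: clean[i] - a
```

The robustness claim comes from predicting whole action sequences at once rather than single steps, which keeps the refined trajectory temporally consistent.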
Hugging Face LeRobot framework — train and deploy with one config.
TensorRT INT8 detection on dual RGBD streams with 3D position.
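Given a detection and the aligned depth value at its center pixel, the 3D position follows from standard pinhole back-projection; the intrinsics in the usage below (fx, fy, cx, cy) are example values, not the demo's calibration.

```python
def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with metric depth into a camera-frame 3D point."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A detection centered on the principal point at 2 m lies on the optical axis.
point = deproject(320, 240, 2.0, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
```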
Isaac-ROS volumetric mapping with TSDF + ESDF for navigation.
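At the core of TSDF fusion is a per-voxel weighted running average of truncated signed distances; this sketch shows that update rule in isolation, not Isaac-ROS's actual implementation or data layout.

```python
def tsdf_update(voxel_tsdf, voxel_weight, sdf, trunc=0.1, max_weight=64.0):
    """Fuse one signed-distance observation into a voxel's running average.

    sdf is the measured distance from the voxel to the observed surface (m);
    it is clamped to +/- trunc and normalized to [-1, 1] before averaging.
    """
    d = max(-trunc, min(trunc, sdf)) / trunc
    new_tsdf = (voxel_tsdf * voxel_weight + d) / (voxel_weight + 1.0)
    new_weight = min(voxel_weight + 1.0, max_weight)
    return new_tsdf, new_weight
```

Capping the weight keeps the map responsive to change: old observations decay instead of pinning the voxel forever, which matters for navigating environments that move.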
Remember-and-reason navigation across long mission horizons.
Natural-language robot control via on-device LLM + ROS2.
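Downstream of the LLM, a demo like this ultimately has to turn a command into a velocity setpoint for ROS2. The keyword table below is a hypothetical stand-in for the LLM's structured output, and the function names are illustrative only.

```python
def command_to_twist(text):
    """Map a natural-language command to (linear m/s, angular rad/s).

    A real deployment would get this structure from the on-device LLM;
    this keyword lookup is a toy placeholder for that step.
    """
    table = {
        "forward": (0.2, 0.0),
        "back": (-0.2, 0.0),
        "left": (0.0, 0.5),
        "right": (0.0, -0.5),
        "stop": (0.0, 0.0),
    }
    lowered = text.lower()
    for keyword, twist in table.items():
        if keyword in lowered:
            return twist
    return (0.0, 0.0)  # unknown commands default to a safe stop
```

Defaulting unrecognized input to zero velocity is the safety-relevant design choice here: the robot should never move on a command it did not understand.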
Real-time stereo + IMU SLAM for indoor and drone navigation.
Pull a foundation walking policy, fine-tune in sim, deploy to G1.
Unitree Go2 with RTABMap SLAM, Nav2, and live RGBD streaming.
Cluttered-bin manipulation with a force/torque (F/T) wrist sensor on Max.
Neural Radiance Fields for environment capture, optimized for Jetson.
Vision-language reasoning over a simulated robotics scene.
OpenBrain is MIT-licensed. Open a PR with your demo and we'll feature it here.
Contribute on GitHub