📚 Module Reference
Browse our complete library of 57 documentation modules. Each module is a reusable piece of documentation that can be combined into procedures and custom guides.
Build the `obj_detect` Package
Build the `obj_detect` package in your ROS 2 workspace so you can run the object detection node.
Compile and flash the Teensy firmware
Build and upload the microcontroller firmware for closed‑loop control.
Compile the YOLOv11n Model to a Hailo HEF
Use the Hailo Model Zoo compiler to convert your fine‑tuned ONNX model into a Hailo Executable File (.hef) for deployment on the Raspberry Pi AI Kit.
Configure network settings on the Raspberry Pi
Define a static IP address and Wi‑Fi credentials using Netplan.
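A minimal sketch of what such a Netplan file might look like; the interface name, filename, addresses and Wi‑Fi credentials below are placeholders, not values from this platform's docs:

```yaml
# /etc/netplan/50-robot.yaml (hypothetical filename)
network:
  version: 2
  wifis:
    wlan0:
      dhcp4: false
      addresses: [192.168.1.50/24]   # static IP for the robot
      routes:
        - to: default
          via: 192.168.1.1           # your router
      nameservers:
        addresses: [192.168.1.1]
      access-points:
        "YourSSID":
          password: "YourPassword"
```

Apply the configuration with `sudo netplan apply`.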
Configure Raspberry Pi host settings
Set the hostname and ensure hostname preservation on reboot.
Configure the ROS 2 Discovery Server
Start a Fast DDS discovery server on the host, configure the micro‑ROS agent to connect to it, and verify discovery connectivity.
Connect to the Raspberry Pi
Discover the robot’s IP address and connect via SSH.
Convert Label Studio Annotations to YOLO Format
Convert exported Label Studio JSON annotations into YOLO‑style text files and create train/validation splits.
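The coordinate math behind this conversion can be sketched in a few lines. Label Studio exports rectangle boxes as top‑left x/y plus width/height, all as percentages of the image, while YOLO expects normalized center coordinates; the helper name below is illustrative, not from the module itself:

```python
def ls_box_to_yolo(x, y, width, height):
    """Convert a Label Studio percent-based box (top-left x/y, width, height,
    all in the range 0-100) to a normalized YOLO (x_center, y_center, w, h) tuple."""
    return (
        (x + width / 2) / 100.0,   # box center x, normalized to 0-1
        (y + height / 2) / 100.0,  # box center y, normalized to 0-1
        width / 100.0,
        height / 100.0,
    )

# A box covering the central half of the image in both dimensions:
print(ls_box_to_yolo(25, 25, 50, 50))  # (0.5, 0.5, 0.5, 0.5)
```

Each YOLO label line is then `<class_id> <x_center> <y_center> <w> <h>` in a `.txt` file named after the image.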
Create Hailo YOLO Configuration Files
Define the Hailo compiler configuration, post‑processing script and NMS settings required to compile your YOLOv11n model.
Deploy the Compiled HEF to the Raspberry Pi
Copy the compiled Hailo Executable File (.hef) to your Raspberry Pi and verify it on the Hailo‑8 accelerator.
Diagnose and calibrate the LiDAR
Perform hardware checks, run the standalone node and control the motor.
Diagnose the camera
Verify USB connectivity, test the generic usb_cam node and inspect hardware.
Edit the `data.yaml` File
Update the generated `data.yaml` to reflect the correct class names for your dataset.
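For reference, an Ultralytics-style `data.yaml` has this general shape; the paths and class names here are placeholders for your own dataset:

```yaml
# data.yaml -- paths and class names are illustrative
path: /home/user/datasets/robot   # dataset root
train: images/train               # relative to path
val: images/val
names:
  0: person
  1: obstacle
```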
Emergency Shutdown
Safely stop the robot in emergency situations by halting software and cutting power if necessary.
Export the Fine‑Tuned Model to ONNX
Convert the best YOLOv11n weights into an ONNX file for Hailo compilation and copy it to another machine if needed.
Fine‑Tune the YOLOv11n Model
Train a YOLOv11n model on your custom dataset using Ultralytics. This step performs the actual fine‑tuning and requires significant compute resources.
Hailo Environment and Compilation Troubleshooting
Resolve common issues when installing the Hailo SDK and compiling models.
Hardware Assembly Guide
Reference resources for assembling the Common Robotics Platform hardware and locating the interactive bill of materials (BOM).
Image Data Annotation with Label Studio
Set up the teleoperation image directory, configure environment variables and launch Label Studio to annotate your teleop image data, then copy the annotations.
Launch keyboard teleoperation node
Use your keyboard to manually control the robot via evdev_teleop.
Launch robot state publisher and Cartographer
Start the robot state publisher, Cartographer SLAM and occupancy grid nodes.
Launch the camera node
Start the RGB camera driver and publish image data.
Launch the LiDAR node
Start the RPLIDAR driver and publish LaserScan data.
Launch the Object Detection Node
Use the provided launch file to start the object detection node with default parameters. This node runs a YOLOv11n model on your Hailo hardware and publishes detection results.
Launch the Object Detection Node with Custom Parameters
Override the default parameters when launching the object detection node to use a different model file or adjust inference settings.
Monitor SLAM constraint topics
Inspect constraint list and trajectory topics to diagnose SLAM performance.
Monitor Teensy debug output
Connect to the Teensy serial port to view debug information.
Prepare Calibration Images for Hailo
Copy a set of unlabeled images to your Hailo development machine for use during quantization calibration.
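Selecting a random subset of images for calibration can be sketched as below; the function name, extensions and the count of 64 are illustrative defaults, not Hailo requirements:

```python
import random
import shutil
from pathlib import Path

def copy_calibration_subset(src_dir, dst_dir, count=64, seed=0):
    """Copy a random sample of images from src_dir into dst_dir.

    Quantization calibration needs a representative sample of unlabeled
    images; the seed makes the selection reproducible.
    """
    src, dst = Path(src_dir), Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    images = sorted(
        p for p in src.iterdir()
        if p.suffix.lower() in {".jpg", ".jpeg", ".png"}
    )
    random.Random(seed).shuffle(images)
    chosen = images[:count]
    for p in chosen:
        shutil.copy2(p, dst / p.name)
    return chosen
```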
Prepare Cartographer launch file
Copy the cartographer_simple.launch.py file into the cartographer_ros package and rebuild.
Prepare the Fine‑Tuning Environment
Set up a Python virtual environment, copy necessary scripts and data, and install Ultralytics to fine‑tune the YOLOv11n model.
Prune the Dataset
Remove images without bounding boxes from the dataset to ensure training only uses labeled data.
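The pruning step amounts to dropping any image whose label file is missing or empty. A minimal sketch, assuming the usual YOLO layout of parallel `images/` and `labels/` directories with matching file stems:

```python
from pathlib import Path

def prune_unlabeled(images_dir, labels_dir, dry_run=True):
    """Return (and optionally delete) images with no non-empty YOLO label.

    With dry_run=True nothing is deleted, so you can review the list first.
    """
    removed = []
    for img in sorted(Path(images_dir).iterdir()):
        if img.suffix.lower() not in {".jpg", ".jpeg", ".png"}:
            continue
        label = Path(labels_dir) / (img.stem + ".txt")
        if not label.exists() or label.stat().st_size == 0:
            removed.append(img)
            if not dry_run:
                img.unlink()
    return removed
```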
Pull the latest common_platform repository updates
Update your local clone of the common_platform repository before building or running any code.
Reboot Procedure
Restart the Raspberry Pi to apply configuration changes or reset the system.
Record images to RAM disk
Build the data_recorder package and record still images.
Record video from the camera
Use the image_view utility to record a video file from the camera topic.
Robot Startup Procedure
Launch the robot’s core systems and verify that topics and nodes are running correctly.
Routine Maintenance
Perform regular checks on sensors, motors and software to ensure reliable operation of the robot.
Run the Object Detection Node Directly
Execute the object detection node directly using `ros2 run` when you want to launch the node without a launch file. You will need to configure parameters manually.
RViz Sensor Visualization
Launch RViz 2 to visualize sensor data and configure displays for LiDAR and camera streams.
Save and convert the SLAM map
Finish the current Cartographer trajectory, write the state to disk and convert to a ROS map.
Set ROS 2 name and namespace
Configure the robot’s ROS node namespace to avoid topic collisions.
Set Up the Hailo Development Environment
Install dependencies and the Hailo toolchain in a clean Python environment on an Ubuntu x86_64 machine.
Shutdown Procedure
Properly shut down the robot by stopping all running ROS processes and powering off the Raspberry Pi.
Start autonomous navigation and send a goal
Launch the Nav2 stack, verify planner topics, send a goal and inspect planning outputs.
Start localization using AMCL
Launch the localization stack to estimate the robot’s pose in a known map.
Start the micro‑ROS agent
Run the micro‑ROS serial agent in a Docker container to bridge the Teensy with ROS 2.
Tune Cartographer parameters
Adjust SLAM parameters such as resolution and update rate to improve mapping quality.
Use a Bluetooth Gamepad for Teleoperation
Install the ROS 2 joy and teleop packages, create a custom package and launch file to read a Bluetooth gamepad, build the workspace and verify that gamepad input reaches the robot.
Verify camera topics
Check that the camera node is streaming images and at the expected frame rate.
Verify LiDAR topics
Check that the LiDAR node is publishing data on the expected topics.
Verify Object Detection Results
Check that the object detection node is running and publishing detection messages, and ensure that the camera input is available.
Verify SLAM topics
Check that Cartographer publishes the expected topics and visualize the TF tree.
Verify Ultralytics Installation
Run a quick training on a toy dataset to confirm that the Ultralytics YOLO toolkit is installed correctly.
View the live camera feed
Use rqt_image_view to display images from the camera topic.
View the robot SLAM visualization
Launch a combined RViz session to inspect the live map and robot state.
Virtual Machine Setup
Prepare a development virtual machine with Ubuntu, ROS 2 and VS Code for coding and testing without physical hardware.
Visualize camera transforms
Use TF tools to inspect the camera’s position relative to the robot base.