Tutorial Overview
🎯 Learning Objectives
By the end of this tutorial, you will:
✅ Visualize LiDAR scans in real-time with RViz
✅ View camera images and understand image topics
✅ Monitor IMU data and interpret motion
✅ Track wheel odometry and observe drift
✅ Use rqt tools for multi-sensor visualization
✅ Record sensor data for offline analysis
✅ Understand coordinate frames and sensor fusion basics
✅ Identify sensor issues through visualization
⏱️ Time Required
Reading & Setup: 15 minutes
LiDAR Visualization: 20 minutes
Camera & IMU Visualization: 20 minutes
Odometry Analysis: 15 minutes
Multi-Sensor Integration: 20 minutes
📚 Prerequisites
✅ Completed ROS2 Communication & Tools
✅ System setup complete (laptop ↔ robot communication)
✅ Robot powered, connected to WiFi
✅ Can see robot's topics from laptop
✅ RViz installed on laptop
🛠️ What You'll Need
✅ Beetlebot (powered, sensors active)
✅ Open space (3m × 3m minimum)
✅ Various objects/obstacles for testing
✅ Measuring tape (optional, for validation)
Part 1: Understanding the Sensor Suite
Sensors on Beetlebot
Your robot has 5 sensor systems:
RPLidar C1 - 360° laser scanner
Range: 12m max, 8-10m typical indoors
Raspberry Pi Camera V1.3 - Visual perception
Native resolution: 5 MP (2592×1944); typically streamed at 1080p or 720p
Topics: /pi_camera/image_raw, /pi_camera/camera_info
LSM6DSRTR IMU - Motion sensing
6-axis: 3-axis accelerometer + 3-axis gyroscope
Topics: /imu/data, /imu/data_raw
Wheel Encoders (4×) - Position feedback
Resolution: 3600 ticks/wheel revolution (900 CPR × 4 with quadrature decoding)
Topics: /wheel_rpm, /wheel_ticks
Battery Monitor - Power management
Voltage monitoring via ADC
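The encoder resolution sets how finely wheel travel can be measured. A sketch of the ticks-to-distance conversion (the 3600 ticks/rev figure is from above; the 65 mm wheel diameter is an assumed value, so substitute your robot's):

```python
# Illustrative only: convert encoder ticks to distance travelled.
# 3600 ticks/rev = 900 CPR x 4 (quadrature decoding).
# The 65 mm wheel diameter is an ASSUMED value, not from a datasheet.
import math

TICKS_PER_REV = 900 * 4          # four countable edges per encoder cycle
WHEEL_DIAMETER_M = 0.065         # assumption - use your robot's wheel size

def ticks_to_meters(ticks: int) -> float:
    """Distance rolled by one wheel for a given tick count."""
    circumference = math.pi * WHEEL_DIAMETER_M
    return ticks * circumference / TICKS_PER_REV

print(round(ticks_to_meters(3600), 4))  # one full revolution -> 0.2042 m
```

At this resolution each tick corresponds to well under a tenth of a millimetre of wheel travel, which is why encoder quantization only shows up as tiny jumps in /odom.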
Sensor Coordinate Frames
Every sensor has a position and orientation relative to the robot:
[PLACEHOLDER: Diagram showing sensor positions on robot]
Frame hierarchy:
Why frames matter:
Sensor data is reported in the sensor's own frame
It must be transformed into a common frame for fusion
RViz needs correct frames to display data properly
Part 2: LiDAR Visualization
Launch Robot with LiDAR
On robot (via SSH):
Verify LiDAR data:
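A sketch of the commands involved; the package and launch file names are assumptions, so adjust them to your robot's bringup package:

```shell
# On the robot (assumed launch file name - adjust to your bringup package):
ros2 launch beetlebot_bringup robot.launch.py

# On the laptop, verify LiDAR data is flowing:
ros2 topic hz /scan            # expect a steady rate
ros2 topic echo /scan --once   # print a single LaserScan message
```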
Visualize LiDAR in RViz
On laptop:
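RViz can be started from any terminal with your ROS 2 environment sourced:

```shell
# Start RViz on the laptop:
rviz2
```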
Configure RViz for LiDAR:
Set Fixed Frame:
Left panel: Global Options
Fixed Frame: Change "map" → "base_link"
Add LaserScan Display:
By topic → /scan → LaserScan
Configure LaserScan Display:
Expand LaserScan in left panel
Size (m): 0.05 (point size)
Style: Flat Squares (easier to see)
Color Transformer: Intensity or FlatColor
If Intensity: Color scheme = Rainbow
[PLACEHOLDER: Screenshot of RViz showing LiDAR scan]
What you see:
A ring of points marking detected obstacles around the robot
Point color depends on the Color Transformer you chose; with Intensity + Rainbow it reflects return strength, not distance
Robot center at the origin (coordinate axes)
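Under the hood, RViz places each beam with simple trigonometry: range r at beam index i lands at angle angle_min + i × angle_increment in the sensor frame. A sketch with synthetic numbers (not a real scan):

```python
# How a LaserScan becomes 2D points: each range r at beam index i
# lies at angle angle_min + i * angle_increment in the sensor frame.
# Synthetic example data - NOT from a real scan.
import math

angle_min = -math.pi                 # typical for a 360-degree scanner
angle_increment = math.radians(1.0)  # made-up beam spacing
ranges = [1.0, 2.0, 0.5]             # made-up readings for three beams

points = []
for i, r in enumerate(ranges):
    theta = angle_min + i * angle_increment
    points.append((r * math.cos(theta), r * math.sin(theta)))

print(points[0])  # beam 0 points straight back along -x: (-1.0, ~0.0)
```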
Exercise 4.1: LiDAR Exploration
Task: Understand LiDAR capabilities
Step 1: Range Test
Step 2: Resolution Test
Step 3: Material Test
What you learned:
Angular resolution (~0.5°)
Material reflectivity affects detection
Advanced LiDAR Visualization
Add more information to RViz:
1. Robot Model:
2. TF Frames:
3. LaserScan with Decay:
[PLACEHOLDER: Screenshot showing robot model + LiDAR + TF frames]
Exercise 4.2: Obstacle Detection
Task: Map your surroundings
Setup:
Action:
Observe:
Walls appear as solid lines
Furniture shows as distinct blobs
Moving objects (people) may appear inconsistent
Analysis Questions:
Can you identify room shape from scans?
Are corners clearly defined?
Any areas with no returns? (windows, mirrors)
Part 3: Camera Visualization
Camera Topics Overview
Available topics:
Topic explanations:
/pi_camera/image_raw (sensor_msgs/Image)
Large bandwidth (~30 MB/s at 30fps)
/pi_camera/image_raw/compressed (sensor_msgs/CompressedImage)
Smaller bandwidth (~3 MB/s)
Better for remote viewing
/pi_camera/camera_info (sensor_msgs/CameraInfo)
Lens distortion parameters
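The raw-versus-compressed difference is easy to sanity-check with arithmetic. The 640×480 streaming resolution below is an assumption, chosen because it reproduces the ~30 MB/s figure quoted above; scale the numbers to your actual settings:

```python
# Back-of-envelope bandwidth for an uncompressed RGB image stream.
# 640x480 @ 30 fps is an ASSUMED configuration - adjust to yours.
width, height, bytes_per_pixel, fps = 640, 480, 3, 30

raw_mb_per_s = width * height * bytes_per_pixel * fps / 1e6
print(round(raw_mb_per_s, 1))  # -> 27.6, roughly the quoted ~30 MB/s
```

Compression typically cuts this by an order of magnitude, which is why the compressed topic is the right choice over WiFi.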
View Camera Feed
Method 1: rqt_image_view (Simple)
[PLACEHOLDER: Screenshot of camera view in rqt_image_view]
Method 2: RViz (Integrated)
Benefits of RViz method:
See camera + LiDAR simultaneously
Overlay detection results
Exercise 4.3: Camera Exploration
Task: Understand camera capabilities
Test 1: Field of View
Test 2: Focus Range
Test 3: Lighting Conditions
Check image publishing rate:
Bandwidth usage:
Use compressed for remote viewing!
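The rate and bandwidth checks above can be run with the standard topic tools:

```shell
# Measure publish rate and bandwidth from the laptop:
ros2 topic hz /pi_camera/image_raw
ros2 topic bw /pi_camera/image_raw
ros2 topic bw /pi_camera/image_raw/compressed   # compare - much smaller
```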
Part 4: IMU Data Visualization
Available topics:
Difference:
/imu/data_raw: Direct from sensor (noisy)
/imu/data: After filtering and calibration (smoother)
Message contents:
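The full field layout can be inspected directly, along with a live sample:

```shell
# Print the sensor_msgs/Imu message definition:
ros2 interface show sensor_msgs/msg/Imu

# Dump one live message to see actual values:
ros2 topic echo /imu/data --once
```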
Visualize IMU in RViz
Add IMU display:
What you see:
Arrow shows acceleration vector
Color indicates magnitude
[PLACEHOLDER: Screenshot of IMU visualization in RViz]
Exercise 4.4: IMU Experimentation
Task: Understand IMU measurements
Test 1: Static Robot
Test 2: Forward Acceleration
Test 3: Rotation
Plotting IMU Data with rqt_plot
Real-time graphing:
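One way to launch rqt_plot with useful IMU fields preselected (field paths follow the sensor_msgs/Imu layout):

```shell
# Plot forward acceleration and yaw rate in real time:
ros2 run rqt_plot rqt_plot \
  /imu/data/linear_acceleration/x \
  /imu/data/angular_velocity/z
```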
[PLACEHOLDER: Screenshot of rqt_plot showing IMU data]
Try these patterns:
Forward → Stop (see accel spike)
Turn left → Turn right (see angular velocity)
Figure-8 pattern (see both signals)
Understanding IMU Noise
Compare raw vs filtered:
Why filter?
Raw sensor has electrical noise
Filtering trade-off:
Smoother signal (good for control)
Slight delay (bad for fast response)
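The trade-off can be demonstrated with a toy first-order low-pass filter (an exponential moving average) on synthetic data. This is an illustration only, not the robot's actual filter:

```python
# Smoothing-vs-lag trade-off with a first-order low-pass filter.
# Synthetic data: a step input corrupted by Gaussian noise.
import random

random.seed(0)
true_accel = [0.0] * 20 + [1.0] * 20             # step at sample 20
raw = [a + random.gauss(0, 0.2) for a in true_accel]

alpha = 0.2                                       # smaller = smoother, laggier
filtered = []
y = raw[0]
for x in raw:
    y = alpha * x + (1 - alpha) * y               # y[k] = a*x[k] + (1-a)*y[k-1]
    filtered.append(y)

# The filtered signal is visibly smoother, but it reaches the step
# value several samples after the raw signal does - that is the lag.
print(raw[21], filtered[21])
```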
Next tutorial (IMU Signal Processing) covers filtering in depth!
Part 5: Wheel Odometry Visualization
Odometry Topics
Available topics:
Key difference:
/odom: Based only on wheel encoders
/odometry/filtered: Wheel encoders + IMU fusion (better!)
Visualize Odometry
Add to RViz:
Configuration:
What you see:
Green arrow = robot's pose (position + orientation)
Ellipse = uncertainty covariance
[PLACEHOLDER: Screenshot of odometry trail in RViz]
Exercise 4.5: Odometry Drift Observation
Task: Measure odometry accuracy
Test 1: Straight Line
Test 2: Square Path
Test 3: Rotation
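Drift from the square-path test can be quantified by comparing the reported end pose with the start pose: after a closed loop they should match, and any gap is accumulated drift. The pose values below are made-up examples:

```python
# Quantify odometry drift after driving a closed square path.
# The poses below are MADE-UP example values - read yours from /odom.
import math

start = (0.0, 0.0)            # (x, y) reported at the start
end = (0.12, -0.08)           # reported pose after returning to start

drift = math.hypot(end[0] - start[0], end[1] - start[1])
path_length = 4 * 2.0         # four 2 m sides (assumed square size)

print(f"drift: {drift:.3f} m ({100 * drift / path_length:.1f}% of path)")
# -> drift: 0.144 m (1.8% of path)
```

Reporting drift as a percentage of path length makes runs of different sizes comparable.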
Comparing Wheel vs Fused Odometry
Plot both simultaneously:
You will typically observe:
/odometry/filtered is smoother (IMU helps)
/odom may have small jumps (encoder quantization)
Both drift over time, but fused drifts less
Part 6: Multi-Sensor Visualization
The Complete Perception Picture
Configure RViz with all sensors:
Displays to add:
✅ RobotModel (/robot_description)
✅ Camera (/pi_camera/image_raw)
✅ Odometry (/odometry/filtered)
Save configuration:
[PLACEHOLDER: Screenshot of RViz with all sensors displayed]
Exercise 4.6: Multi-Sensor Correlation
Task: See how sensors relate during motion
Setup:
Test Scenario:
Questions to answer:
Do all sensors agree on motion?
Which sensor updates fastest? (IMU: 100Hz)
Which is most accurate for distance? (LiDAR)
Which gives most information? (Camera: colors, textures)
Part 7: Recording Sensor Data
Why Record Data?
Use cases:
Debugging: Replay issues offline
Analysis: Process data in Python/MATLAB
Sharing: Send to colleagues/support
Comparison: Before/after calibration
Development: Train ML models
Documentation: Demonstrate capabilities
Recording with ros2 bag
Record all sensors:
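A typical recording command, assuming the topic names used throughout this tutorial:

```shell
# Record the main sensor topics to a named bag:
ros2 bag record -o sensor_test \
  /scan /pi_camera/image_raw/compressed /imu/data /odom /odometry/filtered
```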
While recording:
Drive predetermined path (square, circle)
Document what you're testing
Stop recording: Ctrl+C
Analyzing Recorded Data
Get bag info:
Play back data:
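Both steps use the ros2 bag tool, pointed at the directory created during recording:

```shell
ros2 bag info sensor_test      # duration, topics, message counts
ros2 bag play sensor_test      # republish the recorded data in real time
```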
While playing back:
RViz visualizes the data as if the robot were live
You can pause playback, or rewind by replaying the bag
Exercise 4.7: Data Collection Project
Task: Record and analyze driving patterns
Collect data:
Analyze:
Part 8: Advanced Monitoring Tools
rqt_multiplot - Advanced Plotting
Install (if not already):
Launch:
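A sketch of the install and launch commands; package availability varies by ROS 2 distro, so check that rqt_multiplot is packaged for yours:

```shell
# Install (if packaged for your distro):
sudo apt install ros-${ROS_DISTRO}-rqt-multiplot

# Launch:
ros2 run rqt_multiplot rqt_multiplot
```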
Create multi-panel plots:
[PLACEHOLDER: Screenshot of rqt_multiplot with 3 panels]
Use case:
Monitor all critical sensors at once
Spot correlations between signals
rqt_robot_monitor - System Health
Shows:
Set up alerts:
Low battery warning (<10.5V)
Custom Monitoring Script
Create battery alert script:
Script content:
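A minimal sketch of such a script, assuming the battery voltage is published as a std_msgs/Float32 on /battery/voltage (check your robot's actual topic name and message type before using it):

```python
#!/usr/bin/env python3
# battery_alert.py - minimal sketch of a low-battery monitor.
# ASSUMPTIONS: voltage arrives as std_msgs/Float32 on /battery/voltage;
# verify the real topic and message type with `ros2 topic info`.
import rclpy
from rclpy.node import Node
from std_msgs.msg import Float32

LOW_VOLTAGE = 10.5  # warning threshold from the alerts section above


class BatteryAlert(Node):
    def __init__(self):
        super().__init__('battery_alert')
        self.create_subscription(
            Float32, '/battery/voltage', self.callback, 10)

    def callback(self, msg):
        # Log a warning every time a reading dips below the threshold.
        if msg.data < LOW_VOLTAGE:
            self.get_logger().warn(f'LOW BATTERY: {msg.data:.2f} V')


def main():
    rclpy.init()
    rclpy.spin(BatteryAlert())


if __name__ == '__main__':
    main()
```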
Run:
Part 9: Troubleshooting Sensor Issues
LiDAR Not Publishing
Symptoms: /scan topic not appearing
Debug steps:
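A reasonable sequence of checks, working from ROS down to hardware (the USB device path is an assumption; serial LiDARs commonly appear as /dev/ttyUSB*):

```shell
# 1. Is the driver node running?
ros2 node list | grep -i lidar
# 2. Is the topic advertised and actually publishing?
ros2 topic list | grep scan
ros2 topic hz /scan
# 3. On the robot: is the USB serial device present?
ls /dev/ttyUSB*
```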
If still not working: ⚡ Power cycle robot
Camera Not Streaming
Symptoms: No image in rqt_image_view
Debug steps:
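ROS-level checks to run before touching the hardware:

```shell
# Is the camera node up, and is it publishing?
ros2 node list | grep -i camera
ros2 topic hz /pi_camera/image_raw/compressed
ros2 topic echo /pi_camera/camera_info --once
```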
If still not working: ⚡ Power cycle robot
IMU Data Frozen
Symptoms: IMU values not changing
Debug steps:
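Checks from the topic level down to the bus (the I2C bus number is an assumption; bus 1 is the usual default on a Raspberry Pi):

```shell
# Is the raw IMU stream updating at all?
ros2 topic hz /imu/data_raw
ros2 topic echo /imu/data_raw --once
# On the robot: is the IMU visible on the I2C bus? (assumed bus 1)
sudo i2cdetect -y 1
```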
If still not working: ⚡ Power cycle robot
Odometry Looks Wrong
Symptoms: Robot drifting wildly in RViz
Check:
If still not working: ⚡ Power cycle robot
General Sensor Troubleshooting
Most sensor issues fixed by:
⚡ Power cycle the robot (resolves the vast majority of issues)
Check topic rates (ros2 topic hz)
Check for error messages (ros2 topic echo /rosout)
Verify nodes running (ros2 node list)
Check network connectivity (WiFi signal strength)
Part 10: Knowledge Check
Which sensor has the highest update rate?
Why does odometry drift over time?
What's the difference between /imu/data and /imu/data_raw?
Why use compressed images instead of raw?
Which sensor is best for measuring distance to obstacles?
Hands-On Challenge
Task: Create sensor validation report
Requirements:
Record 2-minute rosbag with all sensors
Play back and screenshot each sensor view
Create report with:
LiDAR: Max range observed
Camera: Field of view measurement
IMU: Acceleration range during maneuvers
Odometry: Final drift after a driven square path (sized to fit your test space)
Deliverable:
Text report with measurements
Part 11: What You've Learned
✅ Congratulations!
You now understand:
Sensor Capabilities:
✅ LiDAR: Range, resolution, materials
✅ Camera: FOV, focus, lighting effects
✅ IMU: Acceleration, rotation sensing
✅ Odometry: Position tracking and drift
Visualization Tools:
✅ RViz configuration for all sensors
✅ rqt_image_view for camera
✅ rqt_plot for time-series data
Data Management:
Troubleshooting:
✅ Diagnosing sensor issues
✅ Common problems and solutions
🎯 You're Now Ready For:
Immediate Next: → IMU Signal Processing - Filter noisy IMU data, extract orientation
Sensor Applications: → Teleoperation Control - Use sensors for feedback
→ SLAM Mapping - Build maps with LiDAR
→ Autonomous Navigation - Sensor-based obstacle avoidance
Advanced Topics:
Custom sensor integration
Quick Reference
Essential Visualization Commands
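A compact cheat sheet of the commands used throughout this tutorial:

```shell
rviz2                                     # 3D visualization
ros2 run rqt_image_view rqt_image_view    # camera viewer
ros2 run rqt_plot rqt_plot                # time-series plots
ros2 topic hz <topic>                     # measure publish rate
ros2 topic bw <topic>                     # measure bandwidth
ros2 bag record -o <name> <topics...>     # record sensor data
ros2 bag play <name>                      # replay recorded data
```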
Typical Sensor Rates
/imu/data: ~100 Hz
/pi_camera/image_raw/compressed: ~30 Hz
Completed Sensor Data Visualization! 🎉
→ Continue to IMU Signal Processing
→ Or return to Tutorial Index
Last Updated: January 2026
Tutorial 4 of 11 - Intermediate Level
Estimated completion time: 90 minutes