Sensor Data Visualization
Understand your robot's perception through real-time data
Tutorial Overview
🎯 Learning Objectives
By the end of this tutorial, you will:
✅ Visualize LiDAR scans in real-time with RViz
✅ View camera images and understand image topics
✅ Monitor IMU data and interpret motion
✅ Track wheel odometry and observe drift
✅ Use rqt tools for multi-sensor visualization
✅ Record sensor data for offline analysis
✅ Understand coordinate frames and sensor fusion basics
✅ Identify sensor issues through visualization
⏱️ Time Required
Reading & Setup: 15 minutes
LiDAR Visualization: 20 minutes
Camera & IMU: 20 minutes
Odometry Analysis: 15 minutes
Multi-Sensor Integration: 20 minutes
Total: ~90 minutes
📚 Prerequisites
✅ Completed ROS2 Communication & Tools
✅ System setup complete (laptop ↔ robot communication)
✅ Robot powered, connected to WiFi
✅ Can see robot's topics from laptop
✅ RViz installed on laptop
🛠️ What You'll Need
✅ Beetlebot (powered, sensors active)
✅ Laptop on same network
✅ Wireless controller
✅ Open space (3m × 3m minimum)
✅ Various objects/obstacles for testing
✅ Measuring tape (optional, for validation)
Part 1: Understanding the Sensor Suite
Sensors on Beetlebot
Your robot has 5 sensor systems:
RPLidar C1 - 360° laser scanner
Range: 12m max, 8-10m typical indoors
Update rate: 10 Hz
Topic:
/scan
Raspberry Pi Camera V1.3 - Visual perception
Resolution: 5 MP (2592×1944) native; typically streamed at 1080p or 720p
Update rate: 30 fps
Topics:
/pi_camera/image_raw, /pi_camera/camera_info
LSM6DSRTR IMU - Motion sensing
6-axis: 3-axis accelerometer + 3-axis gyroscope
Update rate: 100 Hz
Topics:
/imu/data, /imu/data_raw
Wheel Encoders (4×) - Position feedback
Resolution: 1800 ticks/revolution
Update rate: 20 Hz
Topics:
/wheel_rpm, /wheel_ticks
Battery Monitor - Power management
Voltage monitoring via ADC
Update rate: 10 Hz
Topic:
/battery_voltage
Sensor Coordinate Frames
Every sensor has a position and orientation relative to the robot:
[PLACEHOLDER: Diagram showing sensor positions on robot]
Frame hierarchy:
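A typical tree for this robot might look like the sketch below; the exact frame names are assumptions, so print yours with ros2 run tf2_tools view_frames:

```
odom
└── base_link
    ├── laser          (RPLidar C1)
    ├── camera_link    (Pi Camera)
    └── imu_link       (LSM6DSRTR)
```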
Why frames matter:
Sensor data is reported in the sensor's own frame
Data must be transformed into a common frame for fusion
RViz needs correct frames to display properly
Part 2: LiDAR Visualization
Launch Robot with LiDAR
On robot (via SSH):
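The exact bringup command depends on how your robot's software is packaged; a plausible sketch (the package and launch file names here are hypothetical placeholders):

```bash
# Hypothetical names -- substitute your robot's actual bringup package/launch file
ros2 launch beetlebot_bringup robot.launch.py
```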
Verify LiDAR data:
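A quick sanity check, using the topic from the sensor table above:

```bash
ros2 topic hz /scan            # expect roughly 10 Hz
ros2 topic echo /scan --once   # dump one full LaserScan message
```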
Visualize LiDAR in RViz
On laptop:
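Start RViz from a sourced ROS 2 terminal (the distro path below assumes Humble; adjust to your install):

```bash
source /opt/ros/humble/setup.bash   # assuming ROS 2 Humble
rviz2
```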
Configure RViz for LiDAR:
Set Fixed Frame:
Left panel: Global Options
Fixed Frame: Change "map" → "base_link"
Add LaserScan Display:
Bottom left: Click "Add"
By topic → /scan → LaserScan
Click OK
Configure LaserScan Display:
Expand LaserScan in left panel
Size (m): 0.05 (point size)
Style: Flat Squares (easier to see)
Color Transformer: Intensity or FlatColor
If Intensity: Color scheme = Rainbow
[PLACEHOLDER: Screenshot of RViz showing LiDAR scan]
What you see:
Red/orange dots = close objects (< 1m)
Green dots = medium distance (1-3m)
Blue dots = far objects (3m+)
Robot center at origin (coordinate axes)
Exercise 4.1: LiDAR Exploration
Task: Understand LiDAR capabilities
Step 1: Range Test
Point the robot down a long hallway or large room and find the farthest surface that still returns points (see the range-check sketch after this list).
Step 2: Resolution Test
Place two narrow objects side by side and find the smallest gap at which they still appear as separate point clusters.
Step 3: Material Test
Compare returns from cardboard, dark fabric, glass, and a mirror.
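One way to put numbers on the range test, sketched below (the --field flag needs a reasonably recent ROS 2 CLI; drop it to see the whole message):

```bash
# Sensor-reported limits of one scan
ros2 topic echo /scan --once --field range_max
ros2 topic echo /scan --once --field range_min
```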
What you learned:
LiDAR range limitations
Angular resolution (~0.5°)
Material reflectivity affects detection
Advanced LiDAR Visualization
Add more information to RViz:
1. Robot Model: Add → RobotModel and point it at /robot_description to draw the chassis under the scan.
2. TF Frames: Add → TF to show each sensor's coordinate frame as a set of axes.
3. LaserScan with Decay: in the LaserScan display, raise Decay Time to a few seconds so past scans persist and sketch out the room.
[PLACEHOLDER: Screenshot showing robot model + LiDAR + TF frames]
Exercise 4.2: Obstacle Detection
Task: Map your surroundings
Setup: place the robot near the center of the room with a clear line of sight to the walls.
Action: slowly rotate the robot in place with the controller, then drive a short loop while watching the scan in RViz.
Observe:
Walls appear as solid lines
Furniture shows as distinct blobs
Doorways appear as gaps
Moving objects (people) may appear inconsistent
Analysis Questions:
Can you identify room shape from scans?
Are corners clearly defined?
Any areas with no returns? (windows, mirrors)
Part 3: Camera Visualization
Camera Topics Overview
Available topics: /pi_camera/image_raw, /pi_camera/image_raw/compressed, /pi_camera/camera_info
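To confirm what the camera driver is actually publishing:

```bash
ros2 topic list | grep pi_camera
```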
Topic explanations:
/pi_camera/image_raw (sensor_msgs/Image)
Uncompressed image data
Large bandwidth (~30 MB/s at 30fps)
Best quality
Use on local network
/pi_camera/image_raw/compressed (sensor_msgs/CompressedImage)
JPEG compressed
Smaller bandwidth (~3 MB/s)
Slight quality loss
Better for remote viewing
/pi_camera/camera_info (sensor_msgs/CameraInfo)
Camera calibration data
Lens distortion parameters
Used for 3D perception
View Camera Feed
Method 1: rqt_image_view (Simple)
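A minimal way to open the viewer; the optional topic argument preselects the stream:

```bash
ros2 run rqt_image_view rqt_image_view /pi_camera/image_raw
```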
[PLACEHOLDER: Screenshot of camera view in rqt_image_view]
Method 2: RViz (Integrated)
In RViz: Add → By topic → /pi_camera/image_raw → Image. The feed appears in a docked panel.
Benefits of RViz method:
See camera + LiDAR simultaneously
Overlay detection results
3D context visible
Exercise 4.3: Camera Exploration
Task: Understand camera capabilities
Test 1: Field of View
Move an object sideways at a fixed distance until it leaves the frame; estimate the horizontal field of view.
Test 2: Focus Range
Bring an object toward the lens until it blurs; the Pi Camera V1.3 is fixed-focus, so very close objects stay soft.
Test 3: Lighting Conditions
Compare image quality in bright and dim light; watch for noise and motion blur.
Image Topic Performance
Check image publishing rate:
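A sketch for checking the rate:

```bash
ros2 topic hz /pi_camera/image_raw   # expect ~30 Hz on a good link
```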
Bandwidth usage:
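And for measuring what each stream costs on the network:

```bash
ros2 topic bw /pi_camera/image_raw              # raw stream
ros2 topic bw /pi_camera/image_raw/compressed   # compressed stream
```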
Use compressed for remote viewing!
Part 4: IMU Data Visualization
IMU Topics
Available topics: /imu/data and /imu/data_raw
Difference:
/imu/data_raw: direct from the sensor (noisy)
/imu/data: after filtering and calibration (smoother)
Message contents:
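You can print the full definition yourself; in brief, a sensor_msgs/Imu message carries orientation (a quaternion), angular velocity, and linear acceleration, each with a covariance matrix:

```bash
ros2 interface show sensor_msgs/msg/Imu
```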
Visualize IMU in RViz
Add IMU display: Add → By topic → /imu/data → Imu (this display may require the rviz_imu_plugin package)
What you see:
Arrow shows acceleration vector
Box shows orientation
Color indicates magnitude
[PLACEHOLDER: Screenshot of IMU visualization in RViz]
Exercise 4.4: IMU Experimentation
Task: Understand IMU measurements
Test 1: Static Robot
With the robot still, linear acceleration should read ~9.8 m/s² on the vertical axis (gravity) and angular velocity should sit near zero.
Test 2: Forward Acceleration
Drive forward from a stop; watch linear_acceleration.x spike and then settle.
Test 3: Rotation
Spin in place; angular_velocity.z shows the turn rate and flips sign with direction.
Plotting IMU Data with rqt_plot
Real-time graphing:
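A sketch that plots the three acceleration axes live (field paths follow the sensor_msgs/Imu layout shown earlier):

```bash
ros2 run rqt_plot rqt_plot \
  /imu/data/linear_acceleration/x \
  /imu/data/linear_acceleration/y \
  /imu/data/linear_acceleration/z
```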
[PLACEHOLDER: Screenshot of rqt_plot showing IMU data]
Try these patterns:
Forward → Stop (see accel spike)
Turn left → Turn right (see angular velocity)
Figure-8 pattern (see both signals)
Understanding IMU Noise
Compare raw vs filtered:
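One way to see the difference is to plot the same axis from both topics on one graph:

```bash
ros2 run rqt_plot rqt_plot \
  /imu/data_raw/linear_acceleration/x \
  /imu/data/linear_acceleration/x
```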
Why filter?
Raw sensor has electrical noise
Quantization errors
Temperature drift
Vibrations from motors
Filtering trade-off:
Smoother signal (good for control)
Slight delay (bad for fast response)
Next tutorial (IMU Signal Processing) covers filtering in depth!
Part 5: Wheel Odometry Visualization
Odometry Topics
Available topics: /odom and /odometry/filtered
Key difference:
/odom: based only on wheel encoders
/odometry/filtered: wheel encoders + IMU fusion (better!)
Visualize Odometry
Add to RViz: Add → By topic → /odometry/filtered → Odometry
Configuration: increase Keep (e.g. 100) to retain a trail of past poses, and enable Covariance to draw the uncertainty ellipse
What you see:
Green arrow = robot's pose (position + orientation)
Ellipse = uncertainty covariance
Trail = path history
[PLACEHOLDER: Screenshot of odometry trail in RViz]
Exercise 4.5: Odometry Drift Observation
Task: Measure odometry accuracy
Test 1: Straight Line
Drive 2 m forward (measure with tape), then compare against the distance odometry reports.
Test 2: Square Path
Drive a square back to the start; note the gap between the start and end poses in RViz.
Test 3: Rotation
Rotate 360° in place; check whether the reported heading returns to its starting value.
Comparing Wheel vs Fused Odometry
Plot both simultaneously:
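A sketch plotting the x position from both estimates together:

```bash
ros2 run rqt_plot rqt_plot \
  /odom/pose/pose/position/x \
  /odometry/filtered/pose/pose/position/x
```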
Typically observe:
/odometry/filtered is smoother (IMU helps)
/odom may have small jumps (encoder quantization)
Both drift over time, but the fused estimate drifts less
Part 6: Multi-Sensor Visualization
The Complete Perception Picture
Configure RViz with all sensors:
Displays to add:
✅ RobotModel (/robot_description)
✅ LaserScan (/scan)
✅ Camera (/pi_camera/image_raw)
✅ Imu (/imu/data)
✅ Odometry (/odometry/filtered)
✅ TF (coordinate frames)
Save configuration: File → Save Config As… (e.g. beetlebot_sensors.rviz); reload it later with rviz2 -d beetlebot_sensors.rviz
[PLACEHOLDER: Screenshot of RViz with all sensors displayed]
Exercise 4.6: Multi-Sensor Correlation
Task: See how sensors relate during motion
Setup: RViz running with the all-sensor configuration saved above.
Test Scenario: drive toward a wall, stop, then turn away, watching every display react.
Questions to answer:
Do all sensors agree on motion?
Which sensor updates fastest? (IMU: 100Hz)
Which is most accurate for distance? (LiDAR)
Which gives most information? (Camera: colors, textures)
Part 7: Recording Sensor Data
Why Record Data?
Use cases:
Debugging: Replay issues offline
Analysis: Process data in Python/MATLAB
Sharing: Send to colleagues/support
Comparison: Before/after calibration
Development: Train ML models
Documentation: Demonstrate capabilities
Recording with ros2 bag
Record all sensors:
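A sketch of a recording command covering the sensors from this tutorial, using the compressed image stream to keep the bag size manageable (-o names the output folder):

```bash
ros2 bag record -o sensor_test \
  /scan \
  /pi_camera/image_raw/compressed \
  /imu/data /imu/data_raw \
  /odom /odometry/filtered \
  /wheel_rpm /battery_voltage
```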
While recording:
Drive predetermined path (square, circle)
Test various speeds
Include static periods
Document what you're testing
Stop recording: Ctrl+C
Analyzing Recorded Data
Get bag info:
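Using the folder name from the recording sketch above:

```bash
ros2 bag info sensor_test
```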
Play back data:
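Replaying publishes the recorded topics into the live ROS 2 graph, so RViz subscribes exactly as it would to the real robot:

```bash
ros2 bag play sensor_test
```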
While playing back:
RViz visualizes the data as if the robot were live
You can pause playback, or rewind by replaying the bag
No robot needed!
Exercise 4.7: Data Collection Project
Task: Record and analyze driving patterns
Collect data: record a 2-minute bag (as in Part 7) while driving a square, a circle, and a straight run at different speeds.
Analyze: play the bag back in RViz and compare what each sensor reported during each maneuver.
Part 8: Diagnostic Tools
rqt_multiplot - Advanced Plotting
Install (if not already):
Launch:
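A sketch for both steps; rqt_multiplot binary packages are not available for every ROS 2 distro, so the apt package name below is an assumption and you may need to build from source:

```bash
sudo apt install ros-${ROS_DISTRO}-rqt-multiplot   # if a binary package exists
ros2 run rqt_multiplot rqt_multiplot
```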
Create multi-panel plots: configure one panel each for IMU acceleration, wheel RPM, and battery voltage so related signals appear side by side.
[PLACEHOLDER: Screenshot of rqt_multiplot with 3 panels]
Use case:
Monitor all critical sensors at once
Spot correlations between signals
Detect anomalies
rqt_robot_monitor - System Health
Shows:
Battery status
Motor health
Sensor connectivity
System warnings
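To open it (assuming the rqt_robot_monitor package is installed; it reads aggregated diagnostics, so the robot must publish to /diagnostics for anything to appear):

```bash
ros2 run rqt_robot_monitor rqt_robot_monitor
```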
Set up alerts:
Low battery warning (<10.5V)
Sensor timeout warnings
Motor fault detection
Custom Monitoring Script
Create battery alert script:
Script content:
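A minimal sketch in shell, assuming /battery_voltage publishes a message with a plain data field (e.g. std_msgs/Float32) and that your ROS 2 environment is already sourced:

```bash
#!/usr/bin/env bash
# battery_alert.sh -- warn when battery voltage drops below a threshold.

THRESHOLD=10.5   # volts, matching the low-battery warning above

while true; do
    # Grab one message and extract the number from its "data:" line
    VOLTAGE=$(ros2 topic echo /battery_voltage --once | awk '/data:/ {print $2}')
    if [ -n "$VOLTAGE" ] && awk -v v="$VOLTAGE" -v t="$THRESHOLD" 'BEGIN {exit !(v < t)}'; then
        echo "WARNING: battery at ${VOLTAGE} V (below ${THRESHOLD} V) -- recharge soon!"
    else
        echo "Battery OK: ${VOLTAGE:-no data} V"
    fi
    sleep 5
done
```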
Run:
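Make it executable and start it:

```bash
chmod +x battery_alert.sh
./battery_alert.sh
```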
Part 9: Troubleshooting Sensor Issues
LiDAR Not Publishing
Symptoms: /scan topic not appearing
Debug steps:
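A few checks worth running first (the driver node name varies, so the grep pattern is a guess):

```bash
ros2 topic list | grep scan      # is the topic even advertised?
ros2 topic hz /scan              # is anything arriving?
ros2 node list | grep -i lidar   # is the driver node up?
ls /dev/ttyUSB*                  # on the robot: is the serial device present?
```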
If still not working: ⚡ Power cycle robot
Camera Not Streaming
Symptoms: No image in rqt_image_view
Debug steps:
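Checks to try; the compressed stream is a good fallback on weak WiFi:

```bash
ros2 topic hz /pi_camera/image_raw
ros2 node list | grep -i camera
ros2 run rqt_image_view rqt_image_view /pi_camera/image_raw/compressed
```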
If still not working: ⚡ Power cycle robot
IMU Data Frozen
Symptoms: IMU values not changing
Debug steps:
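Checks to try:

```bash
ros2 topic hz /imu/data_raw           # should be ~100 Hz; 0 means a stalled driver
ros2 topic echo /imu/data_raw --once  # run twice; values should differ slightly
```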
If still not working: ⚡ Power cycle robot
Odometry Looks Wrong
Symptoms: Robot drifting wildly in RViz
Check: wheels spin freely (no debris), encoder ticks change when the wheels turn (ros2 topic echo /wheel_ticks), the IMU is publishing (ros2 topic hz /imu/data), and RViz's Fixed Frame is set to odom rather than map.
If still not working: ⚡ Power cycle robot
General Sensor Troubleshooting
Most sensor issues fixed by:
⚡ Power cycle robot (90% of issues)
Check topic rates (ros2 topic hz)
Check for error messages (ros2 topic echo /rosout)
Verify nodes running (ros2 node list)
Check network connectivity (WiFi signal strength)
Part 10: Knowledge Check
Concept Quiz
Which sensor has the highest update rate?
Why does odometry drift over time?
What's the difference between /imu/data and /imu/data_raw?
Why use compressed images instead of raw?
Which sensor is best for measuring distance to obstacles?
Hands-On Challenge
Task: Create sensor validation report
Requirements:
Record 2-minute rosbag with all sensors
Play back and screenshot each sensor view
Create report with:
LiDAR: Max range observed
Camera: Field of view measurement
IMU: Acceleration range during maneuvers
Odometry: Final drift after 5m × 5m square
Deliverable:
Rosbag file
Screenshots (4-5 images)
Text report with measurements
Part 11: What You've Learned
✅ Congratulations!
You now understand:
Sensor Capabilities:
✅ LiDAR: Range, resolution, materials
✅ Camera: FOV, focus, lighting effects
✅ IMU: Acceleration, rotation sensing
✅ Odometry: Position tracking and drift
Visualization Tools:
✅ RViz configuration for all sensors
✅ rqt_image_view for camera
✅ rqt_plot for time-series data
✅ Multi-sensor displays
Data Management:
✅ Recording rosbags
✅ Playing back data
✅ Offline analysis
✅ Sharing sensor data
Troubleshooting:
✅ Diagnosing sensor issues
✅ Checking topic health
✅ Common problems and solutions
Next Steps
🎯 You're Now Ready For:
Immediate Next: → IMU Signal Processing - Filter noisy IMU data, extract orientation
Sensor Applications: → Teleoperation Control - Use sensors for feedback → SLAM Mapping - Build maps with LiDAR → Autonomous Navigation - Sensor-based obstacle avoidance
Advanced Topics:
Sensor calibration
Multi-sensor fusion
Custom sensor integration
Quick Reference
Essential Visualization Commands
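A recap of the commands used in this tutorial (topic names as above):

```bash
rviz2                                            # 3D visualization
ros2 run rqt_image_view rqt_image_view           # camera viewer
ros2 run rqt_plot rqt_plot /imu/data/linear_acceleration/x
ros2 topic hz /scan                              # check a publish rate
ros2 topic bw /pi_camera/image_raw               # check bandwidth
ros2 bag record -o sensor_test /scan /imu/data   # record (add topics as needed)
ros2 bag info sensor_test
ros2 bag play sensor_test
```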
Typical Sensor Rates
Sensor               Topic                            Rate    Bandwidth
LiDAR                /scan                            10 Hz   ~50 KB/s
Camera (raw)         /pi_camera/image_raw             30 Hz   30-40 MB/s
Camera (compressed)  /pi_camera/image_raw/compressed  30 Hz   2-5 MB/s
IMU                  /imu/data                        100 Hz  ~10 KB/s
Odometry             /odometry/filtered               20 Hz   ~5 KB/s
Encoders             /wheel_rpm                       20 Hz   ~2 KB/s
Battery              /battery_voltage                 10 Hz   <1 KB/s
Completed Sensor Data Visualization! 🎉
→ Continue to IMU Signal Processing → Or return to Tutorial Index
Last Updated: January 2026
Tutorial 4 of 11 - Intermediate Level
Estimated completion time: 90 minutes