Sensor Fusion with EKF
Combine multiple sensors for superior state estimation
Tutorial Overview
🎯 Learning Objectives
By the end of this tutorial, you will:
✅ Understand why sensor fusion is necessary
✅ Learn Extended Kalman Filter (EKF) principles
✅ Configure robot_localization package
✅ Fuse wheel odometry with IMU data
✅ Compare fused vs. unfused odometry
✅ Tune EKF parameters for optimal performance
✅ Diagnose fusion issues
✅ Understand covariance and uncertainty
⏱️ Time Required
Reading & Theory: 30 minutes
Configuration: 20 minutes
Testing & Comparison: 35 minutes
Parameter Tuning: 25 minutes
Advanced Topics: 20 minutes
Total: ~130 minutes
📚 Prerequisites
✅ Completed Sensor Data Visualization
✅ Completed IMU Signal Processing
✅ Understanding of wheel odometry drift
✅ Understanding of IMU characteristics
✅ Can visualize data in RViz
✅ Can record and analyze rosbags
🛠️ What You'll Need
✅ Beetlebot (powered, all sensors working)
✅ Laptop with ROS2 Jazzy
✅ Wireless controller
✅ Open space (5m × 5m minimum)
✅ Measuring tape or marked floor
✅ Protractor (optional, for angle validation)
Part 1: Why Sensor Fusion?
The Problem with Individual Sensors
Wheel Odometry Alone:
✅ Strengths:
Accurate short-term (seconds to minutes)
Provides position (x, y) and orientation (yaw)
High update rate (20 Hz)
Robust to dynamic motion (no noisy acceleration integration)
❌ Weaknesses:
Drifts over time (wheel slip, encoder errors)
Linear drift: ±3-5% of distance traveled
Angular drift: ±5-10° per full rotation
Worse on slippery surfaces
Turns accumulate most error
IMU Alone:
✅ Strengths:
Very high update rate (100 Hz)
Accurate short-term angular velocity
Not affected by wheel slip
Detects tilt and acceleration
❌ Weaknesses:
Cannot measure position directly (only acceleration)
Gyro drifts over time (integration error)
Accelerometer noisy in dynamic motion
Can't distinguish tilt from acceleration
No absolute reference (unless magnetometer added)
The Solution: Sensor Fusion
Combine strengths, compensate for weaknesses!
Key insight:
Wheels provide position (but drift)
IMU provides motion dynamics (but drifts differently)
Fusion: Use IMU to correct wheel slip, wheels to bound IMU drift
Example scenario: the robot crosses a slick patch and one wheel spins without gripping. Wheel odometry alone registers forward motion that never happened; the IMU reports no matching acceleration or turn rate, so the fused estimate largely rejects the phantom movement.
Real-World Benefits
Without fusion: a 10 m path typically ends 20-30 cm away from the true position.
With fusion: the same path ends 8-15 cm away.
~40-60% error reduction is typical!
Part 2: Extended Kalman Filter Basics
What is a Kalman Filter?
Simple explanation:
Imagine you're trying to find your location:
Your phone GPS says: "You're at (10.5, 20.3)"
Your step counter says: "You walked 5 meters north"
Which to believe?
Kalman Filter:
Weighs both based on uncertainty
GPS accurate to ±5m? Less trust
Step counter accurate to ±0.5m? More trust
Optimal combination: Weighted average
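The weighted average above can be written in a few lines. This sketch (plain Python, with made-up example values) fuses two noisy estimates by inverse-variance weighting, which is exactly what the Kalman gain reduces to in one dimension:

```python
def fuse(z1, var1, z2, var2):
    """Optimally combine two noisy estimates of the same quantity.
    Each estimate is weighted by the inverse of its variance."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)  # always smaller than either input variance
    return fused, fused_var

# GPS says 10.5 m with sigma = 5 m; step counter says 10.0 m with sigma = 0.5 m
pos, var = fuse(10.5, 5.0**2, 10.0, 0.5**2)
print(pos, var)  # fused estimate lands very close to the more certain sensor
```

Note that the fused variance is smaller than either input's: combining two sensors always increases certainty, even when one of them is poor.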
EKF Algorithm (Simplified)
Two-step process:
1. Prediction Step (Motion Model): propagate the state forward in time using the last known velocities; uncertainty grows.
2. Update Step (Measurement): correct the prediction with incoming sensor data, weighted by each sensor's covariance; uncertainty shrinks.
Repeat at high frequency (20-50 Hz)
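A toy one-dimensional version of the loop (plain Python, invented noise values) shows the characteristic rhythm: variance grows on predict, shrinks on update:

```python
def predict(x, P, v, dt, Q):
    # Motion model: advance at commanded velocity v; process noise Q grows P
    return x + v * dt, P + Q

def update(x, P, z, R):
    # Kalman gain balances prediction covariance P against measurement noise R
    K = P / (P + R)
    return x + K * (z - x), (1.0 - K) * P

x, P = 0.0, 1.0  # initial position estimate and its variance
for step in range(5):
    x, P = predict(x, P, v=1.0, dt=0.1, Q=0.05)  # variance grows
    z = 0.1 * (step + 1)                          # simulated position measurement
    x, P = update(x, P, z, R=0.2)                 # variance shrinks
print(x, P)  # estimate tracks the measurements; P settles well below 1.0
```

The real EKF does the same thing with a 15-dimensional state and matrix-valued covariances, but the grow/shrink cycle is identical.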
State Vector
What does EKF track?
For Beetlebot, robot_localization estimates a 15-dimensional state:
Position: x, y, z
Orientation: roll, pitch, yaw
Linear velocity: vx, vy, vz
Angular velocity: ωx, ωy, ωz
Linear acceleration: ax, ay, az
robot_localization tracks all of these simultaneously!
Covariance Matrix
Uncertainty representation: the covariance matrix stores the variance of each state variable on its diagonal and the correlations between variables off-diagonal. Small values mean high confidence; large values mean low confidence.
Visualized in RViz as ellipses:
Small ellipse = high certainty
Large ellipse = low certainty
Part 3: Beetlebot's EKF Configuration
Already Running!
Your robot has EKF pre-configured:
Input topics:
/odom - Wheel odometry (position, velocity)
/imu/data - IMU (orientation, angular velocity, linear acceleration)
Output topic:
/odometry/filtered - Fused estimate (best of both!)
View Current Configuration
Key sections:
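The configuration file is YAML. The fragment below is an illustrative sketch, not Beetlebot's actual file (node name, topic names, and the chosen booleans are assumptions); each 15-element boolean list selects which state variables that sensor may update, in the order [x y z, roll pitch yaw, vx vy vz, ωx ωy ωz, ax ay az]:

```yaml
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true                     # planar robot: ignore z, roll, pitch
    odom0: /odom
    odom0_config: [true,  true,  false,  # x, y          (z not measured)
                   false, false, true,   # yaw
                   true,  true,  false,  # vx, vy
                   false, false, true,   # wz (turn rate)
                   false, false, false]  # wheels measure no acceleration
    imu0: /imu/data
    imu0_config: [false, false, false,   # IMU has no absolute position
                  true,  true,  true,    # roll, pitch, yaw
                  false, false, false,
                  true,  true,  true,    # wx, wy, wz
                  true,  true,  false]   # ax, ay (az dominated by gravity)
```

The boolean pattern mirrors the "what each sensor contributes" lists below: wheels own planar position and velocity, the IMU owns attitude and rotation rates.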
Understanding the Configuration
What each sensor contributes:
Wheel Odometry (/odom):
✅ x position (forward/back)
✅ y position (left/right)
✅ yaw (heading)
✅ vx (forward velocity)
✅ vy (sideways velocity)
✅ ωz (turn rate)
❌ z, roll, pitch (not measured by wheels)
IMU (/imu/data):
✅ roll (tilt side-to-side)
✅ pitch (tilt forward/back)
✅ yaw (heading - from integrating ωz)
✅ ωx, ωy, ωz (all rotation rates)
✅ ax, ay, az (accelerations)
❌ x, y, z position (can't measure absolute position)
Fused output (/odometry/filtered):
✅ All of the above, optimally combined!
Part 4: Comparing Unfused vs Fused
Exercise 7.1: Straight Line Test
Task: Measure drift with and without fusion
Setup: mark a straight 5 m line on the floor with tape and place the robot at one end.
Test procedure: drive slowly along the line while echoing both /odom and /odometry/filtered; note each topic's reported position at the far end.
Analysis: compare each reported distance against the 5 m tape measurement and express the error as a percentage of distance traveled.
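A minimal analysis sketch (plain Python; the endpoint numbers are invented examples you would replace with values read off each topic with `ros2 topic echo`):

```python
import math

def drift_report(label, x, y, truth_m):
    """Compare a reported endpoint against the tape-measured ground truth."""
    traveled = math.hypot(x, y)  # straight-line distance from the start pose
    err = traveled - truth_m
    pct = 100.0 * abs(err) / truth_m
    print(f"{label}: reported {traveled:.3f} m, error {err:+.3f} m ({pct:.1f}%)")
    return pct

# Example final poses after a 5 m straight run (hypothetical readings)
raw_pct = drift_report("unfused", 5.21, 0.04, truth_m=5.0)
fused_pct = drift_report("fused", 5.12, 0.02, truth_m=5.0)
print(f"improvement: {100.0 * (1 - fused_pct / raw_pct):.0f}%")
```

With these example readings the improvement lands near the ~40% this tutorial quotes for the straight-line test.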
Exercise 7.2: Square Path Test
Task: Classic odometry test - return to start
Setup: tape out a 2 m × 2 m square and start the robot at one corner, facing along an edge.
Test procedure: drive the square (four straights, four 90° turns) and return to the start; record both odometry topics.
Analysis: the true final position equals the start, so each topic's reported displacement from the start is its accumulated error.
Typical results: unfused return error of 15-20 cm; fused error of 6-10 cm (roughly 50% improvement).
Exercise 7.3: Rotation Test
Task: Spin 360° and check heading drift
Test procedure: spin the robot 360° in place (use a floor mark to judge the true full rotation) and compare the final heading reported by /odom and /odometry/filtered.
Expected results: unfused heading error of ±5-8°; fused error of ±2-3°. Turns are where wheel odometry is weakest, so fusion helps most here.
Part 5: Visualizing Uncertainty
Covariance Ellipses in RViz
Setup RViz to show uncertainty: add an Odometry display, set its topic to /odometry/filtered, and enable the display's Covariance option so position and orientation uncertainty are drawn as ellipses.
[PLACEHOLDER: Screenshot of RViz showing covariance ellipses]
Interpreting Covariance
What the ellipses mean:
Small ellipse:
High confidence in position
Typical at start or after good measurements
Large ellipse:
Lower confidence
Grows during motion (prediction step)
Shrinks when measurements arrive (update step)
Ellipse shape:
Circular: Equal uncertainty in all directions
Elongated: More uncertain in one direction (e.g., forward motion)
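The ellipse RViz draws comes straight from the 2×2 position covariance. A small pure-Python sketch (closed-form eigendecomposition of a symmetric 2×2 matrix; the example numbers are invented) recovers the axis lengths and orientation:

```python
import math

def covariance_ellipse(sxx, sxy, syy, n_sigma=2.0):
    """Semi-axes and orientation of the n-sigma uncertainty ellipse
    for a 2x2 position covariance [[sxx, sxy], [sxy, syy]]."""
    # Eigenvalues of a symmetric 2x2 matrix, in closed form
    mean = 0.5 * (sxx + syy)
    diff = 0.5 * (sxx - syy)
    radius = math.hypot(diff, sxy)
    lam_major, lam_minor = mean + radius, mean - radius
    # Heading of the major axis, in radians
    angle = 0.5 * math.atan2(2.0 * sxy, sxx - syy)
    return n_sigma * math.sqrt(lam_major), n_sigma * math.sqrt(lam_minor), angle

# More uncertainty along x than y, slight correlation between the two
a, b, theta = covariance_ellipse(0.04, 0.005, 0.01)
print(a, b, theta)  # elongated ellipse, tilted slightly off the x axis
```

Equal diagonal terms with zero correlation give a circle; a large x-variance gives the elongated forward-motion ellipse described above.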
Exercise 7.4: Watch Uncertainty Grow and Shrink
Task: Observe covariance dynamics. Drive the robot around and watch the ellipse grow during motion, then stop and watch it shrink as stationary measurements accumulate.
Part 6: Parameter Tuning
When to Tune Parameters
Default configuration works well for most cases!
Consider tuning if:
Unusual surface (carpet, gravel, ice)
Different speeds (very slow or very fast)
Modified robot (different wheels, weight)
Specific application needs (prioritize smoothness vs. responsiveness)
Key Parameters to Tune
Process Noise Covariance (Q matrix)
Represents: "How much do we trust the motion model?"
Higher values = "Motion model is unreliable, trust measurements more"
Lower values = "Motion model is reliable, don't overreact to measurements"
Measurement Noise (R matrix): taken from the covariance fields of each incoming sensor message; per-sensor flags such as odom0_differential and odom0_relative also change how those measurements are applied.
Exercise 7.5: Tune for Slippery Surface
Scenario: Robot on smooth tile (more wheel slip)
Modification:
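One possible edit, sketched below (an illustrative fragment, not Beetlebot's tested settings; the exact choice is a judgment call). Since slip corrupts wheel heading most during turns, stop fusing wheel yaw and let the IMU own it:

```yaml
# Slippery-surface sketch: distrust wheel heading, keep wheel velocities.
odom0_config: [true,  true,  false,
               false, false, false,    # wheel yaw no longer fused
               true,  true,  false,
               false, false, false,    # wheel turn rate dropped as well
               false, false, false]
# imu0_config stays unchanged: the IMU now solely provides yaw and wz.
```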
Test: Repeat square test on slippery surface, compare error
Exercise 7.6: Tune for High-Speed Operation
Scenario: Robot operating at max speed (more dynamic)
Modification:
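One approach, sketched below (the parameter is a real robot_localization option, but whether it suits Beetlebot at top speed needs testing). At high speed the constant-velocity motion model is violated more often, so let process noise scale with how fast the robot moves:

```yaml
# High-speed sketch: scale process noise with velocity so the filter
# reacts faster to measurements during aggressive motion.
dynamic_process_noise_covariance: true
# Alternatively, raise the velocity terms of process_noise_covariance by hand.
```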
Part 7: Troubleshooting Fusion
Problem: Fused Odometry Jumps or Unstable
Symptoms: /odometry/filtered has sudden jumps
Possible causes:
Sensor disagreement (wheels vs IMU say different things)
Poor covariance tuning
Transform (TF) issues
Problem: Fused Estimate Still Drifts
Not eliminated, just reduced! Drift is inherent without external reference.
To further reduce drift:
Add more sensors
GPS (outdoor)
Visual odometry (camera-based)
Laser scan matching (AMCL in next tutorial)
Calibrate sensors better
Wheel radius calibration
IMU bias calibration (see IMU tutorial)
Tune covariances
Trust best sensor more
Problem: EKF Node Crashes
Debug: check that the EKF node is alive with `ros2 node list`, confirm both inputs are publishing with `ros2 topic hz /odom` and `ros2 topic hz /imu/data`, and inspect the node's console output for covariance or timestamp warnings.
Solution: ⚡ Power cycle robot (fixes 90% of issues)
Part 8: Advanced Topics
Two-Stage Fusion (Optional)
Some systems use:
Stage 1: Local fusion (robot frame)
Fuse wheel + IMU
Output:
/odometry/local
Stage 2: Global fusion (world frame)
Fuse local + GPS/map
Output:
/odometry/global
Beetlebot uses single-stage (sufficient for indoor use)
Understanding Frames
Critical frames: map → odom → base_link (with sensor frames hanging off base_link), following the standard ROS frame convention (REP 105).
Why two odometry frames?
odom frame:
Robot's local reference
Continuous, smooth
Drifts over time
Used by EKF
map frame:
Global reference
Corrected by SLAM/localization
Can jump (e.g., when a loop closure corrects the estimate)
Used by navigation (next tutorials)
Quaternions vs Euler Angles
ROS2 uses quaternions for orientation:
Convert quaternion → Euler in Python:
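A dependency-free version using the standard conversion formulas (works on the x, y, z, w fields of any geometry_msgs/Quaternion, such as the orientation in /odometry/filtered):

```python
import math

def quaternion_to_euler(x, y, z, w):
    """Convert a unit quaternion (x, y, z, w order, as in ROS messages)
    to roll, pitch, yaw in radians."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Clamp to protect asin from tiny numerical overshoots
    sinp = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(sinp)
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

# A 90 degree turn to the left: q = (0, 0, sin(45 deg), cos(45 deg))
r, p, yw = quaternion_to_euler(0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4))
print(math.degrees(yw))  # approximately 90.0
```

In a subscriber callback you would pass `msg.pose.pose.orientation.x` (and y, z, w) into this function to read the robot's heading.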
Part 9: Real-World Applications
Application 1: Path Following
Accurate odometry is essential for following planned paths.
Application 2: Return to Home
Fusion allows reliable "return to start" behavior even after long runs.
Application 3: Multi-Robot Coordination
Sharing positions between robots requires accurate odometry.
Part 10: Knowledge Check
Concept Quiz
Why can't IMU provide position directly?
What does the Kalman Filter do?
Why is wheel odometry good short-term but drifts long-term?
What do covariance ellipses represent?
Can sensor fusion eliminate drift completely?
Hands-On Challenge
Task: Quantify fusion improvement
Requirements:
Drive predetermined 10-meter path (complex: straight, turns, curves)
Record /odom and /odometry/filtered
Measure final position error for both
Calculate improvement percentage
Plot both trajectories in RViz
Generate report with screenshots and data
Bonus:
Test on multiple surface types
Compare at different speeds
Intentionally induce wheel slip (wet floor), measure fusion's correction
Part 11: What You've Learned
✅ Congratulations!
You now understand:
Sensor Fusion Fundamentals:
✅ Why fusion needed (complementary sensor strengths)
✅ Kalman Filter principles (prediction + update)
✅ State estimation and covariance
✅ How EKF combines wheel odom + IMU
Practical Skills:
✅ Comparing fused vs unfused odometry
✅ Measuring drift reduction (~40-60%)
✅ Visualizing uncertainty (covariance ellipses)
✅ Configuring robot_localization
✅ Tuning EKF parameters
Advanced Concepts:
✅ Process noise vs measurement noise
✅ Reference frames (odom vs map)
✅ Quaternions and orientations
✅ When fusion isn't enough (need SLAM)
Next Steps
🎯 You're Now Ready For:
Immediate Next: → SLAM Mapping - Create maps to provide absolute reference
Advanced Navigation:
→ Localization - Use a known map to eliminate drift
→ Autonomous Navigation - Navigate with fused odometry
Research Topics:
Multi-sensor fusion (add cameras, GPS)
Adaptive filtering (change parameters dynamically)
Fault detection (identify sensor failures)
Quick Reference
Essential Fusion Commands
`ros2 topic echo /odometry/filtered --once` - inspect the fused estimate
`ros2 topic hz /odom` and `ros2 topic hz /imu/data` - verify input rates
`ros2 node list` - confirm the EKF node is running
`ros2 param list <ekf_node_name>` - list its loaded parameters
Typical Error Reductions

| Test | Unfused | Fused | Reduction |
|---|---|---|---|
| 5m straight | ±3-5% | ±2-3% | ~40% |
| 2m square | 15-20 cm | 6-10 cm | ~50% |
| 360° rotation | ±5-8° | ±2-3° | ~60% |
| 10m complex path | 20-30 cm | 8-15 cm | ~55% |
Completed Sensor Fusion with EKF! 🎉
→ Continue to SLAM Mapping → Or return to Tutorial Index
Last Updated: January 2026
Tutorial 7 of 11 - Advanced Level
Estimated completion time: 130 minutes