February, 2024

Extended Kalman Filter SLAM

Implementation of kinematics, odometry, simulation, and an EKF for SLAM from scratch.

slam
ros2
c++
rviz2
turtlebot
lidar
extended kalman filter
probability
git
simulation

Overview

Implementation of an Extended Kalman Filter for Simultaneous Localization and Mapping in ROS2 and C++ from scratch. This project was carried out over the span of 10 weeks.

This video shows the implementation working in real life. The visualization shows two robots: a blue turtlebot representing the odometry estimate, and a green one representing the EKF estimate. The path corresponding to each robot is also shown. Landmarks are represented by cylinders; the green landmarks are part of the EKF estimate and are extracted from the LiDAR scans, shown in red. There is some latency due to data transmission over a wireless network.

This project's source code is available online.



Kinematics & Control

A C++ library was implemented with helper functions for planar body motion, including transforms and geometric operations. A DiffDrive class implements the forward kinematics and odometry for a differential drive robot.
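
As a minimal sketch of the idea (the names Twist2D and DiffDrive below are illustrative, not necessarily the library's actual API), the forward kinematics maps wheel velocities to a body twist:

```cpp
#include <cmath>

// Illustrative sketch of differential drive forward kinematics.
struct Twist2D {
  double omega;  // angular velocity (rad/s)
  double vx;     // linear velocity along the body x-axis (m/s)
};

class DiffDrive {
 public:
  DiffDrive(double wheel_radius, double track_width)
      : r_(wheel_radius), d_(track_width / 2.0) {}

  // Body twist from left/right wheel velocities (rad/s).
  Twist2D forwardKinematics(double phidot_left, double phidot_right) const {
    Twist2D t;
    t.omega = r_ * (phidot_right - phidot_left) / (2.0 * d_);
    t.vx = r_ * (phidot_right + phidot_left) / 2.0;
    return t;
  }

 private:
  double r_;  // wheel radius (m)
  double d_;  // half the distance between the wheels (m)
};
```

Odometry follows by integrating the resulting twist over each time step.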

The Extended Kalman Filter

The Extended Kalman Filter allows us to estimate the state of the system (the robot pose plus the landmark locations). Our system is nonlinear, so the EKF linearizes the state transition and measurement models and propagates uncertainty through those linearizations.

Briefly, the steps are (written out in standard EKF notation after the list):

  • Predict the state estimate using the motion model.
  • Propagate the uncertainty using the linearized state transition model.
  • Update the estimate using measurements. This is where the LiDAR sensor and landmark detection come in: each measurement further refines the state.
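
In generic EKF notation (not necessarily the exact symbols used in this project's code), with motion model g, measurement model h, and their Jacobians G_t and H_t, these steps are:

```latex
% Predict: propagate the state estimate and covariance through the motion model
\hat{x}_t^- = g(\hat{x}_{t-1}, u_t), \qquad
\Sigma_t^- = G_t \, \Sigma_{t-1} \, G_t^\top + Q_t

% Update: correct the prediction with each measurement z_t
K_t = \Sigma_t^- H_t^\top \left( H_t \Sigma_t^- H_t^\top + R_t \right)^{-1}, \qquad
\hat{x}_t = \hat{x}_t^- + K_t \left( z_t - h(\hat{x}_t^-) \right), \qquad
\Sigma_t = \left( I - K_t H_t \right) \Sigma_t^-
```

Here Q_t and R_t are the process and measurement noise covariances; in EKF SLAM the state x stacks the robot pose and all landmark positions.
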
Landmark detection

The landmark detection pipeline consists of multi-step processing of the raw lidar data (a clustering sketch follows the list):

  • Clustering the lidar scan points.
  • Filtering the clusters by the number of points they contain.
  • Classifying each cluster as circle or not circle.
  • Fitting circles to the clusters that pass the classification.
  • Filtering the results based on our assumptions about the environment and landmarks, e.g. the allowable landmark radius.
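
As a minimal sketch of the clustering step (Point2D and clusterScan are illustrative names, not the project's actual API), consecutive scan points can be grouped whenever the gap between them stays below a threshold:

```cpp
#include <cmath>
#include <vector>

struct Point2D { double x, y; };

// Sketch of lidar clustering: group consecutive scan points whose
// Euclidean distance is below a threshold, exploiting the scan's ordering.
std::vector<std::vector<Point2D>> clusterScan(
    const std::vector<Point2D>& points, double threshold) {
  std::vector<std::vector<Point2D>> clusters;
  for (const auto& p : points) {
    if (!clusters.empty()) {
      const Point2D& last = clusters.back().back();
      if (std::hypot(p.x - last.x, p.y - last.y) < threshold) {
        clusters.back().push_back(p);
        continue;
      }
    }
    clusters.push_back({p});
  }
  // Note: a real implementation would also merge the first and last
  // clusters when the scan wraps around 360 degrees.
  return clusters;
}
```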

Simulation

This video shows the implementation in simulation. The visualization shows three robots: a blue turtlebot representing the odometry estimate, a red one representing the ground truth in the implemented simulation, and a green one representing the EKF estimate. The path corresponding to each robot is also shown.

A full simulation environment was implemented as a ROS2 node. The simulation performs the forward kinematics based on velocity commands and moves the robot accordingly. Noise and wheel slip are introduced by sampling from Gaussian distributions, for a more realistic environment. The environment provides configurable landmarks (cylinders) and walls. A simulated lidar sensor generates measurements based on the robot’s location in the environment, just like its real counterpart.
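
For example (a sketch only; the distribution parameters below are illustrative placeholders, not the project's tuned values), wheel command noise and slip can be injected with the standard library's random number facilities:

```cpp
#include <random>

// Sketch of injecting actuation noise and wheel slip into the simulator.
class WheelNoiseModel {
 public:
  WheelNoiseModel()
      : gen_(std::random_device{}()),
        cmd_noise_(0.0, 0.01),  // additive noise on the commanded velocity
        slip_(1.0, 0.05) {}     // multiplicative slip factor around 1.0

  double apply(double commanded_wheel_velocity) {
    // Perturb the command, then scale by a random slip factor so the wheel
    // does not advance exactly as much as the command implies.
    return (commanded_wheel_velocity + cmd_noise_(gen_)) * slip_(gen_);
  }

 private:
  std::mt19937 gen_;
  std::normal_distribution<double> cmd_noise_;
  std::normal_distribution<double> slip_;
};
```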

A list of features of the simulation environment follows:

  • Landmarks & walls: the environment is enclosed in 4 simulated walls. Configurable cylindrical landmarks allow the user to test the performance on different scenarios.
  • Simulated LiDAR measurements: the EKF relies on measurements extensively. Having implemented the simulated measurements, it is possible to fully develop and test the EKF in simulation. The generated measurements include Gaussian noise.
  • Collisions: the robot can collide with landmarks, the wheels will slip and the robot will glide tangential to the landmark it collided with. This is helpful for creating a large difference between the odometry estimate and the real robot location. Also, real robots do crash!
  • Noise & wheel slip: Besides noise on the simulated LiDAR, real wheels on the real world slip, motors do not respond ideally, so adding random wheel slip makes the environment more realistic.
Real world test

The simulation can only get us so far, especially if we don’t have accurate models of our sensor noise, slip, and motor responses. Getting the filter working on the real robot required a lot of tuning, especially of the noise parameters in the EKF and the parameters of the landmark association.
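
One way to make that tuning manageable is to expose the noise terms as ROS2 parameters, so they can be changed without recompiling. A minimal sketch follows; the node name, parameter names, and default values are hypothetical, not the project's actual configuration:

```cpp
#include <rclcpp/rclcpp.hpp>

// Sketch of declaring tunable EKF noise parameters on a ROS2 node.
class SlamNode : public rclcpp::Node {
 public:
  SlamNode() : Node("slam") {
    // declare_parameter returns the value, taking overrides from the
    // launch file or command line into account.
    process_noise_ = declare_parameter("process_noise", 0.001);
    measurement_noise_ = declare_parameter("measurement_noise", 0.01);
    association_threshold_ = declare_parameter("association_threshold", 0.4);
  }

 private:
  double process_noise_;          // EKF process noise variance
  double measurement_noise_;      // EKF measurement noise variance
  double association_threshold_;  // landmark association distance gate
};
```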

This post’s main video shows the algorithm working on a real Turtlebot 3 Burger. The video shows the data being streamed back to a PC and visualized in Rviz2. The robot is driven around a course and back to its initial position. The wheel odometry ends up pretty far from the actual position, while the SLAM estimate gives a much better idea of where the robot is. The course is fairly simple, and cardboard walls were added because the cluttered room produced many extra landmarks (which can hurt as much as help!).

Qualitative Results

The robot was driven around the course and brought back to its initial pose (eyeballed using a mark on the floor, shown in the main video).

Final SLAM estimate (x, y, theta), in meters and radians: (-0.02, 0.01, -0.06)
Final odometry estimate (x, y, theta), in meters and radians: (0.02, 0.24, -0.67)