
Experimental Validation: Perception and Localization Systems for Autonomous Vehicles using the Extended Kalman Filter Algorithm

Open Access | Feb 2024

Figures & Tables

Figure 1:

The hardware wiring and schematics.

Figure 2:

The physical appearance of the robot.

Figure 3:

The physical-world and measurement scenario.

Figure 4:

Encoders' schematic diagram.

Figure 5:

The localization and perception block diagram. ROS, Robot Operating System.

Figure 6:

The map generated using the Hector-SLAM algorithm. SLAM, simultaneous localization and mapping.

Figure 7:

Comparison of EKF position estimation results with references. During the experiment, we focus on two coordinates as the mobile robot moves—namely, the X and Y coordinates. Label A represents the robot's motion along the X coordinate, while label B corresponds to its movement along the Y coordinate. EKF, extended Kalman filter.

Figure 8:

RViz visualization.

The EKF parameter details

Parameter | Information
$\hat{\beta}_k^-$ | The predicted (a priori) state estimate
$f$ | Non-linear system model
$h$ | Measurement model
$v_k$ | Measurement noise
$z_k$ | IMU measurement
$W_{k-1}$ | Dynamic system noise
$\hat{\beta}_{k-1}^+$ | The updated (a posteriori) state estimate from the previous step
$V$ | The linear velocity of the robot
$\Delta t$ | The sampling time step
$\theta$ | The steering angle of the robot
$Q_k$ | Dynamic system noise matrix, $Q = \mathrm{diag}(10^{-4},\, 10^{-4},\, 10^{-4},\, 10^{-4},\, 10^{-4})$
$H_k$ | Measurement matrix, $H_{\text{odometry+IMU}} = I_{5\times 5}$ (the $5 \times 5$ identity matrix)
$R_k$ | Measurement noise matrix, $R = \mathrm{diag}(10^{-4},\, 10^{-4},\, 10^{-4},\, 10^{-4},\, 10^{-4})$
$F_k$ | Jacobian matrix,
$$F_k = \begin{bmatrix} 1 & 0 & -V_{k-1}\Delta t \sin\theta_k & \Delta t \cos\theta_k & 0 \\ 0 & 1 & V_{k-1}\Delta t \cos\theta_k & \Delta t \sin\theta_k & 0 \\ 0 & 0 & 1 & 0 & \Delta t \\ 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 1 \end{bmatrix}$$
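As a rough sketch (not the authors' implementation), the matrices in the table above can be assembled with NumPy. The state-vector ordering [x, y, θ, V, ω] is an assumption inferred from the structure of the Jacobian, not stated explicitly in the table:

```python
import numpy as np

def ekf_matrices(v_prev, theta, dt):
    """Build the EKF matrices listed in the parameter table.

    Assumed state vector (inferred from the Jacobian's structure):
    [x, y, theta, V, omega].
    """
    # Dynamic-system and measurement noise: diagonal 1e-4 (from the table)
    Q = np.eye(5) * 1e-4
    R = np.eye(5) * 1e-4
    # Odometry+IMU measurement matrix: the 5x5 identity
    H = np.eye(5)
    # Jacobian of the motion model, evaluated at (v_prev, theta)
    F = np.array([
        [1, 0, -v_prev * dt * np.sin(theta), dt * np.cos(theta), 0],
        [0, 1,  v_prev * dt * np.cos(theta), dt * np.sin(theta), 0],
        [0, 0, 1, 0, dt],
        [0, 0, 0, 1, 0],
        [0, 0, 0, 0, 1],
    ])
    return Q, R, H, F
```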

The steps of the EKF-SLAM algorithm

EKF-SLAM Algorithm
  • Initialize the previous state estimate (b_prev) and the previous error covariance (p_prev) to 0.

  • Compute the predicted state (b_new) based on Eq. (1).

  • Compute the prediction error covariance (p_new) based on Eq. (2).

  • Obtain the optimal Kalman gain (K) based on Eq. (3).

  • Obtain the updated state estimate (b) based on Eq. (4).

  • The updated state estimate is the output of the EKF that is displayed.

  • Obtain the updated error covariance (P) based on Eq. (5).

  • The updated state estimate is stored as the previous estimate, and the state-estimation algorithm returns to step 2.

  • The updated error covariance is stored as the prior covariance, and the error-covariance algorithm returns to step 3.
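The steps above can be sketched as one generic EKF predict/update cycle (an illustrative outline under the usual EKF equations, not the authors' code; `f`, `h`, and the matrices correspond to the parameter table):

```python
import numpy as np

def ekf_step(b_prev, p_prev, z, f, F, h, H, Q, R):
    """One predict/update cycle of the EKF, following the listed steps."""
    # Step 2: predicted state, Eq. (1)
    b_new = f(b_prev)
    # Step 3: prediction error covariance, Eq. (2)
    p_new = F @ p_prev @ F.T + Q
    # Step 4: optimal Kalman gain, Eq. (3)
    K = p_new @ H.T @ np.linalg.inv(H @ p_new @ H.T + R)
    # Step 5: updated state estimate, Eq. (4)
    b = b_new + K @ (z - h(b_new))
    # Step 7: updated error covariance, Eq. (5)
    p = (np.eye(len(b)) - K @ H) @ p_new
    # Steps 8-9: the updated values become the "previous" ones next cycle
    return b, p
```

In a running filter, the returned `b` and `p` are fed back in as `b_prev` and `p_prev` on the next measurement.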

The comparison of the encoder results and the reference in the X-axis

X position from odometry | X reference | Error
0.95 | 0.97 | 0.02
0.95 | 0.97 | 0.03
0.95 | 0.97 | 0.03
0.95 | 0.97 | 0.03
0.95 | 0.98 | 0.03
0.95 | 0.98 | 0.04
0.95 | 0.98 | 0.04
0.95 | 0.99 | 0.04
0.95 | 0.99 | 0.04
0.95 | 1.00 | 0.05

Error in average (%): 3

The details of robot parameters

Parameter | Value
Battery | 12.6 V (DC)
Dimensions | 242.9 × 192.2 mm
Steering servo | 9 kg·cm torque
Wheel motor | 240 RPM
Wireless | 2.4 GHz/5 GHz dual-band Wi-Fi, Bluetooth 4.2
Driving type | Ackermann steering, dual-gearmotor rear-wheel drive

Comparison between the EKF positions and the reference

X position (EKF) | X reference | Y position (EKF) | Y reference | Error X | Error Y
0.97 | 0.97 | 0.02 | 0.02 | 0.00 | 0.00
0.98 | 0.97 | 0.02 | 0.02 | 0.01 | 0.00
0.98 | 0.97 | 0.02 | 0.02 | 0.01 | 0.01
0.98 | 0.97 | 0.02 | 0.02 | 0.01 | 0.01
0.99 | 0.98 | 0.02 | 0.02 | 0.01 | 0.02
0.99 | 0.98 | 0.02 | 0.02 | 0.01 | 0.04
1.00 | 0.98 | 0.02 | 0.02 | 0.02 | 0.04
1.01 | 0.99 | 0.02 | 0.02 | 0.02 | 0.04
1.01 | 0.99 | 0.02 | 0.02 | 0.02 | 0.04
1.02 | 1.00 | 0.02 | 0.02 | 0.02 | 0.05

Error in average (%) | 1 | 3

RMSE | 0.11 | 0.15
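The average-error and RMSE figures of the kind reported in these tables follow from the standard definitions; a minimal sketch (illustrative only, not the paper's evaluation script):

```python
import math

def avg_error_pct(errors):
    """Mean absolute error expressed as a percentage
    (errors in meters, reference scale ~1 m as in the tables)."""
    return 100 * sum(abs(e) for e in errors) / len(errors)

def rmse(estimates, references):
    """Root-mean-square error between estimates and references."""
    return math.sqrt(
        sum((a - b) ** 2 for a, b in zip(estimates, references))
        / len(estimates)
    )
```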

Accelerometer measurements

X | Y | X reference | Y reference | X error | Y error
0.02 | −0.02 | 0 | 0 | 0.02 | −0.02
0.01 | −0.01 | 0 | 0 | 0.01 | −0.01
−0.03 | −0.02 | 0 | 0 | −0.03 | −0.02
−0.06 | −0.02 | 0 | 0 | −0.06 | −0.02
−0.07 | −0.02 | 0 | 0 | −0.07 | −0.02
−0.06 | 0.01 | 0 | 0 | −0.06 | 0.01
−0.06 | −0.03 | 0 | 0 | −0.06 | −0.03
0.01 | 0.01 | 0 | 0 | 0.01 | 0.01
−0.01 | 0.00 | 0 | 0 | −0.01 | 0.00
−0.06 | −0.02 | 0 | 0 | −0.06 | −0.02

Error in average (%) | 4 | 2

The maximum and minimum distance values of the LiDAR data

 | Maximum (m) | Minimum (m)
Measured data | 6.808 | 0.15
Actual value (reference) | 6.54 | 0.09
Error (%) | 4 | 67
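The percentage errors here are consistent with a simple relative-error calculation (a sketch assuming error = |measured − reference| / reference):

```python
def relative_error_pct(measured, reference):
    """Relative error as a percentage: |measured - reference| / reference."""
    return 100 * abs(measured - reference) / reference

# LiDAR maximum and minimum range readings from the table above
max_err = relative_error_pct(6.808, 6.54)  # ~4%
min_err = relative_error_pct(0.15, 0.09)   # ~67%
```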

Gyroscope orientation measurements

X orientation | Y orientation | X reference | Y reference | X error | Y error
−0.01 | −0.01 | 0 | 0 | −0.01 | −0.01
−0.01 | −0.01 | 0 | 0 | −0.01 | −0.01
−0.01 | 0.02 | 0 | 0 | −0.01 | 0.02
−0.01 | 0.03 | 0 | 0 | −0.01 | 0.03
−0.01 | 0.03 | 0 | 0 | −0.01 | 0.03
0.00 | 0.03 | 0 | 0 | 0.00 | 0.03
−0.01 | 0.03 | 0 | 0 | −0.01 | 0.03
0.00 | 0.00 | 0 | 0 | 0.00 | 0.00
0.00 | 0.00 | 0 | 0 | 0.00 | 0.00
−0.01 | 0.03 | 0 | 0 | −0.01 | 0.03

Errors in average (%) | 1 | 2

The comparison of the encoder results and the reference in the Y-axis

Y position from odometry | Y reference | Error
0.02 | 0.02 | 0.00
0.02 | 0.02 | 0.00
0.02 | 0.02 | 0.00
0.02 | 0.02 | 0.00
0.02 | 0.02 | 0.00
0.01 | 0.02 | 0.01
0.01 | 0.02 | 0.01
0.02 | 0.02 | 0.02
0.07 | 0.02 | 0.05
0.07 | 0.02 | 0.05

Error in average (%): 2
Language: English
Submitted on: Aug 28, 2023
|
Published on: Feb 7, 2024

© 2024 Bambang Lelono Widjiantoro, Katherin Indriawati, T. S. N. Alexander Buyung, Kadek Dwi Wahyuadnyana, published by Professor Subhas Chandra Mukhopadhyay
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.