Abstract:
Multi-sensor fusion enables a simultaneous localization and mapping (SLAM) system to achieve better performance. Traditional Lidar-only positioning systems suffer from positioning drift or outright failure in scenes with sparse features. To address these problems, a parallel multi-sensor fusion system, LIGNS, based on the iterated error-state Kalman filter (IESKF) is designed. Each sensor updates the vehicle state estimate independently and in real time. LIGNS integrates Lidar, an IMU, and a dual-antenna GNSS receiver that additionally provides heading measurements. LIGNS removes the ground point cloud with a two-step filtering method and then extracts features. The extracted features are kept in a sliding window, which makes the feature point cloud denser and helps cope with feature-sparse scenes. Experimental results show that LIGNS achieves high-precision positioning and mapping with better real-time performance.
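For illustration only, the sketch below is not taken from LIGNS; the class and parameter names (FeatureWindow, max_scans) are hypothetical. It shows one common way to keep the feature clouds of the most recent scans in a fixed-size sliding window so that the accumulated cloud stays dense even when an individual scan is feature-sparse.

```cpp
// Minimal sketch (not the authors' implementation): accumulate the feature
// clouds of the most recent scans in a fixed-size sliding window so the
// point cloud used for matching stays dense when single scans are sparse.
#include <cstddef>
#include <deque>
#include <vector>

struct Point {            // placeholder for a feature point in a common frame
    float x, y, z;
};
using FeatureCloud = std::vector<Point>;

class FeatureWindow {
public:
    explicit FeatureWindow(std::size_t max_scans) : max_scans_(max_scans) {}

    // Add the feature cloud of the newest scan; drop the oldest scan once the
    // window is full so memory and matching cost stay bounded.
    void push(FeatureCloud cloud) {
        window_.push_back(std::move(cloud));
        if (window_.size() > max_scans_) {
            window_.pop_front();
        }
    }

    // Merge all scans in the window into one denser feature cloud.
    FeatureCloud merged() const {
        FeatureCloud out;
        for (const auto& cloud : window_) {
            out.insert(out.end(), cloud.begin(), cloud.end());
        }
        return out;
    }

private:
    std::size_t max_scans_;
    std::deque<FeatureCloud> window_;  // oldest scan at the front
};
```

In a real pipeline, each per-scan feature cloud would first be transformed into a common frame using the current state estimate before merging, and the window length trades cloud density against the drift carried by older scans.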