LiDAR Inertial Navigation System

What is LINS?

LINS stands for LiDAR Inertial Navigation System. I created this toolkit while working for IAT Co., Ltd.

LINS is designed for mechanically rotating multi-laser arrays (e.g. the Velodyne VLP-16) and low-cost IMUs (e.g. the ADIS16488). GPS is optional: it acts as a global constraint provider in the Mapping scenario and provides the initial position in the Localization scenario.

How does LINS work?

LINS has two working modes: Mapping mode and Localization mode. Its data sources are a synchronized LiDAR point cloud, a raw IMU data stream, and (optionally) GPS positions. Our device uses a hardware-level synchronization solution built around an FPGA chip, so all data are stamped against the same time reference.
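For concreteness, the sketch below shows what time-stamped sensor packets sharing one clock might look like; the type and field names are illustrative, not LINS's actual message definitions.

```cpp
// Minimal sketch of time-stamped sensor packets sharing one clock. Type and
// field names are illustrative, not the actual LINS message definitions.
#include <vector>

struct LidarPoint { float x, y, z, intensity; double t; };  // per-point time

struct LidarFrame {
  double stamp;                    // FPGA time reference, seconds
  std::vector<LidarPoint> points;  // one sweep of the scanner
};

struct ImuSample {
  double stamp;                    // same FPGA time reference
  double gyro[3];                  // angular rate, rad/s
  double accel[3];                 // acceleration, m/s^2
};

struct GpsFix {
  double stamp;                    // same FPGA time reference
  double lat, lon, alt;            // WGS-84
};
```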

Mapping

In Mapping mode, LINS uses the IMU to compensate the vehicle's motion and produce a rectified LiDAR point cloud frame. The rectified point cloud is then used to estimate the vehicle state, and in turn the estimation result is used to refine the motion compensation. The state estimation algorithm was inspired by LOAM [1], developed by Dr. Ji Zhang, a Ph.D. from CMU.
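As a rough illustration of the compensation step, the sketch below deskews one sweep by interpolating the IMU-integrated motion to each point's capture time. It assumes constant velocity over a sweep, and all names are hypothetical; the actual LINS implementation may differ.

```cpp
// Sketch of IMU-based motion compensation (deskewing) under a constant-
// velocity assumption over one sweep. Names are illustrative only.
#include <Eigen/Geometry>
#include <vector>

struct Pt {
  Eigen::Vector3f p;  // point in the sensor frame where it was captured
  float s;            // normalized capture time within the sweep, in [0,1)
};

void deskew(std::vector<Pt>& sweep,
            const Eigen::Quaternionf& dq,  // IMU-integrated rotation over the sweep
            const Eigen::Vector3f& dp) {   // IMU-integrated translation over the sweep
  const Eigen::Isometry3f T_end = Eigen::Translation3f(dp) * dq;
  for (auto& pt : sweep) {
    // Interpolate the sensor pose at the point's capture time...
    Eigen::Quaternionf q = Eigen::Quaternionf::Identity().slerp(pt.s, dq);
    Eigen::Isometry3f T_pt = Eigen::Translation3f(pt.s * dp) * q;
    // ...and re-express the point in the frame at the end of the sweep.
    pt.p = (T_end.inverse() * T_pt) * pt.p;
  }
}
```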

After several trajectories have been built, they are merged using global constraints (if GPS is available) and relative constraints (loop-closure constraints, within or between trajectories), which produces the point cloud map used for localization.
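Conceptually, this merging step is a pose-graph optimization. The sketch below expresses it with GTSAM purely for illustration; the post does not say which backend LINS uses, and the poses and noise values are made up.

```cpp
// Hypothetical sketch of trajectory merging as a pose graph; LINS's actual
// backend is not specified here. Values are toy numbers.
#include <gtsam/geometry/Pose3.h>
#include <gtsam/nonlinear/LevenbergMarquardtOptimizer.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/nonlinear/Values.h>
#include <gtsam/slam/BetweenFactor.h>
#include <gtsam/slam/PriorFactor.h>

using namespace gtsam;

int main() {
  NonlinearFactorGraph graph;
  Values initial;

  auto priorNoise = noiseModel::Isotropic::Sigma(6, 0.1);
  auto odomNoise  = noiseModel::Isotropic::Sigma(6, 0.05);

  // Global constraint: a GPS-derived anchor on the first pose.
  graph.add(PriorFactor<Pose3>(0, Pose3(), priorNoise));
  // Relative (odometry) constraints along the trajectory.
  graph.add(BetweenFactor<Pose3>(0, 1, Pose3(Rot3(), Point3(1, 0, 0)), odomNoise));
  graph.add(BetweenFactor<Pose3>(1, 2, Pose3(Rot3(), Point3(1, 0, 0)), odomNoise));
  // Loop-closure constraint: pose 2 re-observes pose 0.
  graph.add(BetweenFactor<Pose3>(2, 0, Pose3(Rot3(), Point3(-2, 0, 0)), odomNoise));

  initial.insert(0, Pose3());
  initial.insert(1, Pose3(Rot3(), Point3(1.1, 0.1, 0)));
  initial.insert(2, Pose3(Rot3(), Point3(2.2, -0.1, 0)));

  Values result = LevenbergMarquardtOptimizer(graph, initial).optimize();
  result.print("merged poses:\n");
}
```

The videos below show several different datasets processed by LINS.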

Mapping @ Indoor

Motion Compensation (red: raw point cloud; green: compensated)

Mapping @ SZU (S.C.)

Mapping @ SUSTC

Multi-scan route merging

Localization

In Localization mode, LINS needs a start point to initialize the system. Currently, we use a dual-antenna GPS to obtain an initial position and yaw angle. This part can be replaced by other approaches such as A-GPS, cellular network positioning, manual input, WiFi fingerprinting, etc.
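As an illustration, the initialization can reduce to converting the GPS fix into map coordinates and seeding the yaw from the dual-antenna baseline. The flat-earth conversion below is a simplification, and all names are hypothetical.

```cpp
// Sketch of seeding the localization state from a dual-antenna GPS fix,
// using a flat-earth approximation around the map origin.
#include <cmath>

struct State { double x, y, z, yaw; };  // local map frame, yaw in radians

State initFromGps(double lat, double lon, double alt, double yaw,
                  double lat0, double lon0, double alt0) {
  constexpr double kEarthRadius = 6378137.0;  // WGS-84 equatorial radius, m
  const double d2r = M_PI / 180.0;
  State s;
  s.x = (lon - lon0) * d2r * kEarthRadius * std::cos(lat0 * d2r);  // east
  s.y = (lat - lat0) * d2r * kEarthRadius;                         // north
  s.z = alt - alt0;                                                // up
  s.yaw = yaw;  // heading from the dual-antenna baseline
  return s;
}
```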

The compensator is also applied to each point cloud frame to eliminate motion distortion. Given the start point, LINS loads the known environmental data from the point cloud map on demand, and the tracking algorithm continuously keeps the vehicle's pose consistent with the map.
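A simple way to picture the on-demand loading is a tile cache keyed by the current position; the sketch below is an assumption about the mechanism, not LINS's actual map format.

```cpp
// Sketch of on-demand map loading: keep only the tiles around the current
// pose in memory. Tile size, file layout, and names are assumptions.
#include <cmath>
#include <map>
#include <utility>
#include <vector>

struct Tile { std::vector<float> xyz; };  // flattened x,y,z triples

Tile loadTileFromDisk(int ix, int iy) {   // stub: real code reads map files
  return Tile{};
}

class TiledMap {
 public:
  explicit TiledMap(double tile_size) : tile_size_(tile_size) {}

  // Return the tile containing (x, y), loading it from disk on first use.
  const Tile& query(double x, double y) {
    auto key = std::make_pair(static_cast<int>(std::floor(x / tile_size_)),
                              static_cast<int>(std::floor(y / tile_size_)));
    auto it = cache_.find(key);
    if (it == cache_.end())
      it = cache_.emplace(key, loadTileFromDisk(key.first, key.second)).first;
    return it->second;
  }

 private:
  double tile_size_;
  std::map<std::pair<int, int>, Tile> cache_;
};
```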

In this stage, a multi-sensor fusion EKF [2] (thanks to the ETHZ ASL lab for their great work!) is used to synthesize the different sources of localization information into an optimal state estimate. This fusion module makes the system more robust to the failure of any single sensor. Moreover, if additional sensors such as visual SLAM are attached to the system, they are easy to integrate into LINS. A further important advantage is that the fusion filter can output poses at the IMU frequency, which is critical for high-speed vehicle localization.
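The IMU-rate output falls out of the standard predict/update split: the filter predicts with every IMU sample (high rate) and corrects whenever a LiDAR/map match arrives (low rate). The toy 1-D filter below shows only this pattern; ethzasl_msf's real state includes the full pose, velocity, and IMU biases.

```cpp
// Toy 1-D EKF illustrating IMU-rate prediction with low-rate corrections.
// State: position x and velocity v; all noise values are made up.
#include <cstdio>

struct Ekf {
  double x = 0.0, v = 0.0;
  double P[2][2] = {{1, 0}, {0, 1}};

  // Prediction at IMU rate: integrate acceleration, inflate covariance.
  void predict(double accel, double dt, double q = 0.01) {
    x += v * dt + 0.5 * accel * dt * dt;
    v += accel * dt;
    P[0][0] += dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q;
    P[0][1] += dt * P[1][1];
    P[1][0] += dt * P[1][1];
    P[1][1] += q;
  }

  // Correction at LiDAR rate: fuse a position measurement from map tracking.
  void update(double z, double r = 0.05) {
    double s = P[0][0] + r;                      // innovation covariance
    double k0 = P[0][0] / s, k1 = P[1][0] / s;   // Kalman gain
    double innov = z - x;
    x += k0 * innov;
    v += k1 * innov;
    double p00 = P[0][0], p01 = P[0][1];
    P[0][0] -= k0 * p00; P[0][1] -= k0 * p01;
    P[1][0] -= k1 * p00; P[1][1] -= k1 * p01;
  }
};

int main() {
  Ekf ekf;
  for (int i = 0; i < 100; ++i) {
    ekf.predict(0.1, 0.005);            // 200 Hz IMU prediction
    if (i % 20 == 19) ekf.update(0.0);  // 10 Hz LiDAR/map correction
  }
  std::printf("x=%.3f v=%.3f\n", ekf.x, ekf.v);
}
```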

The video below shows the localization algorithm working on the SZU campus.

Localization @ SZU (N.C.)

How is LINS doing on the road?

Want more videos of how LINS works in real scenarios? Take a look at these and have fun.

LINS on Electric Car

LINS turning on the road

LINS running at 100 km/h

[1] Zhang, Ji, and Sanjiv Singh. "Low-drift and real-time lidar odometry and mapping." Autonomous Robots 41.2 (2017): 401-416.
[2] https://github.com/ethz-asl/ethzasl_msf . I have a modified version which does not depend on ROS libs: https://github.com/sxsong1207/ethzasl_msf_noros