Official implementation of our paper "GRIL-Calib: Targetless Ground Robot IMU-LiDAR Extrinsic Calibration Method using Ground Plane Motion Constraints".
ArXiv: https://arxiv.org/abs/2312.14035
IEEE: https://ieeexplore.ieee.org/document/10506583 ...
Targetless IMU-LiDAR extrinsic calibration methods are gaining significant attention as the importance of IMU-LiDAR fusion systems increases. Notably, existing calibration methods derive the calibration parameters under the assumption of sufficient excitation, i.e., that the sensors undergo full motion along all axes. When the IMU and LiDAR are...
Run the calibration: ./src/lidar_IMU_calib/calib.sh
The options in calib.sh have the following meaning:
- bag_path: path to the dataset.
- imu_topic: IMU topic.
- bag_start: the relative start time of the rosbag [s].
- bag_durr: the duration for data association [s].
...
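For clarity, the sketch below is not part of the GRIL-Calib repository; the bag path, IMU topic, and numeric values are placeholders. It only illustrates how bag_start and bag_durr define a relative time window of the rosbag, here used to count the IMU messages that fall inside that window via the rosbag Python API.

```python
# Illustrative only: shows the meaning of bag_start / bag_durr as a relative
# time window of the rosbag. Path, topic, and values are placeholders.
import rosbag
import rospy

bag_path = "/path/to/dataset.bag"   # path to the dataset (placeholder)
imu_topic = "/imu/data"             # IMU topic (placeholder)
bag_start = 0.0                     # relative start time of the rosbag [s]
bag_durr = 30.0                     # duration for data association [s]

bag = rosbag.Bag(bag_path)
try:
    t0 = bag.get_start_time()                              # absolute bag start [s]
    begin = rospy.Time.from_sec(t0 + bag_start)            # window start
    end = rospy.Time.from_sec(t0 + bag_start + bag_durr)   # window end
    count = sum(1 for _ in bag.read_messages(topics=[imu_topic],
                                             start_time=begin,
                                             end_time=end))
    print("IMU messages in the selected window:", count)
finally:
    bag.close()
```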
Figure 3. Our system includes the following sensors: a Velodyne Alpha Prime LiDAR, a LadyBug-5 camera, 6 FLIR ADK cameras, an Ouster-128 LiDAR, an Ouster-64 LiDAR, and a Hesai Pandar LiDAR.
Figure 4. Visualization of stitching 360° thermal images and 360° RGB images.
Our calibration process is outlined in Figure 5...
After that, we enhance sensor calibration by removing the effects of ego-motion through motion compensation across LiDARs, RGB cameras, and thermal cameras, which significantly improves the accuracy of targetless calibration methods. To achieve this, we estimate velocity from the LiDAR data and use this ...
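As a rough illustration of what such velocity-based compensation can look like, the sketch below deskews one LiDAR sweep under a constant linear and angular velocity model with per-point relative timestamps; these modelling details are assumptions for illustration, not taken from the text above.

```python
# Illustrative ego-motion compensation (deskewing) of a single LiDAR sweep,
# assuming constant linear velocity v and angular velocity w over the sweep.
import numpy as np

def so3_exp(phi):
    """Rodrigues' formula: rotation matrix for rotation vector phi (rad)."""
    theta = np.linalg.norm(phi)
    if theta < 1e-12:
        return np.eye(3)
    a = phi / theta
    K = np.array([[0.0, -a[2], a[1]],
                  [a[2], 0.0, -a[0]],
                  [-a[1], a[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def deskew_sweep(points, t_rel, v, w):
    """Express every point in the sensor frame at the start of the sweep.

    points : (N, 3) points in the sensor frame at their capture time
    t_rel  : (N,)   per-point time since the start of the sweep [s]
    v      : (3,)   estimated linear velocity of the sensor [m/s]
    w      : (3,)   estimated angular velocity of the sensor [rad/s]
    """
    out = np.empty_like(points)
    for i in range(points.shape[0]):
        dt = t_rel[i]
        R = so3_exp(w * dt)          # rotation accumulated since sweep start
        t = v * dt                   # translation accumulated since sweep start
        out[i] = R @ points[i] + t   # map the point into the sweep-start frame
    return out
```

In a full pipeline the velocity estimate would come from the LiDAR data itself and an analogous correction would be applied at each camera exposure time; the constant-velocity model here is only the simplest possible choice.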