Welcome to ICCV'23 LiDAR-Inertial SLAM Challenge!
In the LiDAR-inertial track, we exclusively offer access to high-quality LiDAR-inertial datasets sourced from SubT-MRS and TartanAir. These datasets cover challenging conditions such as multi-floor buildings, long corridors, and self-similar environments, spanning both simulation and the real world. For the other two tracks, see here: Visual-Inertial SLAM Challenge and Sensor Fusion SLAM Challenge.
Seize this chance to demonstrate your skills and compete among the finest in the field!
Three separate awards will be given for each track. Join us now to become a vital part of cutting-edge advancements in robotics and sensor fusion! Let your expertise shine in this thrilling competition!
Important Latest Updates:
Hello everyone!
Thank you for participating in the SLAM challenge and trying our dataset!
The first ICCV SLAM Challenge has concluded. We have summarized the winning teams' solutions and provided a detailed analysis of the submitted results. For more insights, please review our Paper.
In response to your requests, we have open-sourced our Ground Truth Trajectory data to facilitate easier evaluation. This will be valuable for your research and benchmarking.
Looking ahead, we will release more datasets with Ground Truth Maps. Stay tuned! To stay informed of the latest updates, be sure to sign up for our mailing list.
Thank you for your participation and contribution to the challenge! ^^
Feel free to check our Paper and cite our work if it is useful ^ ^.
@InProceedings{Zhao_2024_CVPR,
author = {Zhao, Shibo and Gao, Yuanjun and Wu, Tianhao and Singh, Damanpreet and Jiang, Rushan and Sun, Haoxiang and Sarawata, Mansi and Qiu, Yuheng and Whittaker, Warren and Higgins, Ian and Du, Yi and Su, Shaoshu and Xu, Can and Keller, John and Karhade, Jay and Nogueira, Lucas and Saha, Sourojit and Zhang, Ji and Wang, Wenshan and Wang, Chen and Scherer, Sebastian},
title = {SubT-MRS Dataset: Pushing SLAM Towards All-weather Environments},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2024},
pages = {22647-22657}
}
Previous Updates:
- To facilitate algorithm debugging, we allow multiple daily submissions until 20 Sept. Starting from 21 Sept, only one submission per day is allowed.
- We've observed an issue with the extrinsics of three trajectories: Lidar_factory, Lidar_ocean, and Lidar_sewerage. Please download the calibration file again. We are very sorry for any inconvenience caused.
- We've observed an issue with one trajectory released in the LiDAR-inertial track. The Final_Challenge_UGV3 trajectory recorded by ROS bags is about 80 seconds shorter than the same trajectory recorded in the folder-format data. To fix this, we have updated the ZIP file containing the ROS bags of that trajectory to match the length of the folder-format data. If you have already downloaded that ZIP file, you can avoid re-downloading many gigabytes by downloading only the last two missing ROS bag files here. We are very sorry for any inconvenience caused.
If you have any questions, please do not hesitate to post issues on this GitHub website. We would love to hear your feedback! Every post will receive a response within 36 hours.
Please note the updated challenge deadline: 15th October 2023, 11:59 PM EDT.
File structure:
rosbag
├── TartanAir_lidar_{places ...}_noise0.bag
└── SubT_MRS_{trajectory names ...}_{robot types ...}.zip
    └── (zipped) raw_data_{...}yyyy-mm-dd-hh-mm-ss{...}.bag
folder
├── TartanAir_lidar_{places ...}.zip
│   ├── (zipped) imu
│   │   └── [acc/gyro/imu/imu_time].[npy/txt]
│   └── (zipped) lidar
│       ├── {...}_lcam_front_lidar.ply
│       └── timestamps.txt
└── SubT_MRS_{trajectory names ...}_{robot types ...}.zip
    ├── (zipped) imu
    │   └── imu_data.csv
    ├── (zipped) lidar
    │   ├── {...}.las
    │   └── timestamps.txt
    └── (zipped) tf
        └── tf_data.csv
SubT-MRS Datasets
- Multiple Modalities: Our dataset includes hardware time-synchronized data from 4 RGB cameras, 1 LiDAR, 1 IMU, and 1 thermal camera, providing diverse and precise sensor inputs.
- Diverse Scenarios: Collected from multiple locations, the dataset exhibits varying environmental setups, encompassing indoors, outdoors, mixed indoor-outdoor, underground, off-road, and buildings, among others.
- Multi-Degraded: By incorporating multiple sensor modalities and challenging conditions like fog, snow, smoke, and illumination changes, the dataset introduces various levels of sensor degradation.
- Heterogeneous Kinematic Profiles: The SubT-MRS Dataset uniquely features time-synchronized sensor data from diverse vehicles, including RC cars, legged robots, drones, and handheld devices, each operating within distinct speed ranges.
TartanAir Dataset
This benchmark is based on the TartanAir dataset, collected in photo-realistic simulation environments built on the AirSim project. A special goal of this dataset is to focus on challenging environments with changing light conditions, adverse weather, and dynamic objects. The four most important features of our dataset are:
- Large-scale, diverse, realistic data. We collect data in diverse environments with different styles, covering indoor/outdoor scenes, different weather, different seasons, and urban/rural settings.
- Multimodal ground-truth labels. We provide RGB stereo, depth, optical flow, and semantic segmentation images, which facilitate the training and evaluation of various visual SLAM methods.
- Diversity of motion patterns. Our dataset covers far more diverse motion combinations in 3D space, which are significantly more difficult than those in existing datasets.
- Challenging scenes. We include challenging scenes with difficult lighting conditions, day-night alternation, low illumination, weather effects (rain, snow, wind, and fog), and seasonal changes.

Please refer to the TartanAir Dataset and the paper for more information.
Folder structure inside the Tartan Air dataset:
lidar_envname
├── lidar                            # LiDAR folder
│   ├── timestamps.txt               # LiDAR timestamps
│   ├── 000000_lcam_front_lidar.png  # LiDAR scan 000000
│   ├── 000001_lcam_front_lidar.png  # LiDAR scan 000001
│   ├── ... ...
│   └── 000xxx_lcam_front_lidar.png  # LiDAR scan 000xxx
│
└── imu                              # IMU folder
    ├── acc.npy                      # IMU acceleration
    ├── acc.txt                      # IMU acceleration
    ├── gyro.npy                     # IMU gyroscope
    ├── gyro.txt                     # IMU gyroscope
    ├── imu.npy                      # IMU acceleration and gyroscope
    ├── imu.txt                      # IMU acceleration and gyroscope
    ├── imu_time.npy                 # IMU timestamp
    └── imu_time.txt                 # IMU timestamp
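The `.npy` arrays above can be read with plain NumPy. The sketch below fabricates a tiny IMU sequence in a temporary directory to stand in for a downloaded one; the 100 Hz rate, the placeholder values, and the N×3 array shapes are our assumptions, so verify them against the real files.

```python
# Sketch: reading the TartanAir IMU arrays with NumPy.
# Synthetic stand-in data; shapes and rate are assumptions, not the spec.
import os
import tempfile
import numpy as np

root = tempfile.mkdtemp()
imu_dir = os.path.join(root, "imu")
os.makedirs(imu_dir)

# Fabricate a 100 Hz, 1-second IMU stream (placeholder values, not real data).
n = 100
acc = np.tile([0.0, 0.0, 9.81], (n, 1))   # accelerometer, m/s^2
gyro = np.zeros((n, 3))                   # gyroscope, rad/s
t = np.arange(n) / 100.0                  # timestamps, seconds
np.save(os.path.join(imu_dir, "acc.npy"), acc)
np.save(os.path.join(imu_dir, "gyro.npy"), gyro)
np.save(os.path.join(imu_dir, "imu_time.npy"), t)

# Load the arrays back the way a SLAM front end might.
acc_l = np.load(os.path.join(imu_dir, "acc.npy"))
gyro_l = np.load(os.path.join(imu_dir, "gyro.npy"))
time_l = np.load(os.path.join(imu_dir, "imu_time.npy"))
assert acc_l.shape == gyro_l.shape == (n, 3)
assert time_l.shape == (n,)
```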
Download
ROS bag format: Google Baidu
Folder format: Google Baidu
Ground Truth: Google Baidu
Name | Source | Location | Robot | Sensor | Description | Trajectory Length (m) | Duration (s) | Video | Calibration (Extrinsics) |
---|---|---|---|---|---|---|---|---|---|
Final_Challenge_UGV1 | SubT-MRS | | UGV1 | LiDAR, IMU | Geometry Degraded | 441.86 | 1600 | link | Google Baidu |
Final_Challenge_UGV2 | SubT-MRS | | UGV2 | LiDAR, IMU | Geometry Degraded | 493.67 | 3390 | link | Google Baidu |
Final_Challenge_UGV3 | SubT-MRS | | UGV3 | LiDAR, IMU | Geometry Degraded | 593.79 | 1714 | link | Google Baidu |
Urban_Challenge_UGV1 | SubT-MRS | | UGV1 | LiDAR, IMU | Geometry Degraded | 124.92 | 513 | link | Google Baidu |
Urban_Challenge_UGV2 | SubT-MRS | | UGV2 | LiDAR, IMU | Geometry Degraded | 1377.37 | 3120 | link | Google Baidu |
Laurel_Cavern | SubT-MRS | | Handheld | LiDAR, IMU | Underground Cave | 490.46 | 960 | link | Google Baidu |
Lidar_factory | TartanAir | | Virtual Sensors | LiDAR, IMU | Snow | 640 | 160.7 | | Google Baidu |
Lidar_ocean | TartanAir | | Virtual Sensors | LiDAR, IMU | Dynamic Objects | 425 | 127.5 | | Google Baidu |
Lidar_sewerage | TartanAir | | Virtual Sensors | LiDAR, IMU | Geometry Degraded | 426 | 131.0 | | Google Baidu |
Bonus Tracks
We also provide 3 extra datasets from the Sensor Fusion Challenge as bonuses in the competition. You will receive extra points if you test your algorithm on the Bonus Track and submit the results to us.
Name | Source | Location | Robot | Sensor | Description | Trajectory Length (m) | Duration (s) | Video | Calibration (Extrinsics) | Calibration (Intrinsics) |
---|---|---|---|---|---|---|---|---|---|---|
Long_Corridor | SubT-MRS | Hawkins | RC2 | RGB, LiDAR, IMU | Lidar Degraded | 616.45 | 332 | link | Google Baidu | Google Baidu |
Multi_Floor | SubT-MRS | Hawkins | SP1 | RGB, LiDAR, IMU | Lidar Degraded | 270 | 480 | link | Google Baidu | Google Baidu |
Block_LiDAR | SubT-MRS | Hawkins | SP1 | RGB, LiDAR, IMU | Lidar Degraded | 307.55 | 677 | link | Google Baidu | Google Baidu |
Evaluation
Submissions will be ranked based on Absolute Trajectory Error (ATE) and Relative Pose Error (RPE). Specifically, the ATE and RPE of every trajectory in the LiDAR-inertial track and its bonus track will be evaluated. The final score for a submitted trajectory is assigned according to the interval in which the weighted sum of the ATE and RPE lies.
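As a rough illustration (not the official evaluator, which may apply a full SE(3) trajectory alignment, e.g. with the evo toolbox), the sketch below computes a simplified ATE after anchoring both trajectories at their first pose, plus a length-weighted score built from reciprocal per-trajectory ATEs in the spirit of the leaderboard note. The exact alignment and weighting used by the organizers are our assumptions.

```python
# Simplified ATE and length-weighted leaderboard-style score (assumed formulas).
import numpy as np

def ate_rmse(gt_xyz, est_xyz):
    """RMSE of translational error after anchoring both tracks at the first pose."""
    est_aligned = est_xyz - est_xyz[0] + gt_xyz[0]
    err = np.linalg.norm(gt_xyz - est_aligned, axis=1)
    return float(np.sqrt(np.mean(err ** 2)))

def weighted_score(ates, lengths):
    """Length-weighted mean of reciprocal per-trajectory ATEs."""
    ates = np.asarray(ates, dtype=float)
    lengths = np.asarray(lengths, dtype=float)
    return float(np.sum(lengths / ates) / np.sum(lengths))

gt = np.array([[0, 0, 0], [1, 0, 0], [2, 0, 0]], dtype=float)
est = gt.copy()
est[:, 1] = [0.0, 0.1, 0.2]  # lateral drift growing along the track
ate = ate_rmse(gt, est)
score = weighted_score([0.1, 0.2], [100.0, 300.0])
```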
Submit the results.
Prepare the trajectory
For each of the 12 trajectories of the LiDAR-inertial track, you need to compute the poses in the IMU coordinate frame and save them in an estimated-trajectory text file named sequence_name.txt. Put all 12 files into a zip file with the following structure:
lidar_inertial_track.zip
├── SubT_MRS_Final_Challenge_UGV1.txt      # result file for the trajectory Final_Challenge_UGV1
├── SubT_MRS_Final_Challenge_UGV2.txt      # result file for the trajectory Final_Challenge_UGV2
├── SubT_MRS_Final_Challenge_UGV3.txt      # result file for the trajectory Final_Challenge_UGV3
├── SubT_MRS_Urban_Challenge_UGV1.txt      # result file for the trajectory Urban_Challenge_UGV1
├── SubT_MRS_Urban_Challenge_UGV2.txt      # result file for the trajectory Urban_Challenge_UGV2
├── SubT_MRS_Laurel_Caverns_Handheld3.txt  # result file for the trajectory Laurel_Cavern
├── TartanAir_lidar_factory.txt            # result file for the trajectory lidar_factory
├── TartanAir_lidar_ocean.txt              # result file for the trajectory lidar_ocean
├── TartanAir_lidar_sewerage.txt           # result file for the trajectory lidar_sewerage
│   (Below are Bonuses)
├── SubT_MRS_Hawkins_Long_Corridor_RC.txt  # result file for the trajectory Long_Corridor
├── SubT_MRS_Hawkins_Multi_Floor_LegRobot.txt  # result file for the trajectory Multi_Floor
└── SubT_MRS_MILL19_Block_LiDAR.txt        # result file for the trajectory Block_LiDAR
The estimated_trajectory.txt file should have the following format:
# timestamp_s tx ty tz qx qy qz qw
1.403636580013555527e+09 0.0 0.0 0.0 0.0 0.0 0.0 1.0
Here are some requirements for your estimated_trajectory.txt:
- Each line in the text file contains a single pose.
- The format of each line is "timestamp_s tx ty tz qx qy qz qw".
- tx ty tz (3 floats) give the position of the IMU sensor relative to the world origin, expressed in the world frame.
- qx qy qz qw (4 floats) give the orientation of the IMU as a unit quaternion with respect to the world frame.
- The trajectory can have an arbitrary initial position and orientation. However, we use the IMU frame to define the motion: the x-axis points forward, the y-axis points left, and the z-axis points up.
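The requirements above can be sketched as a small writer plus a sanity check. The helper name and the sample poses below are ours; only the line format comes from the rules. Identity orientation is the unit quaternion (0, 0, 0, 1).

```python
# Sketch: writing and re-reading a trajectory file in the required
# "timestamp_s tx ty tz qx qy qz qw" format (TUM style). Placeholder poses.
import io
import numpy as np

def write_trajectory(f, stamps, positions, quats_xyzw):
    """Write one pose per line in the challenge's text format (our helper)."""
    f.write("# timestamp_s tx ty tz qx qy qz qw\n")
    for t, p, q in zip(stamps, positions, quats_xyzw):
        f.write("%.9f %f %f %f %f %f %f %f\n" % (t, *p, *q))

buf = io.StringIO()
write_trajectory(
    buf,
    stamps=[1.4036e9, 1.4036e9 + 0.1],
    positions=[(0.0, 0.0, 0.0), (0.1, 0.0, 0.0)],
    quats_xyzw=[(0.0, 0.0, 0.0, 1.0)] * 2,
)

# Re-read and check: 8 columns, and every quaternion is (close to) unit norm.
rows = np.loadtxt(io.StringIO(buf.getvalue()))
assert rows.shape == (2, 8)
assert np.allclose(np.linalg.norm(rows[:, 4:8], axis=1), 1.0)
```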
Submit in Gradescope
To submit the estimated trajectory to the submission system, follow the steps listed below:
- Register an account on Gradescope and log into the website.
- Click the bottom-right "Add Course" button and enter the course-entry code K3EGGJ. You will then find the iccv-lii course on your Gradescope homepage.
- Click the iccv-lii course and you will see the assignment named Trajectory-result-submission in the dashboard.
- Click the assignment and upload your lidar_inertial_track.zip file. Remember to enter your group name as the leaderboard name, then click the upload button. Compress the estimated result files directly into a zip file; do not zip the folder containing the result files.
- After around 1 minute, you will see the APE and RPE results of your trajectory on the leaderboard.
Note:
- You must submit all 12 trajectories for the LiDAR-inertial track.
- The trajectory should be complete: the duration of the estimated trajectory should be roughly the same as that of the ground truth trajectory.
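A common pitfall noted above is zipping the folder instead of the files. A minimal sketch of building a correctly flat lidar_inertial_track.zip with Python's zipfile module (stand-in result files with hypothetical content):

```python
# Sketch: package result files at the TOP LEVEL of the zip (no wrapping folder).
import os
import tempfile
import zipfile

workdir = tempfile.mkdtemp()
names = ["SubT_MRS_Final_Challenge_UGV1.txt", "TartanAir_lidar_factory.txt"]
for name in names:  # stand-in result files, not real trajectories
    with open(os.path.join(workdir, name), "w") as f:
        f.write("# timestamp_s tx ty tz qx qy qz qw\n")

zip_path = os.path.join(workdir, "lidar_inertial_track.zip")
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
    for name in names:
        # arcname=name keeps entries flat instead of nesting the work directory.
        zf.write(os.path.join(workdir, name), arcname=name)

with zipfile.ZipFile(zip_path) as zf:
    entries = zf.namelist()
assert all("/" not in e for e in entries)  # no folder prefix in any entry
```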
Submit Report
Participants are requested to submit a report describing their methods along with the Gradescope submission. A template is provided here: ICCV_Template_Report. Please include your report PDF in the lidar_inertial_track.zip file.
Challenge Rules
- Participants are welcome to form teams. A participant cannot be in multiple teams and a team must make submissions under a single account.
- Every day a team can submit at most once on Gradescope, and the submission must be within a certain time window: 12:00 P.M. - 11:59 P.M. UTC.
- The size of every trajectory file submitted should be no more than 2 MB.
- Every team must submit a report along with the gradescope submissions.
- Organizers reserve the right to make changes to the rules and timeline.
- Violation of the rules or other unfair activities may result in disqualification.
Lidar-inertial Leaderboard
As of 11:59 PM EDT, October 15th, 2023.
ATE Rank | Team | Mean(1/ATE) |
---|---|---|
1 | HKU-MaRS-Lab | 3.1636 |
2 | Weitong Wu, et al. | 3.0571 |
3 | CILAB@Yonsei | 2.6442 |
4 | Yibin Wu, et al. | 1.3184 |
5 | Shipeng Zhong, et al. | 0.6218 |
6 | Zhiqiang LI | 0.3229 |
RPE Rank | Team | Mean(1/RPE) |
---|---|---|
1 | Weitong Wu, et al. | 16.4595 |
2 | HKU-MaRS-Lab | 14.2443 |
3 | CILAB@Yonsei | 13.3206 |
4 | Yibin Wu, et al. | 8.9971 |
5 | Zhiqiang LI | 8.1355 |
6 | Shipeng Zhong, et al. | 5.2747 |
Note: The ATE (RPE) rank is based on the mean of the reciprocal of each trajectory's ATE (RPE) score, weighted by the trajectory length.
Summary
We have summarized the SLAM performance of the top teams above. Feel free to check our Paper and cite our work.
Contact us
If you have any questions or see anything wrong, please do not hesitate to post issues on this GitHub website. We would love to hear your feedback! Every post will receive a response within 36 hours.
Disclaimer
Due to the rapidly changing nature of SLAM research, the competition results and SLAM summary above were last updated on October 15th, 2023. In addition, due to the rich body of literature, there may be inaccuracies or mistakes in the paper. We welcome readers to send pull requests to our GitHub repository (https://github.com/water-horse/ICCV2023_SLAM_Challenge.git) so we can continue to update references, correct mistakes and inaccuracies, and update the entries of the studies in the paper.
License
The SubT-MRS Dataset is licensed under the CC BY 4.0 License and was collected by the AirLab.