
Dataset Download

Update!

All files have been relinked following the university's recent domain merge (announcement: https://global.vcu.edu/newsroom/2020/email/).

Please use the link below:
https://drive.google.com/drive/folders/1yO10ty1IbC9aInYz7wdauiFoKlnjZCsc?usp=sharing

Handheld

According to the recording conditions and environments, the data sequences can be divided into the following categories:
lab: sequences recorded entirely inside the laboratory, where the ground truth covers the full trajectory.
lab-easy: the SC (Structure Core sensor) is moved at normal speed, with constant illumination and static objects. simple1, simple2, simple3
lab-motion: the SC is moved at high speed, with constant illumination and static objects. The mean and maximum rotation speeds are about 35 and 120 degrees per second, respectively (see the sketch after this list for how to reproduce these figures). motion1, motion2, motion3, motion4, motion5, motion6
lab-light: the SC is moved at normal speed, with drastically changing illumination and static objects. In some sequences, the illumination changes from bright light to complete darkness for about 10 seconds. light1, light2, light3, light4, light5, light6
lab-dynamic: the SC is moved at normal speed, with constant illumination and dynamic objects. The dynamic objects include one or two persons, a chair, a wheeled robot, or a rollator. dynamic1, dynamic2, dynamic3, dynamic4, dynamic5
corridor: sequences with camera motion along corridors. corridor1, corridor2, corridor3, corridor4
hall: sequences with camera motion around one or two halls; the camera trajectory also covers corridors and stairways. hall1, hall2, hall3
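The rotation-speed figures quoted for the lab-motion sequences can be reproduced directly from the recorded IMU messages. Below is a minimal sketch using the rosbag Python API; the bag filename and the IMU topic name /imu0 are assumptions, since the actual topic names are not listed in this section.

import math
import rosbag

# Collect angular-rate magnitudes (in deg/s) from a lab-motion sequence.
# '/imu0' is an assumed topic name; inspect the bag's topic list first.
rates = []
with rosbag.Bag('motion1.bag') as bag:
    for _, msg, _ in bag.read_messages(topics=['/imu0']):
        w = msg.angular_velocity  # sensor_msgs/Imu, rad/s
        rates.append(math.degrees(math.sqrt(w.x**2 + w.y**2 + w.z**2)))

print('mean rotation speed: %.1f deg/s' % (sum(rates) / len(rates)))
print('max rotation speed:  %.1f deg/s' % max(rates))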

Wheeled Robot

The data sequences recorded with the mobile robot are captured in the laboratory and along the corridors. The wheeled robot is controlled by a Bluetooth-connected gamepad and moves linearly from point to point. The acceleration readings of an IMU mounted on a wheeled robot show little variance when the robot moves linearly [1]. This leads to insufficient IMU excitation, which fails to provide the conditions required for initialization [2], [1]. At the beginning of each sequence, we intentionally deal with this issue in one of two ways (a simple excitation check is sketched after this list):
manual: hand-hold and rotate the SC for about five seconds to excite the IMU measurements for a good system initialization; then place the SC on the robot and drive the robot around. lab1, lab2, lab3, corridor1, corridor2
bumper: drive the robot over the bumpers to generate 6-DoF movements, producing more variance in the IMU measurements for initialization purposes; then drive it around the whole area. lab1, lab2, lab3, lab4, lab5, corridor1, corridor2
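Both tricks address the same requirement: visual-inertial initialization needs the accelerometer signal to vary. Below is a minimal sketch of such an excitation check, in the spirit of the variance test used by VINS-Mono's initializer [2]; the 0.25 m/s^2 threshold is an illustrative assumption, not a value taken from the dataset or from VINS-Mono.

import numpy as np

def sufficiently_excited(accel, threshold=0.25):
    # accel: (N, 3) array of accelerometer samples in m/s^2.
    # A robot translating at constant velocity measures little besides
    # gravity, so the variance stays near zero and the check fails until
    # the sensor is rotated by hand or driven over the bumpers.
    magnitudes = np.linalg.norm(accel, axis=1)
    return magnitudes.std() > threshold  # threshold is an assumption

# Constant readings fail the check; shaken readings pass it.
still = np.tile([0.0, 0.0, 9.81], (200, 1))
shaken = still + np.random.default_rng(0).normal(0.0, 0.5, (200, 3))
print(sufficiently_excited(still), sufficiently_excited(shaken))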

Download

Please cite the reference below if you use the benchmark dataset for evaluation.

  • H. Zhang, L. Jin, C. Ye, “The VCU-RVI Benchmark: Evaluating Visual Inertial Odometry for Indoor Navigation Applications with an RGB-D Camera,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Las Vegas, USA, 2020.
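For convenience, here is the same reference as a BibTeX entry; the entry key is arbitrary, and all fields are taken from the citation above.

@inproceedings{zhang2020vcurvi,
  author    = {H. Zhang and L. Jin and C. Ye},
  title     = {The {VCU-RVI} Benchmark: Evaluating Visual Inertial Odometry for Indoor Navigation Applications with an {RGB-D} Camera},
  booktitle = {Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems},
  address   = {Las Vegas, USA},
  year      = {2020}
}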
| Dataset | ROS bag | Ground truth | Comment |
|---|---|---|---|
| lab-easy-01 | simple1 | simple1 | Handheld, lab, normal speed, static objects |
| lab-easy-02 | simple2 | simple2 | Handheld, lab, normal speed, static objects |
| lab-easy-03 | simple3 | simple3 | Handheld, lab, normal speed, static objects |
| lab-motion-01 | motion1 | motion1 | Handheld, lab, high speed, static objects |
| lab-motion-02 | motion2 | motion2 | Handheld, lab, high speed, static objects |
| lab-motion-03 | motion3 | motion3 | Handheld, lab, high speed, static objects |
| lab-motion-04 | motion4 | motion4 | Handheld, lab, high speed, static objects |
| lab-motion-05 | motion5 | motion5 | Handheld, lab, high speed, static objects |
| lab-motion-06 | motion6 | motion6 | Handheld, lab, high speed, static objects |
| lab-light-01 | light1 | light1 | Handheld, lab, normal speed, static objects, varying illumination |
| lab-light-02 | light2 | light2 | Handheld, lab, normal speed, static objects, varying illumination |
| lab-light-03 | light3 | light3 | Handheld, lab, normal speed, static objects, varying illumination |
| lab-light-04 | light4 | light4 | Handheld, lab, normal speed, static objects, varying illumination |
| lab-light-05 | light5 | light5 | Handheld, lab, normal speed, static objects, varying illumination |
| lab-light-06 | light6 | light6 | Handheld, lab, normal speed, static objects, varying illumination |
| lab-dynamic-01 | dynamic1 | dynamic1 | Handheld, lab, normal speed, dynamic objects |
| lab-dynamic-02 | dynamic2 | dynamic2 | Handheld, lab, normal speed, dynamic objects |
| lab-dynamic-03 | dynamic3 | dynamic3 | Handheld, lab, normal speed, dynamic objects |
| lab-dynamic-04 | dynamic4 | dynamic4 | Handheld, lab, normal speed, dynamic objects |
| lab-dynamic-05 | dynamic5 | dynamic5 | Handheld, lab, normal speed, dynamic objects |
| handheld-corridor-01 | corridor1 | corridor1 | Handheld, corridor, normal speed, static objects |
| handheld-corridor-02 | corridor2 | corridor2 | Handheld, corridor, normal speed, static objects |
| handheld-corridor-03 | corridor3 | corridor3 | Handheld, corridor, normal speed, static objects |
| handheld-corridor-04 | corridor4 | corridor4 | Handheld, corridor, normal speed, static objects |
| handheld-hall-01 | hall1 | hall1 | Handheld, hall, normal speed, static objects |
| handheld-hall-02 | hall2 | hall2 | Handheld, hall, normal speed, static objects |
| handheld-hall-03 | hall3 | hall3 | Handheld, hall, normal speed, static objects |
| robot-lab-01 | lab1 | lab1 | Robot, lab, normal speed, static objects |
| robot-lab-02 | lab2 | lab2 | Robot, lab, normal speed, static objects |
| robot-lab-03 | lab3 | lab3 | Robot, lab, normal speed, static objects |
| robot-corridor-01 | corridor1 | corridor1 | Robot, corridor, normal speed, static objects |
| robot-corridor-02 | corridor2 | corridor2 | Robot, corridor, normal speed, static objects |
| bumper-lab-01 | lab1 | lab1 | Robot, lab, w/ bumper, static objects |
| bumper-lab-02 | lab2 | lab2 | Robot, lab, w/ bumper, static objects |
| bumper-lab-03 | lab3 | lab3 | Robot, lab, w/ bumper, static objects |
| bumper-lab-04 | lab4 | lab4 | Robot, lab, w/ bumper, static objects |
| bumper-lab-05 | lab5 | lab5 | Robot, lab, w/ bumper, static objects |
| bumper-corridor-01 | corridor1 | corridor1 | Robot, corridor, w/ bumper, static objects |
| bumper-corridor-02 | corridor2 | corridor2 | Robot, corridor, w/ bumper, static objects |
| calibration-01 | camera-imu | — | Dataset for extrinsic calibration |
| calibration-02 | hand-eye | — | Dataset for hand-eye calibration |
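Each sequence above is distributed as a ROS bag. Before feeding a bag to a VIO pipeline, it is worth listing its topics and message counts; the sketch below uses the standard rosbag Python API, with 'simple1.bag' as a placeholder filename.

import rosbag

with rosbag.Bag('simple1.bag') as bag:
    info = bag.get_type_and_topic_info()
    for topic, meta in info.topics.items():
        print('%-30s %-30s %6d msgs' % (topic, meta.msg_type, meta.message_count))
    print('duration: %.1f s' % (bag.get_end_time() - bag.get_start_time()))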

Calibration

The extrinsic transformation matrix between the coordinate systems of the color camera and the IMU can be obtained from the Structure Core SDK; it is shown below.

body_T_cam0: !!opencv-matrix  # Timu2c
  rows: 4
  cols: 4
  dt: d
  data: [ 0.00193013, -0.999997,    0.00115338, -0.00817048,
         -0.999996,   -0.0019327,  -0.00223606,  0.015075,
          0.00223829, -0.00114906, -0.999997,   -0.0110795,
          0,           0,           0,            1 ]
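The matrix can be read back with OpenCV's FileStorage, as sketched below. The sketch assumes the block above is saved to a file named extrinsics.yaml with the %YAML:1.0 directive as its first line (which OpenCV's parser requires), and it follows the common convention that body_T_cam0 maps camera-frame coordinates into the IMU (body) frame.

import cv2
import numpy as np

# 'extrinsics.yaml' is a placeholder filename for the YAML block above.
fs = cv2.FileStorage('extrinsics.yaml', cv2.FILE_STORAGE_READ)
body_T_cam0 = fs.getNode('body_T_cam0').mat()  # 4x4 homogeneous transform
fs.release()

# Map a homogeneous point from the color-camera frame to the body frame
# (assuming the usual body_T_cam0 naming convention).
p_cam = np.array([0.1, 0.2, 1.0, 1.0])
p_body = body_T_cam0 @ p_cam
print(p_body[:3])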

In addition, we recorded a data sequence by moving the SC around an Aprilgrid target for about 3 minutes, and we employed the Kalibr toolbox [3] to estimate the extrinsic transformation matrix. However, its output was worse than that provided by the SC SDK; therefore, we used the SDK-provided transformation matrix in our evaluation.

To allow users to test their calibration algorithms, the calibration data sequences are made available as well. Three data sequences are provided in the dataset:

camera: sequence with slow motion viewing a grid of AprilTags, for camera-intrinsics calibration.

camera-imu: sequence with fast motion viewing a grid of AprilTags, for camera-IMU extrinsic calibration. camera-imu

hand-eye: sequence with slow motion viewing a checkerboard, for camera-marker extrinsic calibration (a hand-eye calibration sketch follows this list). hand-eye
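For the hand-eye sequence, OpenCV's calibrateHandEye can recover the camera-marker extrinsic from paired marker poses (from the motion-capture ground truth) and checkerboard poses (e.g., from solvePnP). The sketch below drives it with self-consistent synthetic poses so that it runs as-is; with the real sequence, the two pose lists would come from the ground-truth file and the checkerboard detections instead.

import cv2
import numpy as np

def hom(R, t):
    # Build a 4x4 homogeneous transform from R (3x3) and t (3,).
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

rng = np.random.default_rng(0)

# Unknown camera-to-marker transform that we try to recover.
R_x, _ = cv2.Rodrigues(np.array([0.1, -0.2, 0.3]))
X = hom(R_x, np.array([0.05, 0.02, -0.01]))

# Fixed checkerboard pose in the mocap ("base") frame.
R_bt, _ = cv2.Rodrigues(np.array([0.4, 0.1, -0.3]))
T_base_board = hom(R_bt, np.array([1.0, 0.5, 0.2]))

R_m2b, t_m2b, R_b2c, t_b2c = [], [], [], []
for _ in range(10):
    # Random marker pose in the mocap frame (stand-in for ground truth).
    R_i, _ = cv2.Rodrigues(rng.uniform(-1.0, 1.0, 3))
    T_base_marker = hom(R_i, rng.uniform(-0.5, 0.5, 3))
    # Board pose in the camera frame consistent with X (stand-in for solvePnP).
    T_cam_board = np.linalg.inv(X) @ np.linalg.inv(T_base_marker) @ T_base_board
    R_m2b.append(T_base_marker[:3, :3]); t_m2b.append(T_base_marker[:3, 3:])
    R_b2c.append(T_cam_board[:3, :3]);   t_b2c.append(T_cam_board[:3, 3:])

R_est, t_est = cv2.calibrateHandEye(R_m2b, t_m2b, R_b2c, t_b2c,
                                    method=cv2.CALIB_HAND_EYE_TSAI)
print(np.allclose(R_est, X[:3, :3], atol=1e-5), t_est.ravel())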

References:

  • [1] Z. Shan, R. Li, and S. Schwertfeger, "RGBD-inertial trajectory estimation and mapping for ground robots," Sensors, 19(10):2251, 2019.
  • [2] T. Qin, P. Li, and S. Shen, "VINS-Mono: A robust and versatile monocular visual-inertial state estimator," IEEE Transactions on Robotics, 34(4):1004–1020, 2018.
  • [3] P. Furgale, J. Rehder, and R. Siegwart, "Unified temporal and spatial calibration for multi-sensor systems," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Tokyo, Japan, 2013.