About Sensor Data

There are two different kinds of robots involved in a TRADR mission: Unmanned Ground Vehicles (UGVs) and Unmanned Aerial Vehicles (UAVs). The UAV "AscTec Neo" is provided by our project partner Ascending Technologies GmbH. The UGV was produced by BlueBotics, a partner in the predecessor project NIFTi. Additionally, we sometimes use consumer products like the DJI Phantom.

The TRADR robots can be flexibly equipped with different types of sensors, such as an RGB camera, laser scanner, stereo camera, or thermal camera. Perception takes place at all levels of the TRADR system: starting from sensor data at the robot level, it leads to local data (an egocentric map) of a single robot, then to global data (an allocentric map) created by merging the local data within one sortie, and finally to a world model (environmental 3D details, traversability, points of interest, no-go areas, positions of robots, etc.) at the mission level, built by collecting the global data of all sorties.

Our datasets are aimed at scientists using ROS-based algorithms. Except for data recorded by systems that do not run ROS, most datasets are available as ROS bag files. The bag files contain message topics such as the following examples:

  • /camera/camera_info (sensor_msgs/CameraInfo) contains intrinsic camera parameters.

  • /camera/rgbd/camera_info (sensor_msgs/CameraInfo) contains the intrinsic camera parameters for the RGB camera.

  • /camera/rgbd/image_raw (sensor_msgs/Image) contains the color image from the RGB-D camera.

  • /imu (sensor_msgs/Imu) contains orientation, angular velocity, and linear acceleration data from the IMU.

Definitions of the non-standard messages can be downloaded here.

Detailed ROS documentation on how sensor messages can be used and manipulated is found at wiki.ros.org.
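As a minimal sketch, the topics listed above can be read with the standard rosbag Python API (requires a ROS installation); the bag filename below is a placeholder, not an actual TRADR file:

```python
# Sketch: iterate over IMU messages in a bag file.
# "tradr_sortie.bag" is a placeholder filename.
import rosbag

with rosbag.Bag('tradr_sortie.bag') as bag:
    for topic, msg, t in bag.read_messages(topics=['/imu']):
        # msg is a sensor_msgs/Imu; print timestamp and linear acceleration
        print(t.to_sec(), msg.linear_acceleration.x,
              msg.linear_acceleration.y, msg.linear_acceleration.z)
```

The same pattern works for the camera topics; passing a list of topics to read_messages returns the messages of all listed topics in time order.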

Image data from platforms not based on ROS are available as MP4 and AVI files.

In the following, intrinsic calibration information for all cameras is given.

camera fx fy cx cy d0 d1 d2 d3 d4
camera UGV “Delta”
camera UGV “Tango”
another camera
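To illustrate how the intrinsic parameters above are used, the following sketch projects a 3D point in the camera frame to pixel coordinates with the plumb-bob distortion model assumed by sensor_msgs/CameraInfo, where d0..d4 correspond to k1, k2, p1, p2, k3. All numeric values below are placeholders, not actual TRADR calibration data:

```python
# Sketch: pinhole projection with plumb-bob distortion,
# using intrinsics fx, fy, cx, cy and d = [k1, k2, p1, p2, k3].

def project_point(X, Y, Z, fx, fy, cx, cy, d):
    """Project a 3D point (camera frame, Z forward) to pixel coordinates."""
    k1, k2, p1, p2, k3 = d
    # Normalized image coordinates
    x, y = X / Z, Y / Z
    r2 = x * x + y * y
    # Radial and tangential distortion
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    # Apply focal lengths and principal point
    return fx * x_d + cx, fy * y_d + cy

# Placeholder intrinsics with zero distortion
u, v = project_point(0.5, -0.2, 2.0, fx=600.0, fy=600.0,
                     cx=320.0, cy=240.0, d=[0.0, 0.0, 0.0, 0.0, 0.0])
```

With zero distortion this reduces to the plain pinhole model u = fx·X/Z + cx, v = fy·Y/Z + cy.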

Additionally, we provide the extrinsic parameters of our sensors as follows:

Ladybug (with respect to cam 0)

camera par1 par2 par3 par4 par5 par6
Ladybug UGV “Delta”
Ladybug UGV2 “Charlie”
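An extrinsic calibration is a rigid transform (rotation R, translation t) between two sensor frames; as a minimal sketch, the following applies such a transform to express a point from one Ladybug camera in the cam 0 frame. The R and t values are illustrative placeholders, not actual TRADR calibration:

```python
# Sketch: apply a rigid transform p' = R * p + t,
# with R a row-major 3x3 rotation and t, p 3-vectors.

def transform_point(R, t, p):
    """Transform a point p from one sensor frame into another."""
    return [sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]

# Placeholder extrinsics: identity rotation, 10 cm offset along x
R = [[1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]
t = [0.1, 0.0, 0.0]
p_cam0 = transform_point(R, t, [1.0, 2.0, 3.0])
```

Chaining such transforms lets one move sensor data through the robot's frame tree, e.g. from a camera frame into the robot base frame.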