The third TRADR General Assembly will be held at TU Delft on May 3-4, 2017.
Ivana Kruijff-Korbayová and Hartmut Surmann (re)presented TRADR at the European Robotics Forum in Edinburgh, March 22-24, 2017. In the session “Success Stories: Robotics for Disaster Response”, Ivana spoke about TRADR’s objectives and recent results, including the deployment of TRADR robots in response to the earthquake in Amatrice. She was also a panelist, alongside speakers from several other EU projects on disaster response and search & rescue, in a discussion that addressed S&R competitions, the involvement of companies, benchmarking, and system integration.
In the AI & Cognitive Robotics session “AI for long-term autonomy in robot applications”, Ivana delivered a position statement based on TRADR experience and took part in the group discussions. Like the speakers representing other application domains, she underlined the need for learning from experience, e.g., to avoid repeating a mistake; for coping with complex, unknown, unpredictable and dynamic environments; and for communication between robots and humans about mission progress, including changes in the environment.
The TRADR Year 3 review took place at the Scuola di Formazione Operativa of the Vigili del Fuoco in Montelibretti, Italy, on Tuesday March 7 and Wednesday March 8.
The system demonstration was successful, and the overall progress of the project was evaluated as “excellent” or “very good”.
More detailed information will follow soon.
The objective of the TRADR project is to enable a team of humans and robots to collaborate in a disaster response scenario that can last several days. To achieve this, one of the robots’ core capabilities is creating a 3D map of their environment and localizing themselves within this map.
The TRADR consortium has recently open-sourced three libraries that enable robots equipped with 3D laser scanners to perform these tasks. The curves library represents the robot’s continuous-time trajectory in 3D space, while the laser_slam library implements the back-end estimation functionality of the localization and mapping system. Finally, the SegMatch library enables the robots to recognize previously visited places and to transmit this information to the back-end in order to close loops and to register the trajectories of different robots.
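To give a feel for what the back-end does with a recognized place, the toy sketch below optimizes a tiny pose graph: odometry constraints accumulate drift along a trajectory, and a single loop-closure constraint (as a place-recognition module would provide) spreads that drift out. This is a conceptual illustration with hypothetical numbers, restricted to 1D positions so the problem stays linear; it is not the actual laser_slam API.

```python
import numpy as np

# Odometry measurements (relative displacements pose_{i+1} - pose_i).
# The true step is 1.0, but each measurement drifts by +0.1.
odom = np.array([1.1, 1.1, 1.1, 1.1])
# Loop closure from place recognition: pose 4 is 4.0 away from pose 0.
loop = (0, 4, 4.0)

n = len(odom) + 1  # number of poses

# Build the linear least-squares system A x = b.
rows, b = [], []
rows.append(np.eye(n)[0]); b.append(0.0)        # prior: anchor x0 = 0
for i, d in enumerate(odom):                    # odometry constraints
    r = np.zeros(n); r[i + 1] = 1.0; r[i] = -1.0
    rows.append(r); b.append(d)
i, j, d = loop                                  # loop-closure constraint
r = np.zeros(n); r[j] = 1.0; r[i] = -1.0
rows.append(r); b.append(d)

A = np.vstack(rows)
x, *_ = np.linalg.lstsq(A, np.array(b), rcond=None)
# Optimized poses: approximately [0, 1.02, 2.04, 3.06, 4.08];
# the 0.4 of accumulated drift is distributed over all constraints.
print(np.round(x, 2))
```

Without the loop closure, the estimate would simply integrate the odometry to 4.4; the extra constraint is what pulls the trajectory back toward consistency, which is exactly the role place recognition plays for the mapping back-end.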
Links to libraries:
The following figure illustrates a map generated by fusing 3D laser scanner measurements collected by two unmanned ground vehicles during the TRADR Evaluation exercise at the Gustav Knepper Power Station in Dortmund, Germany. The map is coloured by height, and the robot trajectories are shown as blue and red lines.
For more information about the place recognition algorithm, please consult our paper (https://arxiv.org/pdf/1609.07720v1.pdf) and have a look at our video (https://www.youtube.com/watch?v=iddCgYbgpjE). Easy-to-run demonstrations can be found on the wiki page of the SegMatch repository. More to come!
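The core idea behind segment-based place recognition can be sketched in a few lines: cut the point cloud into segments, summarize each segment with a compact descriptor, and match descriptors between the current scan and the map. The toy example below uses sorted covariance eigenvalues as a rotation- and translation-invariant shape descriptor; the segment shapes and the descriptor choice are illustrative assumptions, not the actual SegMatch implementation.

```python
import numpy as np

def descriptor(points):
    # Sorted eigenvalues of the segment's covariance: a compact,
    # rotation- and translation-invariant shape signature.
    cov = np.cov(points.T)
    return np.sort(np.linalg.eigvalsh(cov))[::-1]

rng = np.random.default_rng(0)
# Hypothetical target-map segments: a flat patch, an elongated
# structure, and a compact blob (axis scales set their shapes).
flat = rng.normal(size=(200, 3)) * [5.0, 5.0, 0.1]
elongated = rng.normal(size=(200, 3)) * [8.0, 0.5, 0.5]
blob = rng.normal(size=(200, 3)) * [1.0, 1.0, 1.0]
target_segments = [flat, elongated, blob]

# Query segment: the elongated structure re-observed from a
# different viewpoint (rotated about z and translated).
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
query = elongated @ R.T + np.array([10.0, -3.0, 2.0])

# Match the query to the target segment with the nearest descriptor.
d_query = descriptor(query)
dists = [np.linalg.norm(descriptor(s) - d_query) for s in target_segments]
best = int(np.argmin(dists))
print("matched target segment:", best)  # 1, the elongated structure
```

Because the descriptor is invariant to rigid motion, the re-observed segment matches its map counterpart despite the viewpoint change; a real system then verifies a set of such matches geometrically before declaring a loop closure.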