SocRob

Official webpage for SocRob, a R&D team project at ISR-IST devoted to research and international robotic competitions (e.g. Robocup).





SocRob Rescue

This team is a joint effort built around the heterogeneous robots used at ISR-IST in a multitude of projects and research lines. Its aim is not only to participate in competitions with those same robots, following the guidelines of RoboCup RESCUE, but also to develop research that contributes to the Urban Search and Rescue field.

It follows in the footsteps of the RESCUE and RAPOSA projects, in which a tracked robot that served as the prototype for RAPOSA-NG was fully developed. In line with this focus, the team tackles many issues in Human–Computer Interaction, as described in the project AuReRo (funded by FCT, 2010-2013):

“Field robotics is the use of sturdy robots in unstructured environments. One important example of such a scenario is in Search And Rescue (SAR) operations to seek out victims of catastrophic events in urban environments. While advances in this domain have the potential to save human lives, many challenging problems still hinder the deployment of SAR robots in real situations. This project tackles one such crucial issue: effective real time mapping. To address this problem, we adopt a multidisciplinary approach by drawing on both Robotics and Human Computer Interaction (HCI) techniques and methodologies.”

Robot

RAPOSA-NG

Following the success of RAPOSA, IdMind developed a commercial version, improving it in various ways. Notably, the rigid chassis of RAPOSA, which eventually became plastically deformed by frequent shocks, was replaced by a semi-flexible structure capable of absorbing inelastic shocks, while also being significantly lighter than the original RAPOSA.

RAPOSA-NG during RoboCup 2013

ISR acquired a barebones version of this robot, called RAPOSA-NG, and equipped it with a different set of sensors, following lessons learnt from previous research with RAPOSA. In particular, it is equipped with:

  • Stereo camera unit (PointGrey Bumblebee2) on a pan-and-tilt motorized mounting;
  • Laser-Range Finder (LRF) sensor on a tilt-and-roll motorized mounting;
  • Inertial Measurement Unit (IMU).

This equipment was chosen not only to better fit our research interests, but also to target the RoboCup Robot Rescue competition.

The stereo camera is primarily used together with a Head-Mounted Display (HMD) worn by the operator: the stereo images are displayed on the HMD, providing depth perception to the operator, while the attitude of the stereo camera is controlled by the head tracker built into the HMD.

The LRF is used in one of two modes: 2D and 3D mapping. In 2D mapping we assume that the environment is made of vertical walls. However, since we cannot assume horizontal ground, the tilt-and-roll motorized mounting automatically compensates for the robot attitude: an internal IMU measures the attitude of the robot body and drives the mounting servos so that the LRF scanning plane remains horizontal.

The IP camera is used for detailed inspection: its GUI allows the operator to orient the camera towards a target area and zoom into a small region of the environment. This is particularly relevant for remote inspection tasks in USAR.

The IMU is also used to provide the remote operator with readings of the robot attitude, and for automatic localization and mapping.
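The attitude-compensation idea described above can be sketched in a few lines. This is a minimal illustration, not the actual RAPOSA-NG controller: it assumes the mount applies a roll rotation followed by a tilt rotation about body-fixed axes, in which case simply negating the IMU roll and pitch levels the scanning plane exactly (yaw does not tilt the plane). All function names are hypothetical.

```python
import numpy as np

def rot_x(a):
    """Rotation matrix about the x-axis (roll)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    """Rotation matrix about the y-axis (tilt/pitch)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def leveling_commands(roll, pitch):
    """Servo commands (mount_roll, mount_tilt) that cancel the body
    attitude measured by the IMU, under the roll-then-tilt mount
    kinematics assumed here."""
    return -roll, -pitch

# Check: body tilted 20 deg in roll and -10 deg in pitch; after
# compensation the scan-plane normal should point straight up.
roll, pitch = np.radians(20.0), np.radians(-10.0)
m_roll, m_tilt = leveling_commands(roll, pitch)
R_body = rot_y(pitch) @ rot_x(roll)       # world <- body (yaw ignored)
R_mount = rot_x(m_roll) @ rot_y(m_tilt)   # body <- sensor mount
normal = R_body @ R_mount @ np.array([0.0, 0.0, 1.0])
print("scan-plane normal in world frame:", np.round(normal, 6))
```

Because the mount rotations here are the exact inverses of the body tilt rotations, the product `R_body @ R_mount` is the identity and the scanning plane stays horizontal for any roll/pitch; a real gimbal with different axis ordering or offsets would need the corresponding inverse kinematics instead.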

Further info can be found in the book chapter Two Faces of Human–Robot Interaction: Field and Service Robots (Rodrigo Ventura), in New Trends in Medical and Service Robots, Mechanisms and Machine Science, Volume 20, pp. 177-192, Springer, 2014.
