This paper addresses the problem of estimating the poses of the tactile elements (taxels) that compose a robotic skin covering the whole body of a robot. The problem arises when a robot skin technology has to be integrated into an existing robotic platform: to date, the integration process is carried out by hand, and it is not possible to predict where each sensor will be placed on the robot body. This paper presents a novel approach based on an RGB-D camera that exploits the motion capabilities of the robot to activate the skin sensors. The method uses the camera measurements to reconstruct the unknown outer shape of the robot body and to compute how that surface can be touched by the robot itself. The taxel responses and the corresponding contact centroids are then used to estimate the positions of the sensors. Our method relies on few assumptions and is a step towards a calibration procedure that a robot can execute autonomously. Experiments performed on the Baxter robotic platform demonstrate the effectiveness of the presented approach, which achieves an average position error of less than 2 mm.
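The abstract does not detail the estimator itself; a minimal sketch of one plausible formulation, assuming each taxel's position is estimated as the response-weighted mean of the 3D contact centroids that activated it (function name, inputs, and weighting scheme are illustrative assumptions, not the authors' method):

```python
import numpy as np

def estimate_taxel_position(contact_centroids, responses):
    """Estimate a taxel's 3D position from self-touch observations.

    contact_centroids: (N, 3) array of contact centroid positions
                       (e.g. in the camera/reconstruction frame)
    responses: (N,) array of the taxel's activation at each contact

    Returns the response-weighted mean of the contact centroids,
    so contacts that excited the taxel more strongly count more.
    """
    contact_centroids = np.asarray(contact_centroids, dtype=float)
    responses = np.asarray(responses, dtype=float)
    weights = responses / responses.sum()   # normalize activations
    return weights @ contact_centroids      # weighted average position

# Hypothetical example: three touches near one taxel
pts = [[0.10, 0.20, 0.30],
       [0.11, 0.20, 0.29],
       [0.09, 0.21, 0.31]]
resp = [1.0, 0.5, 0.5]
pos = estimate_taxel_position(pts, resp)  # → array([0.1, 0.2025, 0.3])
```

In practice the paper's pipeline would also require the contact centroids to be expressed in a common frame via the reconstructed body shape and the robot's kinematics; this sketch only illustrates the final averaging step.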
|Title:||Towards autonomous robotic skin spatial calibration: A framework based on vision and self-touch|
|Publication date:||2017|
|Appears in type:||04.01 - Conference proceedings contribution|