Independent motion detection aims at identifying elements in the scene whose apparent motion is not due to the robot's egomotion. In this work, we propose a method that learns the input-output relationship between the robot's motion, described by the position and orientation sensors embedded in the robot, and the sparse visual motion detected by the cameras. We detect independent motion by observing discrepancies (anomalies) between the perceived motion and the motion expected given the sensor readings. We then perform a higher-level analysis based on the available disparity map, from which we obtain a dense profile of the objects moving independently of the robot. We implemented the proposed pipeline on the iCub humanoid robot and report a thorough experimental analysis covering typical laboratory settings, where the effectiveness of the method is demonstrated. The analysis shows, in particular, the robustness of the method to scene and object variations and to different kinds of robot movements.
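The anomaly test at the core of such a pipeline can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the function name, the residual metric, and the threshold value are all assumptions, and the predicted flow is taken as given (in the paper it comes from a model learned from the robot's position and orientation sensors).

```python
import numpy as np

def detect_independent_motion(observed_flow, predicted_flow, threshold=2.0):
    """Flag sparse flow vectors whose residual w.r.t. the ego-motion
    prediction exceeds `threshold` (pixels/frame).

    Points whose observed motion deviates from the motion expected
    from the robot's own movement are treated as anomalies, i.e.
    candidate independently moving elements. Both inputs are (N, 2)
    arrays of 2D flow vectors for N tracked points.
    """
    residual = np.linalg.norm(observed_flow - predicted_flow, axis=1)
    return residual > threshold

# Toy example: four tracked points; the last one carries extra motion
# on top of what the ego-motion model predicts.
predicted = np.array([[1.0, 0.0]] * 4)   # flow expected from ego-motion
observed = predicted.copy()
observed[3] += np.array([5.0, 3.0])      # independent motion on point 3
mask = detect_independent_motion(observed, predicted)
print(mask)  # only point 3 is flagged
```

In a full system, the flagged sparse points would then be expanded into a dense object profile using the disparity map, as the abstract describes.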
|Title:||Object segmentation using independent motion detection|
|Publication date:||2015|
|Appears in type:||04.01 - Contribution in conference proceedings|