Biological movement detector enhances the attentive skills of humanoid robot iCub

Alessia Vignolo, Francesco Rea, Nicoletta Noceti, Francesca Odone, Giulio Sandini

Abstract

Detecting human activity in the scene is a fundamental skill for robotics. Most current methods of human detection from video analysis rely on appearance or shape features, and thus exhibit severe limitations as clutter or scene complexity grows, for instance when humans are using tools. We propose a way to overcome these limitations by exploiting a motion-based human detection system that relies on the regularities of human kinematics. Our method, which is implemented as an open-source software module and integrated in the humanoid robot iCub software framework, extracts relevant features of biological motion in a computationally efficient way and feeds them to the attentional system of the robot. As a result, the robot can rapidly direct its attention toward the human agents in the scene, even when they are hidden or partially covered by the tools they are using. The paper describes in detail the software framework supporting the autonomous learning of a discrimination policy between biological and non-biological motion observed in a scene. It then provides a quantitative validation of the classification performance both on a batch dataset acquired by the iCub cameras and in a scenario where training and classification are performed online. Finally, it presents experiments on the integration of the module with the iCub attention system, demonstrating the ability of the robot to selectively and rapidly redeploy its fixation point on the human activity in the scene. The experimental results show that the proposed system can reliably enable the robot to focus its attention on human activity, a fundamental first step toward a deeper understanding of the observed action and a careful planning of an interaction strategy with a human partner.
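To make the pipeline described in the abstract concrete, the following is a minimal illustrative sketch, not the authors' implementation (which is a C++ module in the iCub/YARP framework). It assumes dense optical flow (OpenCV Farnebäck) as the motion cue, simple hypothetical velocity/direction statistics as features, and an online linear classifier (scikit-learn SGDClassifier with partial_fit) standing in for the learned discrimination policy between biological and non-biological motion; the centroid of biologically classified motion stands in for the target passed to the attention system.

import cv2
import numpy as np
from sklearn.linear_model import SGDClassifier

def motion_features(prev_gray, gray):
    """Summarise the dominant motion between two grayscale frames as a small feature vector."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    moving = mag > 1.0                       # hypothetical threshold: keep clearly moving pixels
    if not moving.any():
        return None, None
    speeds = mag[moving]
    angles = ang[moving]
    # Hypothetical features: speed statistics and mean direction components,
    # intended to capture the smooth velocity modulation typical of human movement.
    feats = np.array([speeds.mean(), speeds.std(),
                      np.cos(angles).mean(), np.sin(angles).mean()])
    ys, xs = np.nonzero(moving)
    centroid = (int(xs.mean()), int(ys.mean()))   # candidate fixation point in image coordinates
    return feats, centroid

clf = SGDClassifier(loss="log_loss")             # online logistic regression, trained incrementally

def train_step(feats, label):
    """label: 1 = biological motion (human), 0 = non-biological (e.g. a rolling object)."""
    clf.partial_fit(feats.reshape(1, -1), [label], classes=[0, 1])

def attend(feats, centroid):
    """Return a gaze target only when the observed motion is classified as biological."""
    if clf.predict(feats.reshape(1, -1))[0] == 1:
        return centroid                           # in the real system this would drive the gaze controller
    return None

In the actual module the features, classifier, and attention interface differ; the sketch only illustrates the feature extraction, online learning, and attention-redirection stages that the abstract describes.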
Year: 2016
ISBN: 9781509047185

Use this identifier to cite or link to this document: https://hdl.handle.net/11567/864127
Citations: Scopus 13