Implicit communication to convey and perceive object properties for human robot interaction

LASTRICO, LINDA
2023-05-08

Abstract

As human beings, social connections play a crucial role in our lives, and our interactions are built on communication. Much of the information we exchange is conveyed non-verbally, through what can be described as implicit communication. Indeed, we are exceptionally skilled at reading the unspoken in other people's actions: gaze direction, the rhythm of gestures, or body posture reveal, even unintentionally, the intentions, mood, or urgency behind others' actions. Likewise, when we manipulate objects, the way we handle them can make explicit otherwise hidden characteristics, such as weight or fragility. My Ph.D. research focuses on the implicit communication of object properties in the context of human-robot interaction. Collaborative robots can benefit significantly from exploiting implicit communication channels as humans do, allowing for more natural, spontaneous, and safe interactions. The main scientific question I aim to answer is whether it is possible to automatically detect features of carried objects from human movements and, conversely, to communicate the same information through the robot's embodiment. My approach exploits the kinematic modulations that naturally occur when transporting objects: it is precisely the way we interact with objects that makes their characteristics understandable to an external observer, resolving at the same time potential misunderstandings due to shape, size, or occlusions. To detect and study the kinematics of human actions, I used and compared motion capture systems, inertial sensors, and cameras, applying machine learning algorithms to classify the observed motion. To reproduce communicative movements on the robot, I controlled the end-effector kinematics, modulating its velocity to follow synthetic profiles automatically generated by models trained on real human examples. As for the property to detect and express, I focused mainly on the carefulness associated with object manipulation, i.e., assessing and implicitly communicating whether caution is required to handle the carried item. The findings show that features such as movement velocity, retrieved with various sensors, make it possible to classify online whether an action is careful or not. A generative strategy for producing robot motions successfully conveys the intended carefulness and can be applied to different trajectories and robots, including non-humanoid ones. Moreover, the modulation of the robot's actions induces a spontaneous motor adaptation in how participants perform their own tasks, matching the robot's attitude. These findings demonstrate that information can be exchanged with robots through implicit cues embedded in actions, opening a communication channel that relies on a core human interaction ability. Given the adaptability of the approach to different robots and the non-invasive sensing methods, industrial applications seem feasible beyond social robotics. The ability of a robot to automatically perceive and express the carefulness feature could improve the safety and efficiency of collaborative object manipulation tasks. Future developments of this work may include other object properties in the framework, such as weight or temperature, and could exploit additional cues beyond kinematics to expand the possible fields of application.
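
As a concrete illustration of the detection side described above, the minimal sketch below shows one possible way to classify a transport movement online as careful or not careful from velocity features. It is not the thesis implementation: the feature set (peak speed, mean speed, duration), the logistic-regression classifier, and the minimum-jerk-like toy trajectories used for training are all illustrative assumptions.

# Minimal sketch (illustrative only, not the thesis pipeline): classify a
# transport movement as "careful" vs "not careful" from its velocity profile.
# Feature choice, classifier, and the toy training data are assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression


def velocity_features(positions: np.ndarray, dt: float) -> np.ndarray:
    """Simple kinematic features from a (T, 3) Cartesian trajectory sampled at period dt."""
    velocities = np.gradient(positions, dt, axis=0)   # finite-difference velocity, shape (T, 3)
    speed = np.linalg.norm(velocities, axis=1)        # scalar speed per sample
    return np.array([speed.max(), speed.mean(), len(positions) * dt])


rng = np.random.default_rng(0)

def toy_trajectory(careful: bool, dt: float = 0.01) -> np.ndarray:
    """Straight-line reach with a bell-shaped speed profile; careful movements are slower and longer."""
    duration = rng.uniform(1.5, 2.5) if careful else rng.uniform(0.6, 1.2)
    tau = np.arange(0.0, duration, dt) / duration
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5        # minimum-jerk position fraction along the path
    return np.outer(s, np.array([0.4, 0.0, 0.2]))     # 40 cm forward, 20 cm upward

# Train on labelled toy examples (1 = careful, 0 = not careful).
labels = [1] * 50 + [0] * 50
X = np.array([velocity_features(toy_trajectory(bool(c)), 0.01) for c in labels])
y = np.array(labels)
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Online use: classify a newly observed movement from its features.
observed = velocity_features(toy_trajectory(True), 0.01)
print("careful" if clf.predict(observed[None, :])[0] == 1 else "not careful")

In the thesis, comparable kinematic features are extracted from motion capture, inertial, and camera data, and the expressive side relies on synthetic velocity profiles generated by models trained on human examples; the fixed bell-shaped profile above is only a stand-in for producing toy data.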
Keywords: implicit communication; human-robot interaction; carefulness; human kinematics; object properties detection; communicative robot movements; human motion
Files in this record:
phdunige_4043678.pdf (doctoral thesis, Adobe PDF, 7.17 MB). Open Access from 09/05/2024.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11567/1116835