
Enhancing Human-Robot Collaboration Through Advanced Human Activity Recognition and Motion Tracking

BELCAMINO, VALERIO
2025-06-10

Abstract

The primary objective of this research is to enhance human activity recognition (HAR), with a particular focus on recognizing, tracking, and reconstructing human motion within human-robot collaboration (HRC) scenarios. The investigation began with a thorough examination of the HAR literature to identify the main approaches in the field. This initial study identified a variety of strategies, including vision-based and wearable-sensor approaches, each with distinct strengths and limitations. We chose to center our work on IMU sensors because their small size and precise measurements make them particularly suited to the design of modular wearable motion capture systems. These factors, together with their robustness to occlusion, make IMUs ideal for HRC, where real-time tracking is key to an effective and responsive interaction. We then conducted a systematic review of the use of IMUs in HAR, examining existing methods to highlight the open challenges in the field: sensor misalignment, interoperability across different IMU devices, lack of standardization and reusability in hardware design, and limited data. To address these challenges, we collaborated on the development and benchmarking of a modular, IMU-based motion capture system optimized for hand tracking (TER Glove). The whole project was open sourced so that the academic community can replicate and customize the device. Furthermore, to address misalignment and interoperability, we adopted a sim2real approach, developing a simulated environment that generates synthetic IMU data from prerecorded human activities. Alongside motion tracking, we integrated the system with tactile sensing from an external device to examine the impact of combined tactile and motion information on gesture recognition, obtaining more accurate classification of most touch-related actions.
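The abstract does not describe how the simulated environment derives IMU signals from prerecorded activities. A common recipe, and a plausible reading of the sim2real pipeline, is to differentiate a recorded pose trajectory: the gyroscope reading is the body-frame angular velocity obtained from successive orientations, and the accelerometer reading is the specific force (world acceleration minus gravity, rotated into the sensor frame). The sketch below is an illustration under those assumptions; the function and helper names (`synthesize_imu`, `rotate_to_body`) are hypothetical, not taken from the thesis.

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of two quaternions in [w, x, y, z] order."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_conj(q):
    return np.array([q[0], -q[1], -q[2], -q[3]])

def rotate_to_body(q, v):
    """Rotate a world-frame vector into the sensor frame: R(q)^T v."""
    vq = np.array([0.0, *v])
    return quat_mul(quat_conj(q), quat_mul(vq, q))[1:]

def synthesize_imu(positions, quats, dt, g=np.array([0.0, 0.0, -9.81])):
    """Synthetic IMU readings from a prerecorded pose trajectory.

    positions: (N, 3) world-frame sensor positions in metres.
    quats:     (N, 4) unit quaternions [w, x, y, z], sensor-to-world.
    """
    # Accelerometer measures specific force: (a_world - g) in the body frame.
    a_world = np.gradient(np.gradient(positions, dt, axis=0), dt, axis=0)
    accel = np.array([rotate_to_body(q, a - g) for q, a in zip(quats, a_world)])
    # Gyroscope: body-frame angular rate from successive orientations,
    # using the small-angle approximation omega ~= 2 * vec(q_k^-1 * q_k+1) / dt.
    gyro = np.zeros((len(quats), 3))
    for k in range(len(quats) - 1):
        dq = quat_mul(quat_conj(quats[k]), quats[k + 1])
        gyro[k] = 2.0 * dq[1:] / dt
    gyro[-1] = gyro[-2]
    return accel, gyro
```

For a stationary, level sensor this yields the expected readings: zero angular rate and roughly +9.81 m/s² on the vertical accelerometer axis. Virtual sensors placed at different body locations in the simulation would each get their own pose trajectory, which is what makes such an approach attractive for studying misalignment and cross-device interoperability.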
In the area of human motion reconstruction, we focused on replicating in-hand manipulation tasks. In particular, we extracted common patterns from human motion capture recordings and combined them to generate synthetic dexterous manipulation movements. Lastly, we integrated HAR techniques into real-world HRC scenarios: by coupling the modular IMU-based motion capture system with advanced gesture recognition capabilities, we achieved a more flexible and smoother interaction between humans and robots. The literature review and the benchmarking framework proposed for our TER Glove device can facilitate the reproducibility of hardware and applications, enabling researchers to build on existing work more effectively. The sim2real approach developed in this research will also be open sourced so that other research groups can contribute their own data, making the systems more robust and flexible. Future work could explore the integration of federated learning techniques, allowing for privacy-preserving collaboration across multiple users and settings. The integration of HAR in HRC can be expanded to encompass multi-robot and multi-human scenarios, extending its application to a broader range of tasks. Lastly, incorporating wearable tactile sensors could enhance the capability of classification systems to recognize and respond to touch-related actions, improving synchronization in collaborative tasks.
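The abstract mentions extracting common patterns from motion capture recordings and recombining them into synthetic manipulation movements, without naming the method. In the hand-motion literature this is often done with PCA over joint-angle data (postural "synergies"): the mean posture plus a few principal components span the space of recorded grasps, and weighted blends of those components generate new postures. The sketch below illustrates that general idea only; the function names and the two-synergy choice are assumptions, not details from the thesis.

```python
import numpy as np

def extract_synergies(grasps, n_synergies=2):
    """PCA over joint-angle frames.

    grasps: (n_frames, n_joints) array of recorded joint angles.
    Returns the mean posture and the top `n_synergies` principal
    directions (rows of shape (n_synergies, n_joints)).
    """
    mean = grasps.mean(axis=0)
    centered = grasps - mean
    # SVD of the centered data; rows of vt are principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_synergies]

def synthesize_posture(mean, synergies, weights):
    """Blend synergies with scalar weights to generate a new posture."""
    return mean + np.asarray(weights) @ synergies
```

Interpolating the weights over time, rather than blending single postures, would yield whole synthetic movements; a low-dimensional synergy space also keeps generated motions close to the manifold of plausible human hand configurations.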
Robotics; Artificial Intelligence; Wearable Sensors; Human Activity Recognition; Human Robot Interaction; HRI
File attached to this record:

phdunige_4220471.pdf (open access)
Type: Doctoral thesis
Size: 8.79 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11567/1250636