Human action recognition using an image-based temporal and spatial representation

Vercelli G.
2020-01-01

Abstract

Researchers have been using different technological solutions (platforms) as intervention tools with children with Autism Spectrum Disorder (ASD), who typically present difficulties in engaging and interacting with their peers. Social robots, one example of these technological solutions, are often unaware of their game partners, preventing the automatic adaptation of their behaviour to the user. Therefore, enriching the interaction between the user and the platform, lightening the cognitive burden on the human operator, may be a valuable contribution. One piece of information that can be used to enrich this interaction and, consequently, adapt the system's behaviour is the recognition of different user actions through skeleton pose data from depth sensors. The present work proposes a method to automatically detect, in real time, typical and stereotypical actions of children with ASD by using the Intel RealSense and the Nuitrack SDK to detect and extract the user's joint coordinates. A Convolutional Neural Network model trained on the different actions is used to classify the different patterns of behaviour. The model achieved an average accuracy of 92.6±0.5% on the test data. The entire pipeline runs on average at 31 FPS.
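The title's "image-based temporal and spatial representation" suggests encoding a sequence of skeleton joint coordinates as an image-like array that a CNN can classify. The paper's exact encoding is not given here, so the following is only a minimal sketch of that general idea: rows as time steps, columns as joints, and the three channels as the x/y/z axes, min-max normalized to 8-bit values. The function name, array shapes, and joint count are assumptions for illustration (19 joints is the skeleton size Nuitrack commonly reports), not the authors' implementation.

```python
import numpy as np

def skeleton_to_image(frames, lo=None, hi=None):
    """Encode a skeleton pose sequence as an image-like uint8 array.

    frames: array of shape (T, J, 3) -- T time steps, J joints, (x, y, z).
    Each axis is min-max normalized to [0, 255] independently, producing a
    (T, J, 3) "image": rows are time, columns are joints, channels are axes.
    """
    frames = np.asarray(frames, dtype=np.float64)
    # Per-axis bounds, either supplied (e.g. from the training set) or
    # computed from this sequence alone.
    lo = frames.min(axis=(0, 1)) if lo is None else lo
    hi = frames.max(axis=(0, 1)) if hi is None else hi
    span = np.where(hi - lo == 0, 1.0, hi - lo)  # avoid division by zero
    scaled = (frames - lo) / span
    return (scaled * 255).astype(np.uint8)

# Toy example: 30 frames of 19 joints with random 3D coordinates.
rng = np.random.default_rng(0)
img = skeleton_to_image(rng.random((30, 19, 3)))
print(img.shape, img.dtype)  # (30, 19, 3) uint8
```

An array of this shape could then be fed to a small CNN classifier, one class per action; fixing `lo`/`hi` from training data keeps the encoding consistent across sequences.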
Year: 2020
ISBN: 978-1-7281-9281-9
Files for this item:
Human_action_recognition_using_an_image-based_temporal_and_spatial_representation.pdf

Access: closed access
Description: Conference proceedings contribution
Type: Published (editorial) version
Size: 1.17 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11567/1069580
Citations
  • PMC: n/a
  • Scopus: 1
  • Web of Science: 0