
Human action recognition using an image-based temporal and spatial representation

Vercelli G.
2020

Abstract

Researchers have been using different technological solutions (platforms) as intervention tools with children with Autism Spectrum Disorder (ASD), who typically present difficulties in engaging and interacting with their peers. Social robots, one example of these technological solutions, are often unaware of their game partners, preventing the automatic adaptation of their behaviour to the user. Therefore, enriching the interaction between the user and the platform, while lightening the cognitive burden on the human operator, may be a valuable contribution. One piece of information that can be used to enrich this interaction and, consequently, adapt the system's behaviour is the recognition of different user actions through skeleton pose data from depth sensors. The present work proposes a method to automatically detect, in real time, typical and stereotypical actions of children with ASD by using the Intel RealSense and the Nuitrack SDK to detect and extract the user's joint coordinates. A Convolutional Neural Network model trained on the different actions is used to classify the different patterns of behaviour. The model achieved an average accuracy of 92.6±0.5% on the test data. The entire pipeline runs on average at 31 FPS.
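The title describes an image-based temporal and spatial representation of skeleton data. A common way to realise this idea is to stack a window of per-frame joint coordinates into an image-like tensor (rows = time, columns = joints, channels = x/y/z) that a CNN can then classify. The sketch below illustrates that encoding step only; the joint count, window length, and min–max normalisation are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np

NUM_JOINTS = 19   # assumed joint count, in line with Nuitrack-style trackers
WINDOW = 32       # assumed number of consecutive frames per classification window

def encode_window(frames: np.ndarray) -> np.ndarray:
    """Map a (WINDOW, NUM_JOINTS, 3) array of (x, y, z) joint coordinates
    to a float32 tensor scaled to [0, 1].

    Rows encode time and columns encode joints, so a downstream CNN can
    learn both spatial (joint layout) and temporal (motion) patterns.
    """
    assert frames.shape == (WINDOW, NUM_JOINTS, 3)
    lo = frames.min(axis=(0, 1), keepdims=True)   # per-axis minimum over the window
    hi = frames.max(axis=(0, 1), keepdims=True)   # per-axis maximum over the window
    span = np.where(hi - lo > 0, hi - lo, 1.0)    # guard against division by zero
    return ((frames - lo) / span).astype(np.float32)

# Usage: random poses stand in for a live depth-sensor stream.
rng = np.random.default_rng(0)
window = rng.normal(size=(WINDOW, NUM_JOINTS, 3))
image = encode_window(window)
print(image.shape)  # each window becomes one fixed-size "image" for the classifier
```

In a real-time setting, a ring buffer of the most recent WINDOW frames would be re-encoded and classified as new frames arrive, which is consistent with the per-window throughput the abstract reports.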

Use this identifier to cite or link to this document: http://hdl.handle.net/11567/1069580