DCNN for Tactile Sensory Data Classification based on Transfer Learning
Alameh M.; Ibrahim A.; Valle M.; Moser G.
2019-01-01
Abstract
Tactile data processing and analysis remain essentially an open challenge. In this framework, we demonstrate a method for touch modality classification using pre-trained convolutional neural networks (CNNs). The 3D tensorial tactile data generated by real human interactions on an electronic skin (E-Skin) are transformed into 2D images. Using a transfer learning approach formalized through a CNN, we address the challenging task of recognizing the object touched by the E-Skin. The feasibility and efficiency of the proposed method are demonstrated on a real tactile dataset, outperforming classification results obtained with the same dataset in the literature. © 2019 IEEE.
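The abstract describes a two-step pipeline: each 3D tactile tensor (taxel rows × taxel columns × time samples) is mapped to a 2D image, which is then classified by a pre-trained CNN whose classification head is replaced and retrained (transfer learning). The sketch below is a minimal illustration of that idea, not the authors' implementation: the temporal-average mapping in `tactile_tensor_to_image`, the ResNet-18 backbone, and the value of `NUM_CLASSES` are all assumptions introduced here for illustration.

```python
# Minimal sketch (assumed, not the authors' code): convert a 3D tactile tensor
# (rows x cols x time) to a 2D image and classify it with an ImageNet-pre-trained
# CNN whose final layer is replaced and retrained (basic transfer learning).
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models

NUM_CLASSES = 3  # assumed number of touch modalities / object classes

def tactile_tensor_to_image(tactile: torch.Tensor) -> torch.Tensor:
    """Map a (rows, cols, time) tactile tensor to a 3x224x224 image tensor.
    The temporal-average mapping is one simple choice, not necessarily the
    transformation used in the paper."""
    img = tactile.mean(dim=-1)                                  # (rows, cols)
    img = (img - img.min()) / (img.max() - img.min() + 1e-8)    # scale to [0, 1]
    img = img.unsqueeze(0).repeat(3, 1, 1)                      # replicate to 3 channels
    return F.interpolate(img.unsqueeze(0), size=(224, 224),
                         mode="bilinear", align_corners=False).squeeze(0)

# Pre-trained backbone with frozen feature extractor; only the new head is trained.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Example forward pass on a dummy 4x4 E-Skin patch sampled over 100 time steps.
dummy = torch.rand(4, 4, 100)
logits = model(tactile_tensor_to_image(dummy).unsqueeze(0))     # shape: (1, NUM_CLASSES)
```

During fine-tuning, only `model.fc` would be optimized (e.g., with cross-entropy loss) on batches of converted tactile images, which is the freeze-and-retrain form of transfer learning the abstract refers to.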
| File | Description | Type | Size | Format | Access |
|---|---|---|---|---|---|
| IEEE PRIME 2019 Alameh.pdf | Post-print file | Publisher's version | 1.03 MB | Adobe PDF | Closed access (copy on request) |