Binary Controller Based on the Electrical Activity Related to Head Yaw Rotation
Zero, E.; Bersani, C.; Sacile, R.
2022-01-01
Abstract
A human machine interface (HMI) is presented to switch lights on/off according to left/right head yaw rotation. The HMI consists of a cap that acquires the brain's electrical activity (i.e., an electroencephalogram, EEG) sampled at 500 Hz on 8 channels, with electrodes positioned according to the standard 10-20 system. In addition, the HMI includes a controller based on an input-output function that computes the head position (defined as left, right, or forward with respect to the yaw angle) from short intervals (10 samples) of the signals from three electrodes positioned at O1, O2, and Cz. An artificial neural network (ANN) trained with the Levenberg-Marquardt backpropagation algorithm was used to identify the input-output function. The HMI controller was tested on 22 participants. The proposed classifier achieved an average accuracy of 88%, with a best value of 96.85%. After calibration for each specific subject, the HMI was used as a binary controller to verify its ability to switch lamps on/off according to head-turning movements. Head-movement prediction accuracy exceeded 75% in 90% of the participants when the test was performed with eyes open. When the subjects carried out the experiments with eyes closed, prediction accuracy reached 75% in 11 of the 22 participants. One participant controlled the light system with 100% success in both the eyes-open and eyes-closed experiments. The control results achieved in this work can be considered an important milestone towards humanoid neck systems.
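To make the pipeline described in the abstract concrete, the following is a minimal sketch of how 10-sample windows from the three electrodes (O1, O2, Cz) could be flattened into feature vectors and fed to a small ANN classifier of the three head positions. All data here is synthetic placeholder data, the network size is an assumption, and scikit-learn does not provide a Levenberg-Marquardt solver (as used in the paper), so L-BFGS is substituted purely for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

FS = 500          # sampling rate in Hz, as stated in the abstract
WINDOW = 10       # samples per interval, as stated in the abstract
CHANNELS = ["O1", "O2", "Cz"]

def make_windows(eeg, labels, window=WINDOW):
    """Split a (n_samples, 3) EEG array into non-overlapping 10-sample
    windows and flatten each window into a 30-dimensional feature vector.
    `labels` holds one head-position label per sample (0=left, 1=forward,
    2=right); each window takes the label of its last sample."""
    n = (len(eeg) // window) * window
    X = eeg[:n].reshape(-1, window * eeg.shape[1])
    y = labels[:n][window - 1::window]
    return X, y

# Synthetic placeholder data standing in for a per-subject calibration recording.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((6000, len(CHANNELS)))   # 12 s of fake EEG, 3 channels
labels = rng.integers(0, 3, size=6000)             # fake head-position labels

X, y = make_windows(eeg, labels)

# The paper trains the ANN with Levenberg-Marquardt backpropagation;
# scikit-learn has no LM solver, so L-BFGS is used here as a stand-in.
clf = MLPClassifier(hidden_layer_sizes=(20,), solver="lbfgs", max_iter=500)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```

In the actual system, the left/right outputs of such a classifier would be mapped to the binary switch-on/switch-off commands for the lamps after per-subject calibration; the forward class corresponds to no action.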