EyesWeb – Toward Gesture and Affect Recognition in Interactive Dance and Music Systems
Camurri, Antonio; Volpe, Gualtiero
2000-01-01
Abstract
This paper presents the EyesWeb project. The goal of the EyesWeb project is to develop a modular system for the real-time analysis of body movement and gesture. Such information can be used to control and generate sound, music, and visual media, and to control actuators (e.g., robots). Another goal of the project is to explore and develop models of interaction by extending music language toward gesture and visual languages, with a particular focus on the understanding of affect and expressive content in gesture.