This thesis addresses two extreme ends of speech perception in cognitive neuroscience. At one end, it examines the brain responses of a single, isolated listener to acoustic stimuli lacking articulatory cues; at the other, it explores the neural mechanisms that emerge when speech is embedded in a genuine conversational interaction. Studying these two extremes requires rather different methodological approaches. The first has seen the consolidation of a wide variety of experimental designs and analytical methods, whereas the investigation of the brain processes underlying speech during conversation is still in its infancy, and several technical and methodological challenges remain to be solved. In this thesis, I first present an EEG study using a classical attentive speech-listening task, analyzed with recent methodological advances that explicitly target neural entrainment to the oscillatory properties of speech. I then report on the work I carried out to design a robust speech-based interactive task, to extract acoustic and articulatory indexes of the interaction, and to identify the EEG correlates of its word-level dynamics. Altogether, this work suggests that motor processes play a critical role both in attentive speech listening and in guiding mutual speech accommodation: on the one hand, the motor system reconstructs information that is missing from the sensory domain; on the other, it drives our implicit tendency to adapt our speech production to the conversational partner and to the interactive dynamics.
|Thesis title:||Sensorimotor processes in speech listening and speech-based interaction|
|Defense date:||22-feb-2019|
|Appears in the item types:||Doctoral thesis|