Investigating the sensori-motor contribution of head and trunk to spatial orientation in immersive virtual reality
ESPOSITO, DAVIDE
2022-06-13
Abstract
Spatial orientation is a multifaceted ability that emerges from perceptual and motor cues and takes the form of a fully connected hierarchical system. So far, most studies have investigated how one or a few low-level components shape a single high-level one, or how components at the same hierarchical level interact with each other. A complete understanding of how spatial orientation is encoded is therefore still far off. One of the obstacles to reaching it is the lack of affordable methodologies that can assess, in a controlled yet ecologically valid fashion, multiple sensori-motor components and different aspects of spatial orientation. The work presented in this dissertation aimed to extend the range of methodologies for such an investigation, focusing on the contribution of head- and trunk-related cues. It consisted of designing, developing, and testing two virtual reality applications that probe, in alternative ways, how multiple head- and trunk-related sensori-motor abilities, and their interactions, contribute to spatial orientation. The first application, SALLO, is a suite of tools for conducting psychophysical assessments of how specific sensori-motor cues contribute to spatial orientation encoding in virtual reality. It provides tools to design a complete psychophysical task, simplifies stimulus positioning, and offers means to guide participants' body movements without physical restraints. The second application, VRCR, is a virtual reality platform for developing and running experiments in the form of first-person-perspective archery games. It can disentangle the effects on spatial orientation encoding of head- and trunk-related cues, such as their motor control, their coordination, and their sensori-motor associations, by means of a new type of dynamic sensori-motor manipulation based on designated body-to-virtual-space maps. The usability and utility of the two VR applications were demonstrated in three experiments. The first experiment validated the SALLO suite by replicating a classic psychophysical experiment on audio-spatial perception using SALLO and a consumer-level virtual reality apparatus. The second experiment demonstrated the usability of the VRCR platform through a case study in which a VRCR-induced head-related audio-motor alteration affected early blind adults more than sighted blindfolded adults, confirming that the brain uses head-related audio-motor cues to compensate for the role of early vision in auditory space calibration. The third experiment used SALLO and VRCR together to investigate the type of spatial reasoning sighted adults adopted in VRCR with and without the audio-motor alteration, and found that, while the VRCR baseline task required egocentric reasoning, the audio-motor alteration triggered allocentric reasoning. This result shows that audio-spatial egocentric and allocentric representations are not distinct constructs and that head-related sensori-motor cues may be the bridge between them. Altogether, the work presented in this dissertation demonstrates that SALLO and VRCR are valuable and affordable tools that extend the range of methodologies for assessing the contribution of head- and trunk-related sensori-motor cues to spatial orientation.
In the future, the possibility that VRCR offers to probe allocentric reasoning with an egocentric task could pave the way for new protocols that employ SALLO and VRCR themselves for the evaluation and training of allocentric abilities in people with poor spatial abilities, such as early blind individuals.
File | Access | Type | Size | Format
---|---|---|---|---
phdunige_4622994.pdf | Open access | Doctoral thesis | 5.69 MB | Adobe PDF