A Registration Framework for the Comparison of Video and Optical See-Through Devices in Interactive Augmented Reality

Ballestin G., Chessa M., Solari F.
2021-01-01

Abstract

In this paper, we present a registration framework for developing augmented reality (AR) environments in which all real elements (including the users) and virtual elements are co-localized and registered in a common reference frame. The software is provided together with this paper as a contribution to the research community. The framework allows us to perform a quantitative assessment of interaction and egocentric perception in AR environments. We assess perception and interaction in the peripersonal space through a 3D blind reaching task in a simple scenario and an interaction task in a kitchen scenario, using both video see-through (VST) and optical see-through (OST) head-worn technologies. Moreover, we carry out the same 3D blind reaching task in real conditions (without a head-mounted display and reaching real targets), which provides a baseline performance against which to compare the two AR technologies. The blind reaching task results show an underestimation of distances with OST devices and, with both OST and VST devices, smaller estimation errors for frontal spatial positions when the depth does not change, compared with the real-world baseline. Such errors are compensated in the kitchen interaction task: thanks to the egocentric viewing geometry and the specific task requirements, which constrain position perception to a table, both VST and OST achieve comparable and effective performance. Thus, our results show that such technologies have perceptual issues, yet they can be used effectively in specific real tasks. While this does not allow us to choose between VST and OST devices, it provides a baseline and a registration framework for further studies, and it emphasizes the specificity of perception in interactive AR.
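The central mechanism the abstract describes is registration: expressing poses measured by independent sensors (head-mounted display, hand tracker, target markers) in one common world reference frame, so that a reaching trial can be scored quantitatively. The following is a minimal sketch of that idea using homogeneous transforms. It is not the authors' released software; the frame names, calibration transform, and sample coordinates are illustrative assumptions.

import numpy as np

def make_pose(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3D translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def to_world(T_world_sensor: np.ndarray, p_sensor: np.ndarray) -> np.ndarray:
    """Map a 3D point from a sensor's local frame into the common world frame."""
    p_h = np.append(p_sensor, 1.0)  # homogeneous coordinates
    return (T_world_sensor @ p_h)[:3]

# Hypothetical calibration result: the hand tracker's frame expressed in the
# world frame shared with the head-mounted display (values are made up).
T_world_hand = make_pose(np.eye(3), np.array([0.10, 0.00, 0.45]))

target_world = np.array([0.30, 0.05, 0.50])    # virtual target, world frame
fingertip_hand = np.array([0.18, 0.04, 0.06])  # fingertip, hand-tracker frame

# Score one blind-reaching trial as the Euclidean distance between the
# reached fingertip position and the target, both in the world frame.
fingertip_world = to_world(T_world_hand, fingertip_hand)
reach_error = np.linalg.norm(fingertip_world - target_world)
print(f"reaching error: {reach_error:.3f} m")

Once every element lives in the same world frame, the same distance metric applies unchanged to both the simple reaching scenario and the kitchen interaction task, which is what makes the VST/OST/real-world comparison possible.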
Files in this record:
A_Registration_Framework_for_the_Comparison_of_Video_and_Optical_See-Through_Devices_in_Interactive_Augmented_Reality.pdf (open access, journal article, publisher's version, Adobe PDF, 3.31 MB)

Use this identifier to cite or link to this document: https://hdl.handle.net/11567/1070848
Citations:
  • Scopus: 14
  • Web of Science (ISI): 11