Enhancing Augmented VR Interaction via Egocentric Scene Analysis

2019.09 | IMWUT 2019

Introduction

Augmented virtual reality (AVR) brings portions of the physical world into the VR world so that VR users can access physical objects. State-of-the-art solutions mainly focus on extracting physical objects and showing them in the VR world. In this work, we go beyond previous solutions and propose a novel approach to realizing AVR. We first analyze the physical environment in the user's egocentric view through depth sensing and deep learning, acquire the layout and geometry of the surrounding objects, and then explore their affordances. Based on this information, we create visual guidance (the hollowed guiding path) and hybrid user interfaces (the augmented physical notepad, the LR finger slider, and the LRRL finger slider) to augment AVR interaction. Empirical evaluations showed that participants responded positively to our AVR techniques.
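To make the pipeline above concrete, here is a minimal, self-contained sketch of its stages: segmenting objects from an egocentric depth frame, attaching affordance labels, and planning a guiding path toward a target object. All names here (SceneObject, segment_objects, plan_guiding_path) are hypothetical placeholders for illustration, not the authors' actual implementation, and the segmentation stage is stubbed out where a trained network would run.

    # Illustrative sketch only; not the paper's implementation.
    from dataclasses import dataclass
    from typing import List, Tuple

    import numpy as np

    Point3 = Tuple[float, float, float]

    @dataclass
    class SceneObject:
        label: str                     # e.g. "desk", "notepad"
        bbox: Tuple[Point3, Point3]    # 3D bounding box: (min_xyz, max_xyz)
        affordance: str                # e.g. "supporting", "writable"

    def segment_objects(depth: np.ndarray) -> List[SceneObject]:
        """Stand-in for the deep-learning stage: a real system would run a
        trained network on the egocentric depth frame to get layout,
        geometry, and affordances. Here we return dummy objects."""
        return [
            SceneObject("desk",    ((0.0, 0.70, 0.3), (1.2, 0.75, 1.0)), "supporting"),
            SceneObject("notepad", ((0.4, 0.75, 0.5), (0.6, 0.76, 0.7)), "writable"),
        ]

    def plan_guiding_path(hand_pos: np.ndarray, target: SceneObject,
                          n_waypoints: int = 5) -> List[Point3]:
        """Sample waypoints from the hand to the target's centroid; the
        paper renders such guidance as a hollowed guiding path. A real
        planner would also account for obstacles in the scene layout."""
        lo, hi = np.array(target.bbox[0]), np.array(target.bbox[1])
        centroid = (lo + hi) / 2.0
        return [tuple(hand_pos + t * (centroid - hand_pos))
                for t in np.linspace(0.0, 1.0, n_waypoints)]

    if __name__ == "__main__":
        depth_frame = np.zeros((480, 640), dtype=np.float32)  # placeholder frame
        objects = segment_objects(depth_frame)
        target = next(o for o in objects if o.affordance == "writable")
        path = plan_guiding_path(np.array([0.0, 1.0, 0.0]), target)
        print("Guide the hand along:", path)

In the actual system, the guidance and hybrid interfaces would be rendered inside the VR headset; the sketch only shows how scene analysis output could drive them.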

Keywords

virtual reality, visual tool, deep learning, scene analysis, depth sensing, egocentric view, visual guidance, augmented VR interaction

Publication

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies

Project Info

Date:
2019-09

Authors:
Yang Tian, Chi-Wing Fu, Shengdong Zhao, Ruihui Li, Xiao Tang, Xiaowei Hu, and Pheng-Ann Heng