Our paper, "Enhancing Augmented VR Interaction via Egocentric Scene Analysis", has been accepted by IMWUT, a top journal, and will appear in the November 2018 issue.
The paper was co-authored by researchers from the Department of Computer Science and Engineering at The Chinese University of Hong Kong, together with our very own Professor Shengdong Zhao.
Abstract
Augmented virtual reality (AVR) brings portions of the physical world into the VR world, enabling VR users to access physical objects. State-of-the-art solutions mainly focus on extracting physical objects and showing them in the VR world. In this work, we go beyond previous solutions and propose a novel approach to realizing AVR. We first analyze the physical environment in the user's egocentric view through depth sensing and deep learning, then acquire the layout and geometry of the surrounding objects, and further explore their affordances. Based on this information, we create visual guidance (a hollowed guiding path) and hybrid user interfaces (an augmented physical notepad, an LR finger slider, and an LRRL finger slider) to augment AVR interaction. Empirical evaluations showed that participants responded positively to our AVR techniques.
Authors
Yang Tian | Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China |
Chi-Wing Fu | Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China |
Shengdong Zhao | NUS-HCI Lab, National University of Singapore, Singapore |
Ruihui Li | Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China |
Xiao Tang | Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China |
Xiaowei Hu | Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China |
Pheng-Ann Heng | Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China |