LUI: Lip in Multimodal Mobile GUI Interaction

2012.10 | ICMI 2012

Introduction

Gesture-based interactions are commonly used in mobile and ubiquitous environments. Multimodal interaction techniques have used lip gestures to enhance speech recognition or to control mouse movement on the screen. In this paper, we extend previous work to explore LUI: lip gestures as an alternative input technique for controlling user interface elements in a ubiquitous environment. In addition to using lips to control cursor movement, we use lip gestures to control music players and activate menus. A LUI Motion-Action library is also provided to guide future interaction design using lip gestures.
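The abstract describes binding recognized lip motions to UI actions such as cursor movement, music-player control, and menu activation via a Motion-Action library. A minimal sketch of such a mapping is shown below; it is purely illustrative and not the authors' implementation, and all motion names and action strings are hypothetical.

```python
# Hypothetical sketch of a Motion-Action mapping: each recognized lip
# motion name is bound to a UI action callback. Not the paper's code.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class LipMotion:
    """A recognized lip gesture, e.g. a directional stretch or an open/close."""
    name: str


class MotionActionLibrary:
    """Registry that maps lip-motion names to UI-action callbacks."""

    def __init__(self) -> None:
        self._bindings: Dict[str, Callable[[], str]] = {}

    def bind(self, motion: str, action: Callable[[], str]) -> None:
        """Associate a motion name with a UI action."""
        self._bindings[motion] = action

    def dispatch(self, motion: LipMotion) -> str:
        """Run the action bound to the given motion, if any."""
        action = self._bindings.get(motion.name)
        return action() if action else "unhandled"


# Example bindings mirroring the interactions named in the abstract
# (cursor movement, music-player control, menu activation).
lib = MotionActionLibrary()
lib.bind("stretch-left", lambda: "cursor: move left")
lib.bind("open-close", lambda: "music: play/pause")
lib.bind("pout-hold", lambda: "menu: activate")

print(lib.dispatch(LipMotion("open-close")))  # → music: play/pause
```

Keeping the gesture recognizer and the action bindings separate, as sketched here, is what lets a single motion vocabulary drive different applications (cursor, music player, menus).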

Keywords

multimodal mobile interaction, gesture input, LUI

Publication

Proceedings of the 14th ACM International Conference on Multimodal Interaction

Project Info

Date:

2012-10

Authors:
Maryam Azh and Shengdong Zhao