EDITalk: Towards Designing Eyes-free Interactions for Mobile Word Processing

2018.04 | CHI 2018


We present EDITalk, a novel voice-based, eyes-free word processing interface. We used a Wizard-of-Oz elicitation study to investigate the viability of eyes-free word processing in the mobile context and to elicit user requirements for such scenarios. Results showed that users desire meta-level operations like highlight and comment, as well as core operations like insert, delete, and replace. However, users were challenged by the lack of visual feedback and by the cognitive load of remembering text while editing it. We then studied a commercial-grade dictation application and discovered serious limitations that preclude comfortable speak-to-edit interactions. We address these limitations through EDITalk's closed-loop interaction design, enabling eyes-free operation of both meta-level and core word processing operations in the mobile context. Finally, we discuss implications for the design of future mobile, voice-based, eyes-free word processing interfaces.


Eyes-free interfaces; Voice-based word processing; Barge-in; Eyes-free interaction design; Conversational UI


Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems




Debjyoti Ghosh, Pin Sym Foong, Shengdong Zhao, Di Chen, Morten Fjeld