Touching the Void: Gestures for Auditory Interfaces
Mobile devices offer new possibilities for gesture interaction thanks to their wide range of embedded sensors and their physical form factor. In addition, advanced mobile computing capabilities now make auditory interfaces easier to support. Although different types of gesture techniques have been proposed for handheld devices, little is still known about the acceptability and use of some of these techniques, especially in the context of an auditory interface. In this paper, we propose a novel approach to the problem by studying the design space of gestures proposed by end-users for a mobile auditory interface. We discuss the results of this exploratory study in terms of the scope of the proposed gestures, their tangible aspects, and the users’ preferences. The study provides initial gesture recommendations for eyes-free auditory interfaces.
This project was a collaboration with Christina Dicke and Raphael Grasset.