Pervasive Touch, DFG
Methods for Designing Robust Microgestures for Touch-based Interaction
Funded by the DFG
Duration: 2023-2026
Principal Investigators
Prof. Dr. Niels Henze, University of Regensburg
Prof. Dr.-Ing. Katrin Wolf, Berlin University of Applied Sciences and Technology
Current smart objects are mainly controlled through smart speakers or mobile devices. While this can be advantageous for some smart objects, smart tools require explicit and seamless interaction while the device is in use. Touch interaction is the natural candidate, as it requires no additional devices. Touch gestures in particular are promising because they are easy to learn, not bound to specific locations, do not require visual feedback, and do not affect the surroundings. Research on gestural interaction relies heavily on designing gestures through elicitation studies. Traditional mobile devices are designed so that they can be held naturally while being used: half of their surface is reserved for touch interaction and the other half for securely holding them. The surface of smart tools, however, cannot be divided into such discrete regions, as the tools' grip often overlaps with their user interface. Gestures for smart tools therefore pose what has been coined the Midas touch problem: the challenge of distinguishing between holding the object and activating functions through gestures. Current methods for designing gestures do not consider whether the gestures can be recognized or discriminated from holding the device.

Therefore, this project aims to advance methods for designing touch-based gestures, with a focus on smart tools, by integrating the discriminability of explicit interaction from the natural grasp, as well as gesture recognizability, deeply into the design process. We will extend the well-established elicitation method by introducing a discriminability and a recognizability score. The discriminability score describes how well a gesture can be distinguished from naturally holding and using a device; the recognizability score describes how well a gesture can be recognized. Both scores can easily be combined with the established agreement score resulting from traditional elicitation studies.
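To illustrate the kind of combination described above: the agreement score from elicitation studies is commonly computed per referent as the sum of squared proportions of participants proposing each distinct gesture. The sketch below follows that formulation and combines it with hypothetical discriminability and recognizability values; the weighting scheme and the function names are illustrative assumptions, not the project's actual method.

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement for one referent: sum of squared proportions of
    identical gesture proposals, as commonly used in elicitation studies."""
    counts = Counter(proposals)
    n = len(proposals)
    return sum((c / n) ** 2 for c in counts.values())

def combined_score(agreement, discriminability, recognizability,
                   weights=(1 / 3, 1 / 3, 1 / 3)):
    """Hypothetical weighted combination of the three scores, each
    assumed to lie in [0, 1]; the actual combination rule is left
    open in the project description."""
    wa, wd, wr = weights
    return wa * agreement + wd * discriminability + wr * recognizability

# Example: 10 participants propose gestures for a referent such as
# "increase speed" on a cordless screwdriver (illustrative data).
proposals = ["swipe-up"] * 6 + ["double-tap"] * 3 + ["squeeze"]
a = agreement_score(proposals)  # 0.6^2 + 0.3^2 + 0.1^2 = 0.46
```

A gesture with high agreement but low discriminability (i.e., one that resembles simply holding the tool) would then rank lower than the agreement score alone suggests.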
This will allow us to design gestures that are unaffected by the Midas touch problem and thus truly usable, while remaining robust and recognizable. Within the project, we will demonstrate the new method by designing gestures for three exemplary smart objects: a drilling machine, a cordless screwdriver, and a die grinder. After prototypically equipping the three device types with capacitive sensors to recognize on-device gestures, we will use them to collect capacitive data of naturally holding and using the devices. The collected dataset will reveal where gestures overlap with natural interaction and thereby allow us to derive a discriminability and a recognizability score for a given gesture. We will conduct elicitation studies incorporating the scores and evaluate the resulting gesture sets. By studying the differences between gestures designed through established approaches and gestures designed through our method, we will showcase an improved way to design interaction with handheld devices.
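One simple way to derive a discriminability score from such a dataset would be to measure how often a gesture's capacitive frames fall close to frames recorded while the tool was merely held and used. The sketch below is a minimal illustration under that assumption, using a nearest-neighbor distance threshold; the project's actual score definition, sensor layout, and threshold are not specified here and all values are made up.

```python
import numpy as np

def discriminability(gesture_frames, grasp_frames, threshold=0.5):
    """Hypothetical discriminability score: the fraction of gesture
    frames (capacitive sensor readings) that are farther than a
    threshold from every natural-grasp frame. 1.0 means no overlap
    with natural holding; 0.0 means full overlap."""
    # Pairwise Euclidean distances between gesture and grasp frames.
    d = np.linalg.norm(
        gesture_frames[:, None, :] - grasp_frames[None, :, :], axis=-1)
    nearest = d.min(axis=1)  # closest grasp frame per gesture frame
    return float((nearest > threshold).mean())

# Toy example with 3-channel capacitive readings (illustrative values):
# one tap frame is clearly distinct from grasping, one is not.
grasp = np.array([[0.1, 0.2, 0.1], [0.2, 0.1, 0.2]])
tap = np.array([[0.9, 0.8, 0.9], [0.15, 0.2, 0.1]])
score = discriminability(tap, grasp)  # 0.5: half the tap frames overlap
```

A gesture whose frames are indistinguishable from the natural grip would score near zero and be flagged as prone to Midas touch, regardless of its agreement score.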