Even though three-dimensional representations of architectural models exist, experiencing these models as one would experience a fully constructed building is still a major challenge. With Virtual Reality (VR) it is now possible to experience a number of scenarios in a virtual environment. Prototyping interactive architectural elements, which might otherwise be very expensive, also becomes possible. […]
With the advent of lifelogging cameras, the amount of personal video material is growing massively, to an extent that easily overwhelms the user. To efficiently review lifelog data, we need well-designed video navigation tools. In this project, we analyze which cues are most beneficial for lifelog video navigation. We show that the information […]
We investigate intimate proxemic zones of exhibits, which are zones that visitors will respect and hence not enter. In a first experiment we show that exhibits have, like humans, an intimate proxemic zone that is respected by exhibition visitors and which measures 27 cm. A zone around an exhibit that visitors will not enter can be […]
Tangible user interfaces (TUIs) have been proposed to interact with digital information through physical objects. However, despite being investigated for decades, TUIs still play a marginal role compared to other UI paradigms. This is at least partially because TUIs often involve complex hardware elements, which make prototyping and production in quantity difficult and expensive. In […]
The work presented here aims to enrich material perception when touching interactive surfaces. This is realized by simulating changes in the perception of various material properties, such as softness and bendability. The resulting perceptual illusions of surface changes are induced using electrotactile stimuli and texture projection as touch/pressure feedback. A metal plate with […]
We frequently switch between devices, and currently we have to unlock most of them. Ideally such devices should be seamlessly accessible and not require an unlock action. We introduce PickRing, a wearable sensor that allows seamless interaction with devices by predicting the intention to interact with them from detecting that the device is being picked up. A […]
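Purely as an illustration (PickRing's actual recognition pipeline is not detailed in this excerpt), a minimal sketch of how a pick-up-like event might be detected from a worn motion sensor, assuming 3-axis accelerometer samples in g and a simple moving-variance threshold; all names and threshold values below are hypothetical:

```python
import math
from collections import deque

# Hypothetical parameters; real values would have to be tuned on recorded pick-up data.
WINDOW = 20                # samples in the sliding window (~0.2 s at 100 Hz)
VARIANCE_THRESHOLD = 0.05  # variance of acceleration magnitude (g^2) that signals motion

def magnitude(sample):
    """Euclidean norm of a 3-axis accelerometer sample (ax, ay, az) in g."""
    ax, ay, az = sample
    return math.sqrt(ax * ax + ay * ay + az * az)

def detect_pickup(samples):
    """Yield the sample index at which a pick-up-like motion burst starts.

    A pick-up is (very roughly) approximated as the first moment the variance
    of the acceleration magnitude in a short sliding window exceeds the
    threshold after a period of rest.
    """
    window = deque(maxlen=WINDOW)
    at_rest = True
    for i, sample in enumerate(samples):
        window.append(magnitude(sample))
        if len(window) < WINDOW:
            continue
        mean = sum(window) / WINDOW
        var = sum((m - mean) ** 2 for m in window) / WINDOW
        if at_rest and var > VARIANCE_THRESHOLD:
            at_rest = False
            yield i
        elif var < VARIANCE_THRESHOLD / 4:
            at_rest = True  # device has settled again; re-arm the detector

# Usage with synthetic data: rest, then a short burst of motion, then rest again.
if __name__ == "__main__":
    rest = [(0.0, 0.0, 1.0)] * 50
    burst = [(0.8, -0.5, 1.4), (1.2, 0.3, 0.2), (-0.6, 0.9, 1.8)] * 10
    events = list(detect_pickup(rest + burst + rest))
    print("pick-up detected at sample indices:", events)
```

A real system would also need to distinguish picking up the device from other motion bursts (e.g. walking), which is where intention prediction rather than a plain threshold comes in.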
This project, which is my PhD topic, develops guidelines for a future device type: a tablet that allows ergonomic front- and back-of-device interaction. These guidelines are derived from empirical studies and developed to fit the users’ skills to the way the novel device type is held. Three particular research areas that are relevant to […]
Tangible User Interfaces (TUIs) represent digital information via a number of sensory modalities, including the haptic, visual, and auditory senses. We suggest that interaction with tangible interfaces is commonly governed primarily by visual cues, despite the emphasis on tangible representation. We do not doubt that visual feedback offers rich interaction guidance, but argue that […]
Although Data Gloves allow for the modeling of the human hand, they can lead to a reduction in usability, as they cover the entire hand, limit the sense of touch, and reduce hand flexibility. As modeling the whole hand has many advantages (e.g. for complex gesture detection), we aim for modeling […]
We present a wearable interface that consists of motion sensors. As the interface can be worn on the user’s fingers (as a ring) or fixed to them (with nail polish), the device controlled by finger gestures can be any generic object, provided it has an interface for receiving the sensor’s signal. We implemented four […]
This demonstration was presented at the World Haptics Conference 2011. We showed a finger-worn accelerometer interface for modifying stroke attributes while drawing on a touch surface. We support input through finger touch and pen. For example, a scroll movement with the index finger (or pen) changes the stroke width, and shaking the hand […]
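As a toy sketch of the scroll-to-stroke-width mapping only (the demo's real signal processing is not reproduced here); the gain, clamping range, and class name below are assumptions made for illustration:

```python
# Hypothetical mapping of a detected scroll gesture to the current stroke width.
# Gain and clamping range are illustrative, not values from the demo.

MIN_WIDTH = 1.0    # thinnest stroke, in pixels
MAX_WIDTH = 40.0   # thickest stroke, in pixels
SCROLL_GAIN = 0.5  # width change in pixels per unit of scroll displacement

class StrokeState:
    """Holds the stroke attributes that finger gestures manipulate."""

    def __init__(self, width: float = 4.0) -> None:
        self.width = width

    def apply_scroll(self, scroll_delta: float) -> None:
        """A scroll movement with the index finger (or pen) changes the stroke width."""
        self.width = min(MAX_WIDTH, max(MIN_WIDTH, self.width + SCROLL_GAIN * scroll_delta))


# Usage: positive scroll deltas thicken the stroke, negative ones thin it.
state = StrokeState()
for delta in (10.0, 10.0, -5.0):
    state.apply_scroll(delta)
    print(f"stroke width is now {state.width:.1f} px")
```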
Although gestural phone interaction (like pinching on a touch screen to zoom content) is implemented in almost every mobile device, there are still no design guidelines for gestural control. Such guidelines should be designed with respect to ergonomics and hand anatomy. There are many human-side aspects to take care of when designing gestures. […]
This paper focuses on combining front- and back-of-device interaction on grasped devices, using touch-based gestures. We designed generic interactions for discrete, continuous, and combined gesture commands that are executed without hand-eye control, because the performing fingers are hidden behind the grasped device. We designed the interactions in such a way that the thumb […]
This paper explores how microgestures can allow us to execute a secondary task, for example controlling mobile applications, without interrupting a manual primary task, for instance driving a car. In order to design microgestures iteratively, we interviewed sports therapists and physiotherapists while asking them to use task-related props, such as a steering wheel, […]
Nowadays, mobile devices provide new possibilities for gesture interaction due to the wide range of embedded sensors they carry and their physical form factor. In addition, auditory interfaces can now be supported more easily through advanced mobile computing capabilities. Although different types of gesture techniques have been proposed for handheld devices, there is still […]
Graphical user interfaces for mobile devices have several drawbacks in mobile situations. In this paper, we present Foogue, an eyes-free interface that utilizes spatial audio and gesture input. Foogue does not require visual attention and hence does not divert the user’s eyes from the task at hand. Foogue has two modes, which are designed to fit […]