Oliver Beren Kaul

Appelstr. 9A
30167 Hannover
Germany
Room 909

Resume

Oliver Beren Kaul is a researcher in the Human-Computer Interaction Group at the University of Hannover. His main area of interest is augmented reality (AR), combining AR applications with new interaction and feedback mechanisms, especially tactile feedback for immersive effects and for guiding visually impaired individuals. Side projects include using neural networks for audio classification and enhancing virtual reality experiences through tactile effects.

During his studies he focused on software engineering, computer vision, and human-computer interaction for mobile devices. For more than three years he worked as a student research assistant on projects in the areas of multi-agent robotic systems, mobile systems for controlling robots, and superpixel algorithms running on GPUs. After finishing his studies, he worked as a software engineer for iOS and Android for three months before joining the Human-Computer Interaction Group as a doctoral student in August 2015.

His personal website at www.kaul.me contains a CV and more information on his current endeavours.

Projects

HapticHead - Around-the-Head Tactile Display


Publications

Full Papers

Design and Evaluation of On-the-Head Spatial Tactile Patterns. Oliver Beren Kaul, Michael Rohs, Marc Mogalle. 19th International Conference on Mobile and Ubiquitous Multimedia (MUM '20).
        
We propose around-the-head spatial vibrotactile patterns for representing different kinds of notifications. The patterns are defined in terms of stimulus location, intensity profile, rhythm, and roughness modulation. A first study evaluates recall and distinguishability of 30 patterns, as well as agreement on meaning without a predetermined context: Agreement is low, yet the recognition rate is surprisingly high. We identify which kinds of patterns users recognize well and which ones they prefer. Static stimulus location patterns have a higher recognition rate than dynamic patterns, which move across the head as they play. Participants preferred dynamic patterns for comfort. A second study shows that participants are able to distinguish substantially more around-the-head spatial patterns than smartphone-based patterns. Spatial location has the highest positive impact on accuracy among the examined features, so this parameter allows for a large number of levels.
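
The sketch below illustrates how a spatial tactile pattern, parameterized as in the abstract above (stimulus location, intensity profile, rhythm, and roughness modulation), could be represented in software. It is a minimal Python illustration under assumed names and a keyframe structure of my choosing, not the implementation used in the paper.

    # Minimal sketch of a spatial tactile pattern in terms of the four
    # parameters named above: stimulus location, intensity profile,
    # rhythm, and roughness modulation. All names are illustrative.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TactileKeyframe:
        actuator_id: int     # stimulus location: which actuator fires
        intensity: float     # intensity profile: normalized amplitude 0..1
        duration_ms: int     # how long this keyframe plays
        pause_ms: int        # gap before the next keyframe (rhythm)
        roughness_hz: float  # amplitude-modulation frequency; 0 = smooth

    @dataclass
    class TactilePattern:
        name: str
        keyframes: List[TactileKeyframe] = field(default_factory=list)

        def is_dynamic(self) -> bool:
            # A dynamic pattern moves across the head, i.e., it uses
            # more than one stimulus location over its course.
            return len({k.actuator_id for k in self.keyframes}) > 1

    # Example: a static double-pulse notification at a single location.
    ping = TactilePattern("ping", [
        TactileKeyframe(3, intensity=0.8, duration_ms=120, pause_ms=80,
                        roughness_hz=0.0),
        TactileKeyframe(3, intensity=0.8, duration_ms=120, pause_ms=0,
                        roughness_hz=0.0),
    ])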
Vibrotactile Funneling Illusion and Localization Performance on the Head. Oliver Beren Kaul, Michael Rohs, Benjamin Simon, Kerem Can Demir, Kamillo Ferry. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI '20).
        
The vibrotactile funneling illusion is the sensation of a single (non-existing) stimulus somewhere in-between the actual stimulus locations. Its occurrence depends upon body location, distance between the actuators, signal synchronization, and intensity. Related work has shown that the funneling illusion may occur on the forehead. We were able to reproduce these findings and explored five further regions to get a more complete picture of the occurrence of the funneling illusion on the head. The results of our study (24 participants) show that the actuator distance, for which the funneling illusion occurs, strongly depends upon the head region. Moreover, we evaluated the centralizing bias (smaller perceived than actual actuator distances) for different head regions, which also showed widely varying characteristics. We computed a detailed heat map of vibrotactile localization accuracies on the head. The results inform the design of future tactile head-mounted displays that aim to support the funneling illusion.
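
To make the setup concrete: a common way to render a phantom stimulus between two actuators in the phantom-sensation literature is energy-based amplitude panning. The Python sketch below shows that model for illustration only; the paper measures where and at which actuator distances the illusion occurs on the head, and does not prescribe this formula.

    import math

    def funneling_amplitudes(beta, a_virtual):
        """Energy-based amplitude panning for two synchronized actuators.

        beta: desired phantom position between actuator 1 (beta = 0) and
        actuator 2 (beta = 1); a_virtual: intended perceived intensity.
        Returns the drive amplitudes (a1, a2).
        """
        a1 = math.sqrt(1.0 - beta) * a_virtual
        a2 = math.sqrt(beta) * a_virtual
        return a1, a2

    # A phantom stimulus one third of the way from actuator 1 to actuator 2:
    a1, a2 = funneling_amplitudes(beta=1.0 / 3.0, a_virtual=1.0)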
HapticHead: A Spherical Vibrotactile Grid around the Head for 3D Guidance in Virtual and Augmented Reality. Oliver Beren Kaul, Michael Rohs. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17).
        
Current virtual and augmented reality head-mounted displays usually include no or only a single vibration motor for haptic feedback and do not use it for guidance. We present HapticHead, a system utilizing multiple vibrotactile actuators distributed in three concentric ellipses around the head for intuitive haptic guidance through moving tactile cues. We conducted three experiments, which indicate that HapticHead vibrotactile feedback is both faster (2.6 s vs. 6.9 s) and more precise (96.4% vs. 54.2% success rate) than spatial audio (generic head-related transfer function) for finding visible virtual objects in 3D space around the user. The baseline of visual feedback is, as expected, more precise (99.7% success rate) and faster (1.3 s) in comparison, but there are many applications in which visual feedback is not desirable or available due to lighting conditions, visual overload, or visual impairments. Mean final precision with HapticHead feedback on invisible targets is 2.3° compared to 0.8° with visual feedback. We successfully navigated blindfolded users to real household items at different heights using HapticHead vibrotactile feedback independently of a head-mounted display.
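
As a rough illustration of directional guidance with an actuator grid, the Python sketch below maps a 3D target direction to per-actuator intensities via the cosine similarity between each actuator's direction and the target direction. This simplified static mapping is an assumption for illustration; the actual HapticHead system guides users through moving tactile cues.

    import numpy as np

    def guidance_intensities(actuator_dirs, target_dir, falloff=4.0):
        """Map a 3D target direction to per-actuator vibration intensities.

        actuator_dirs: (n, 3) unit vectors from the head center to each
        actuator; target_dir: unit vector toward the target. Actuators
        aligned with the target vibrate strongest; a larger 'falloff'
        sharpens the spatial focus.
        """
        alignment = actuator_dirs @ target_dir  # cosine similarities
        return np.clip(alignment, 0.0, 1.0) ** falloff

    # Example: four actuators (front, back, left, right) and a target
    # up and to the front of the user.
    dirs = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0], [0, -1, 0]], float)
    target = np.array([1.0, 0.0, 1.0])
    target /= np.linalg.norm(target)
    print(guidance_intensities(dirs, target))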

Short Papers

Increasing Presence in Virtual Reality with a Vibrotactile Grid Around the Head. Oliver Beren Kaul, Kevin Meier, Michael Rohs. Human-Computer Interaction - INTERACT 2017: 16th IFIP TC 13 International Conference, Mumbai, India, September 25-29, 2017, Proceedings, Part IV.
        
A high level of presence is an important aspect of immersive virtual reality applications. However, presence is difficult to achieve, as it depends on the individual user, the immersion capabilities of the system (visual, auditory, and tactile), and the concrete application. We use a vibrotactile grid around the head to further increase the level of presence users feel in virtual reality scenes. In a between-groups comparison study, the vibrotactile group scored significantly higher on a standardized presence questionnaire than the baseline group without tactile feedback. This suggests the proposed prototype as an additional tool for increasing presence in virtual reality scenes.

Posters

3DTactileDraw: A Tactile Pattern Design Interface for Complex Arrangements of Actuators. Oliver Beren Kaul, Leonard Hansing, Michael Rohs. Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems.
        
Creating tactile patterns for a grid or a 3D arrangement of a large number of actuators is challenging because the design space is huge. This paper explores two different approaches to an easy-to-use interface for tactile pattern design on a large number of actuators around the head. Two user studies were conducted to iteratively improve the prototype to fit user needs.
HapticHead: 3D Guidance and Target Acquisition through a Vibrotactile Grid. Oliver Beren Kaul, Michael Rohs. Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems.
        
Current-generation virtual reality (VR) and augmented reality (AR) head-mounted displays (HMDs) usually include no or only a single vibration motor for haptic feedback and do not use it for guidance. We present HapticHead, a system utilizing 20 vibration motors distributed in three concentric ellipses around the head to give intuitive haptic guidance hints and to increase immersion for VR and AR applications. Our user study indicates that HapticHead is both faster (mean = 3.7 s, SD = 2.3 s vs. mean = 7.8 s, SD = 5.0 s) and more precise (92.7% vs. 44.9% hit rate) than auditory feedback for finding virtual objects in 3D space around the user. The baseline of visual feedback is, as expected, more precise (99.9% hit rate) and faster (mean = 1.5 s, SD = 0.6 s) in comparison, but there are many applications in which visual feedback is not desirable or available due to lighting conditions, visual overload, or visual impairments.
Follow the Force: Steering the Index Finger towards Targets using EMS. Oliver Beren Kaul, Max Pfeiffer, Michael Rohs. Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems.
        
In mobile contexts, guidance towards objects is usually provided through the visual channel. Sometimes this channel is overloaded or not appropriate, yet providing a practicable form of haptic feedback is challenging. Electrical muscle stimulation (EMS) can generate mobile force feedback but has a number of drawbacks: for complex movements, several muscles need to be actuated in concert, and a feedback loop is necessary to control the movements. We present an approach that only requires the actuation of six muscles with four pairs of electrodes to guide the index finger to a 2D point and to let the user perform mid-air disambiguation gestures. In our user study, participants found invisible, static target positions on top of a physical box with a mean 2D deviation of 1.44 cm from the intended target.
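
A minimal Python sketch of such a feedback loop: track the finger, compute the 2D error to the target, and stimulate the muscle that pulls in the error's direction. The proportional mapping and the four abstract muscle channels are illustrative assumptions, not the controller evaluated in the paper.

    import numpy as np

    def ems_step(finger_pos, target_pos, gain=0.5, max_intensity=1.0):
        """One iteration of a closed-loop EMS controller (illustrative).

        finger_pos, target_pos: 2D positions, e.g., from camera tracking.
        Returns stimulation intensities for four hypothetical muscle
        channels, one per pulling direction.
        """
        error = np.asarray(target_pos, float) - np.asarray(finger_pos, float)
        # For each axis, stimulate the muscle of the antagonistic pair
        # that pulls the finger in the direction of the remaining error.
        return {
            "+x": float(np.clip(gain * max(error[0], 0.0), 0.0, max_intensity)),
            "-x": float(np.clip(gain * max(-error[0], 0.0), 0.0, max_intensity)),
            "+y": float(np.clip(gain * max(error[1], 0.0), 0.0, max_intensity)),
            "-y": float(np.clip(gain * max(-error[1], 0.0), 0.0, max_intensity)),
        }

    # Each loop iteration: read the tracked finger position, update the
    # four channels, and repeat until the error falls below a threshold.
    intensities = ems_step(finger_pos=(0.0, 0.0), target_pos=(3.0, -1.5))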

Workshop Papers

Concept for Navigating the Visually Impaired using a Tactile Interface around the Head. Oliver Beren Kaul, Michael Rohs. Hacking Blind Navigation Workshop at CHI '19.
     
Requirements of Navigation Support Systems for People with Visual Impairments. Oliver Beren Kaul, Michael Rohs. Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers.
        
Wearable Head-mounted 3D Tactile Display Application Scenarios. Oliver Beren Kaul, Michael Rohs. Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct.