Interaction Lab at MobileHCI ’18 in Barcelona, Spain

Overview of our work that will be presented at the MobileHCI ’18 conference in Barcelona, Spain.


Tutorial: Machine Learning for Intelligent Mobile User Interfaces using Keras

Monday, September 3rd 09:00-17:30, Room 13.101 – Ramón Turró
High-level APIs such as Keras facilitate the development of deep learning models through a simple interface and enable users to train neural networks in a few lines of code. Because Keras builds on top of TensorFlow, trained models can be exported and run efficiently on mobile devices, which opens up a wide range of opportunities for researchers and developers. In this tutorial, we teach attendees three basic steps to run neural networks on a mobile phone: First, we show how to develop neural network architectures and train them with Keras on the TensorFlow backend. Second, we show how to run the trained models on a mobile phone. Finally, we demonstrate how to perform human activity recognition using existing mobile device sensor datasets. A minimal code sketch of these steps is shown below.
Huy Viet Le, Sven Mayer, Abdallah El Ali, Niels Henze
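
The following is a minimal sketch of the three steps, assuming the current tf.keras and TensorFlow Lite APIs; the tutorial itself may use different tooling (for example, the standalone Keras package and TensorFlow Mobile, which were current in 2018). The random arrays here merely stand in for a real human activity recognition dataset.

```python
import numpy as np
import tensorflow as tf

# Step 1: define and train a small network on windows of accelerometer
# data (128 samples x 3 axes), a typical shape for activity recognition.
# Random placeholder data; a real sensor dataset would be loaded instead.
x_train = np.random.randn(1000, 128, 3).astype(np.float32)
y_train = np.random.randint(0, 6, size=1000)  # e.g. 6 activity classes

model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(32, 5, activation="relu", input_shape=(128, 3)),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(6, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=32)

# Step 2: export the trained model so it can run efficiently on a phone,
# here via the TensorFlow Lite converter.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("har_model.tflite", "wb") as f:
    f.write(converter.convert())

# Step 3: on the device, the .tflite file is loaded by a TensorFlow Lite
# interpreter and fed live sensor windows of the same (128, 3) shape.
```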


Paper: Snooze! Investigating the User-defined Deferral of Mobile Notifications

Tuesday, September 4th 11:00-12:30, Room 30.S01 Auditorium (Paper Session 1: Notifications and Attention)
Notifications on mobile devices are a prominent source of interruptions. Previous work suggests using opportune moments to deliver notifications to reduce negative effects. In this paper, we instead explore the manual deferral of notifications. We developed an Android app that allows users to “snooze” mobile notifications for a user-defined amount of time or to a user-defined point in time. Using this app, we conducted a year-long in-the-wild study with 295 active users. To complement these findings, we recruited 16 further participants who used the app for one week and subsequently interviewed them. In both studies, snoozing was mainly used to defer notifications related to people and events. The reasons for deferral were manifold, ranging from not being able to attend to notifications immediately to not wanting to. Daily routines played an important role in the deferral of notifications. Most notifications were deferred to the same day or the next morning, and deferrals of more than two days were the exception. Based on our findings, we derive design implications that can inform the design of future smart notification systems.

Dominik Weber, Alexandra Voit, Jonas Auda, Stefan Schneegass, Niels Henze: Snooze! Investigating the User-defined Deferral of Mobile Notifications. In: Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services, pp. 2:1–2:13, ACM, Barcelona, Spain, 2018, ISBN: 978-1-4503-5898-9.


Paper: Designing Finger Orientation Input for Mobile Touchscreens

Wednesday, September 5th 11:00-12:30, Room 30.S01 Auditorium (Paper Session 8: Touch, Gestures and Strokes)
A large number of today’s systems use interactive touch surfaces as the main input channel. Current devices reduce the richness of touch input to two-dimensional positions on the screen. A growing body of work develops methods that enrich touch input to provide additional degrees of freedom for touch interaction. In particular, previous work proposed using the finger’s orientation as additional input. To efficiently implement new input techniques that make use of these new input dimensions, we need to understand the limitations of the input. Therefore, we conducted a study to derive the ergonomic constraints for using finger orientation as additional input in a two-handed smartphone scenario. We show that for both hands, the comfort and non-comfort zones depend on how the user interacts with a touch surface. For two-handed smartphone scenarios, the range is 33.3% larger than for tabletop scenarios. We further show that the phone orientation correlates with the finger orientation. Finger orientations that are harder to perform result in phone orientations where the screen does not directly face the user.

Sven Mayer, Huy Viet Le, Niels Henze: Designing Finger Orientation Input for Mobile Touchscreens. In: Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services, ACM, Barcelona, Spain, 2018, ISBN: 978-1-4503-5898-9.


Poster: Virtual Reality on the Go? A Study on Social Acceptance of VR Glasses

Tuesday, September 4th 18:00-20:30, Room 40.SC01, Patio – Roger de Llúria Building
Virtual reality (VR) glasses enable users to be present in one environment while their physical body is located in another place. Recent mobile VR glasses enable users to be present in any environment they want, at any time and in any physical place. Still, mobile VR glasses are rarely used. One explanation is that it is not considered socially acceptable to immerse oneself in another environment in certain situations. We conducted an online experiment that investigates the social acceptance of VR glasses in six different contexts. Our results confirm that social acceptability depends on the situation. In bed, in the metro, or on a train, mobile VR glasses seem to be acceptable. However, in situations where social interaction with surrounding people is expected, such as in a living room or a public cafe, the acceptance of mobile VR glasses is significantly reduced. Whether one or two persons wear the glasses seems to have a negligible effect. We conclude that the social acceptability of VR glasses depends on the situation and is lower when the user is expected to interact with surrounding people.

Valentin Schwind, Jens Reinhardt, Rufat Rzayev, Niels Henze, Katrin Wolf: Virtual Reality on the Go?: A Study on Social Acceptance of VR Glasses. In: Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct, pp. 111–118, ACM, Barcelona, Spain, 2018, ISBN: 978-1-4503-5941-2.


Demo: Demonstrating Palm Touch: The Palm As an Additional Input Modality on Commodity Smartphones

Tuesday, September 4th 18:00-20:30, Room 40.SC01, Patio – Roger de Llúria Building
Touchscreens are the most successful input method for smartphones. Despite their flexibility, touch input is limited to the location of taps and gestures. We present Palm Touch, an additional input modality that differentiates between touches of fingers and the palm. Touching the display with the palm can be a natural gesture, since moving the thumb towards the device’s top edge implicitly places the palm on the touchscreen. We developed a model that differentiates between finger and palm touches with an accuracy of 99.53% in realistic scenarios. In this demonstration, we exhibit different use cases for Palm Touch, including its use as a shortcut and for improving reachability. In a previous evaluation, we showed that participants perceive the input modality as intuitive and natural to perform. Moreover, they appreciate Palm Touch as an easy and fast solution to the reachability issue during one-handed smartphone interaction, compared to thumb stretching or grip changes.
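
As a purely illustrative sketch of such a finger-versus-palm model, the snippet below defines a small binary classifier over low-resolution capacitive images in Keras. The input size (27×15 values, one channel) and the architecture are assumptions made for illustration; they do not reproduce the authors' published model.

```python
import tensorflow as tf

# Hypothetical classifier over low-resolution capacitive touch images
# (27x15 values, one channel); shape and layers are illustrative only.
palm_classifier = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu",
                           input_shape=(27, 15, 1)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # 1 = palm, 0 = finger
])
palm_classifier.compile(optimizer="adam",
                        loss="binary_crossentropy",
                        metrics=["accuracy"])

# At runtime, a single capacitive frame would be classified with:
# is_palm = palm_classifier.predict(frame[None, :, :, None]) > 0.5
```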

Huy Viet Le, Thomas Kosch, Sven Mayer, Niels Henze: Demonstrating Palm Touch: The Palm As an Additional Input Modality on Commodity Smartphones. In: Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct, pp. 353–358, ACM, Barcelona, Spain, 2018, ISBN: 978-1-4503-5941-2.