Research Projects

Aesthetics of Performative Interaction for Pervasive Computing Environments in Public Spaces

Principal Investigators

Prof. Dr. Sarah Diefenbach, LMU Munich
Prof. Dr. Marc Hassenzahl, Universität Siegen

A central challenge in the design of pervasive computing environments (PCEs) is to identify general characteristics that can be transferred across devices and domains while, at the same time, acknowledging contextual demands in order to provide a fulfilling user experience (UX). Our research project addresses this with a focus on the emerging “aesthetics of interaction”, i.e., interaction that “feels” good because it fits the current context and relevant psychological needs. We apply this approach in the reference scenario of public smart spaces, where sensible design with regard to experiential aspects and psychological needs seems especially relevant. While users perform and experience their own interaction, they may also think about how others perceive this interaction and the impression it makes on others. Social acceptability thus seems crucial for the experience and use of public interactive systems. In a set of systematic studies, we will identify psychological needs and specific requirements for positive experience and interaction in public contexts. Based on this, we will design, prototype, and evaluate interaction concepts in concrete application scenarios (e.g., smart restaurant, airport, citizens’ registration office).


Beyond safety and efficiency in acute care: The experience of an embodied staff-environment interaction

Principal Investigators

Dr. Tobias Grundgeiger, Lehrstuhl für Psychologische Ergonomie, Julius-Maximilians-Universität Würzburg
Dr.-Ing. Florian Niebling, Lehrstuhl für Mensch-Computer Interaktion, Julius-Maximilians-Universität Würzburg

Technical developments, evaluations, and research in safety-critical domains have focused on safe and efficient interaction with technology. Our preliminary work indicated that interaction concepts based on modern HCI theories such as embodied cognition are better suited to explaining human-technology interaction in such domains. Furthermore, despite constituting a ubiquitous environment, medical devices are not interconnected, or are interconnected only for documentation purposes rather than for human-centered pervasive interaction between staff and technology. With a specific focus on user experience, we aim to design and evaluate pervasive interaction in acute care, contributing to the understanding and balancing of efficiency and meaningfulness in staff-environment interaction while maintaining safety.


Designing and Evaluating Scalable Behavioral Biometrics Systems for Pervasive Computing Environments

Principal Investigators

Prof. Dr. Florian Alt, Bundeswehr University Munich
Prof. Dr. Stefan Schneegass, University of Duisburg-Essen, Essen

The main challenge this project addresses is how behavioral biometric approaches scale to different pervasive computing environments containing multiple users with changing behavior, different physicalities, and changing sensing and interaction capabilities.
While behavioral biometrics has so far been investigated mainly in the lab and for single users, we envision this project enabling a significant leap forward towards behavioral biometrics becoming a powerful means of identifying and authenticating users in future pervasive computing environments, combining high usability with strong security.
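
To make the single-user baseline concrete, the following minimal sketch shows template-based verification, the simplest form of behavioral biometric matching: each behavioral sample is reduced to a fixed-length feature vector and compared against an enrolled per-user template. The feature names, values, and threshold are hypothetical; the scaling questions raised above (multiple users, changing behavior, changing sensors) are exactly what such a naive scheme does not handle.

```python
# Minimal sketch of template-based behavioral biometric verification.
# All feature names, values, and the threshold are illustrative assumptions,
# not the project's actual method. In practice, features would also be
# normalized so that no single dimension dominates the distance.
from statistics import mean
from math import dist  # Euclidean distance (Python >= 3.8)

def enroll(samples: list[list[float]]) -> list[float]:
    """Build a user template as the per-dimension mean of enrollment samples."""
    return [mean(col) for col in zip(*samples)]

def verify(template: list[float], probe: list[float], threshold: float) -> bool:
    """Accept the probe if it lies close enough to the enrolled template."""
    return dist(template, probe) <= threshold

# Hypothetical enrollment data: 3 samples, 4 features each
# (e.g., step frequency, stride variance, swipe speed, dwell time).
alice = enroll([[1.9, 0.12, 310.0, 0.21],
                [2.0, 0.10, 305.0, 0.19],
                [2.1, 0.11, 298.0, 0.22]])
print(verify(alice, [2.0, 0.11, 303.0, 0.20], threshold=10.0))  # likely True
print(verify(alice, [1.2, 0.40, 150.0, 0.90], threshold=10.0))  # likely False
```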


Gaze-Assisted Scalable Interaction in Pervasive Classrooms

Principal Investigators

Prof. Dr. Anke Huckauf, Ulm University, Germany
Prof. Dr. Enrico Rukzio, Ulm University, Germany

The project is concerned with scalable multi-device and scalable multi-user interaction paradigms in pervasive computing environments. We focus on the design of scalable interaction paradigms and develop methods for assessing the effectiveness, efficiency, and user satisfaction of interactive systems in pervasive environments. These fields of research are addressed from two different perspectives: From a psychological perspective, we examine how specific indicators derived from eye measures allow us to assess users’ mental states during interaction with several devices and users. From a computer science perspective, we research how to design effective, efficient, and satisfying scalable gaze-based interaction techniques.
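
As a rough illustration of the psychological side, the sketch below derives two classic candidate indicators from a raw gaze stream: mean fixation duration (via a strongly simplified dispersion-based detection) and mean pupil diameter, both commonly discussed as correlates of processing load. The thresholds and sampling rate are illustrative assumptions, not the project’s actual method.

```python
# Sketch of deriving candidate mental-state indicators from raw gaze data.
# Dispersion threshold, sampling rate, and the reading of pupil diameter as
# a workload proxy are illustrative assumptions.
from dataclasses import dataclass
from statistics import mean

@dataclass
class GazeSample:
    x: float          # gaze position (px)
    y: float
    pupil_mm: float   # pupil diameter (mm)

def fixation_durations(samples, hz=60, dispersion_px=30, min_samples=6):
    """Strongly simplified dispersion-based (I-DT style) detection: a run of
    samples whose bounding box stays small counts as one fixation."""
    durations, window = [], []
    for s in samples:
        window.append(s)
        xs, ys = [p.x for p in window], [p.y for p in window]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > dispersion_px:
            if len(window) - 1 >= min_samples:
                durations.append((len(window) - 1) / hz)
            window = [s]
    if len(window) >= min_samples:
        durations.append(len(window) / hz)
    return durations

def indicators(samples):
    """Two classic candidates: fixation duration and pupil diameter are often
    discussed as markers of processing load (direction is context-dependent)."""
    return {
        "mean_fixation_s": mean(fixation_durations(samples) or [0.0]),
        "mean_pupil_mm": mean(s.pupil_mm for s in samples),
    }

# Simulated half-second of stable gaze at 60 Hz with slowly dilating pupil:
stream = [GazeSample(100 + i % 3, 200, 3.1 + 0.01 * i) for i in range(30)]
print(indicators(stream))
```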


Gestural interaction paradigms for smart spaces (GrIPSs)

Principal Investigators

Prof. Dr. Susanne Boll, University of Oldenburg
Prof. Dr.-Ing. Antonio Krüger, DFKI & Saarland University (UdS)

The GrIPSs project is motivated by our conviction that, at this very moment, a new interaction paradigm shift appears necessary to overcome the limitations of 2D interactive surfaces and the associated touch interaction paradigm. With the advent of intelligent environments (e.g., smart homes) and wearable computing technologies, 2D gestures will slowly disappear and be replaced by more natural interaction modalities, such as voice or spatial 3D gestures, which take advantage of the whole body for interacting with pervasive computing environments. In this project, we aim to understand and support gestural interaction, particularly in smart environments. Here, we will look at single gestures and gesture sequences carried out not only with one hand, but also bimanually and with the support of the whole body.


Illusionary Surface Interfaces

Principal Investigators

Prof. Dr.-Ing. Katrin Wolf, Beuth-Hochschule Berlin
Prof. Dr. Albrecht Schmidt, LMU Munich

Traditional computers typically refer (through visual affordances) to both the perceived and actual properties of the interface, suggesting fundamental functionalities and also determining and communicating how humans might use the system. Such rich information visualization may, however, not suit the way we want pervasive computers and computational everyday environments to look and feel. We aim to create novel interactive experiences that exploit multisensory illusions in order to extend the range of interface properties that can be displayed, using only everyday object surfaces as interfaces. In a manner similar to the “rubber hand illusion”, in which people can be induced to perceive a physical touch based purely on what they see, we will support visual and haptic feedback induced by augmented vision and sound. Instead of changing the objects’ physicality, we will visually and auditorily augment them using “smart glasses” and projectors, while at the same time augmenting them haptically by inducing multisensory illusions. Technically, this includes sensing user interaction using machine learning tools and presenting information multimodally.
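
The following sketch illustrates the sensing/feedback loop this implies, under the assumption of a stub ML touch detector and stub projector/audio outputs (all names hypothetical): a detected touch on a passive surface is answered with tightly time-aligned visual and auditory cues, since the multisensory illusion only binds to the touch if feedback latency stays low.

```python
# Conceptual sketch of the sensing/feedback loop: a detected touch on a
# passive surface triggers time-aligned visual and audio cues. Detector,
# projector, and audio calls are stubs standing in for real hardware and
# ML components; every name here is hypothetical.
import time

def detect_touch(frame):
    """Stand-in for an ML-based touch detector (e.g., trained on depth or
    RGB frames); returns a touch position or None."""
    return frame.get("touch")  # hypothetical: upstream model fills this in

def project_ripple(pos): print(f"projector: ripple at {pos}")     # stub
def play_click(volume): print(f"audio: click, volume={volume}")   # stub

def feedback_loop(frames, latency_budget_s=0.02):
    """Visual and auditory cues must land within a tight window of the
    physical contact for the multisensory illusion to bind to the touch."""
    for frame in frames:
        t0 = time.monotonic()
        pos = detect_touch(frame)
        if pos is not None:
            project_ripple(pos)   # visual cue via projector / smart glasses
            play_click(0.8)       # auditory cue reinforcing pseudo-haptics
            if time.monotonic() - t0 > latency_budget_s:
                print("warning: feedback too late to fuse with the touch")

feedback_loop([{"touch": None}, {"touch": (120.0, 45.0)}])
```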


PerforM

Principal Investigators

Prof. Dr. Sarah Diefenbach, LMU Munich
Prof. Dr.-Ing. Andreas Butz, LMU Munich

The “Smart Home” concept promises an intelligent, helpful environment in which technology makes life easier, simpler, or safer for its inhabitants. On a technical level, this is currently achieved by many networked devices interacting with each other, working on shared protocols and standards. From the perspective of user experience (UX), however, configuration of and interaction with such a collection of devices has become so complex that it currently stands in the way of widespread adoption and use. Thus, instead of many individual but interacting intelligent devices, the project “PerforM – Personalities for Machinery in Personal Pervasive Smart Spaces” proposes an overarching interaction concept for the environment as a whole, addressing the mental model of a central, omnipresent “room intelligence”. This room intelligence will control existing UI-less smart home devices, but will also be able to deal with “legacy”, i.e., non-smart, machinery or generally any physical object by using a robotic manipulator (for example, a mobile robotic arm). Besides exploring an innovative way to address the current challenges of pervasive computing environments (PCEs), our research programme also addresses fundamental questions and gaps in previous research about how different design cues are integrated into an overall perception of “system intelligence”, “entity”, and “personality”.
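
A minimal sketch of this dispatching idea, with a purely illustrative API (device names, the intent format, and the queue-based robot interface are all assumptions for the sketch): the room intelligence routes an intent either to a networked smart device or, for legacy objects, into a task queue for the robotic manipulator.

```python
# Minimal sketch of a central "room intelligence" dispatcher: one controller
# receives user intents and routes them either to a networked smart device
# or, for legacy objects, to a task queue for a mobile robotic arm.
from dataclasses import dataclass, field
from queue import Queue

@dataclass
class RoomIntelligence:
    smart_devices: dict = field(default_factory=dict)   # name -> command callable
    robot_tasks: Queue = field(default_factory=Queue)   # jobs for the mobile arm

    def handle(self, target: str, action: str):
        if target in self.smart_devices:
            self.smart_devices[target](action)          # e.g., network API call
        else:
            # Legacy, non-smart object: delegate to the robotic manipulator.
            self.robot_tasks.put((target, action))
            print(f"queued robot task: {action} {target}")

room = RoomIntelligence(smart_devices={
    "ceiling_light": lambda a: print(f"light -> {a}"),
})
room.handle("ceiling_light", "dim to 30%")   # handled via smart-home protocol
room.handle("window_blind", "close")         # legacy object -> robot arm queue
```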


PervaSafe Computing: Pattern-Based Wearable Assistants for Safety-Critical Human-Computer Interaction in Control Rooms

Principal Investigators

Prof. Dr. Kristof Van Laerhoven, University of Siegen
Prof. Dr. Tilo Mentler, Hochschule Trier

The main objectives of this project are to derive a pattern language for scalable interaction design in control rooms and to design a wearable framework for control room operators, determining how wearable technologies can be used both to implement the aforementioned design patterns and to evaluate their usage in situ and unsupervised. As part of these studies, control room operators’ cognitive load and affective state are modelled on a user-worn computer and used to influence the information flow to the operator. This attention model is used to present alarms and other control room events appropriately. The wearable framework also assists operators in keeping logs of the processes and tasks at hand, using wearable sensors that detect specific manual actions. The resulting design patterns and their realization with the aid of wearable assistants will be validated and evaluated, with respect to usability and user experience, in highly realistic but reproducible settings with actual control room operators. User experience research is guided by the question: Do control room operators perceive a wearable assistant based on design patterns as patronizing (with respect to autonomy or expertise) or as supportive (with respect to safety)?
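
As a simplified illustration of such attention-based gating (the thresholds and modality choices are assumptions for the sketch, not the project’s validated design): critical alarms always interrupt, while lower-priority events are deferred when the modelled cognitive load is high.

```python
# Sketch of attention-model gating: an estimate of the operator's current
# cognitive load (e.g., from wearable physiological sensors) decides how and
# when control-room events are presented. Thresholds and modalities are
# illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Alarm:
    message: str
    priority: int  # 1 = critical ... 3 = informational

def present(alarm: Alarm, cognitive_load: float, deferred: list):
    """Critical alarms always interrupt; lower-priority events are deferred
    or presented unobtrusively while the operator is highly loaded."""
    if alarm.priority == 1:
        print(f"HAPTIC+VISUAL interrupt: {alarm.message}")
    elif cognitive_load < 0.7:
        print(f"glanceable notification: {alarm.message}")
    else:
        deferred.append(alarm)  # revisit once load drops

pending = []
present(Alarm("pressure limit exceeded", 1), cognitive_load=0.9, deferred=pending)
present(Alarm("shift log reminder", 3), cognitive_load=0.9, deferred=pending)
print(f"{len(pending)} event(s) deferred until load decreases")
```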


RIME – Rich Interactive Materials for Everyday Objects in the Home

Principal Investigators

Prof. Dr. Susanne Boll, University of Oldenburg
Prof. Dr. Jan Borchers, RWTH Aachen University
Prof. Dr. Jürgen Steimle, Saarland University

The key approach of RIME is to unlock the potential for rich interaction with the materials in our smart environments. We will design, prototype, and evaluate scalable sensor and actuator technology and touch interaction paradigms for seamless integration into everyday materials and objects, enabling natural and scalable hands-on interaction with our future smart homes. As a result, the physical artifacts in our homes, such as chairs, tables, walls, and other surfaces, can be equipped with an interactive digital “skin” or contain interactive sensor and actuator materials; swiping along a table, say, to unfold it for additional guests may then become a possible scenario.
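
To make the swipe scenario concrete, here is a minimal sketch assuming a hypothetical linear array of touch sensors embedded along a table edge: a monotonic sweep across enough sensors is interpreted as an extend or retract gesture.

```python
# Sketch of detecting a directional swipe on a linear array of touch sensors
# embedded in a surface, as in the table-unfolding scenario above. The
# sensor layout and the minimum span are illustrative assumptions.
def detect_swipe(activations: list[tuple[float, int]], min_span: int = 3):
    """activations: (timestamp, sensor_index) touch events along the strip.
    Returns 'extend'/'retract' if touches sweep monotonically far enough."""
    if len(activations) < 2:
        return None
    indices = [i for _, i in sorted(activations)]          # order by time
    steps = [b - a for a, b in zip(indices, indices[1:])]
    if all(s > 0 for s in steps) and indices[-1] - indices[0] >= min_span:
        return "extend"    # e.g., unfold the table for additional guests
    if all(s < 0 for s in steps) and indices[0] - indices[-1] >= min_span:
        return "retract"
    return None

# Finger sweeping across sensors 0..4 over half a second:
print(detect_swipe([(0.0, 0), (0.1, 1), (0.2, 2), (0.3, 3), (0.4, 4)]))  # extend
print(detect_swipe([(0.0, 2), (0.1, 2)]))                                # None
```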


Scalable Pervasive Health Environments

Principal Investigators

Prof. Dr. rer. nat. Rainer Malaka, Universität Bremen
Jun.-Prof. Dr.-Ing. Marc Herrlich, TU Kaiserslautern

The project will focus on pervasive environments for health that are scalable across multiple users and multiple devices and are designed for long-term use. Users will be able to employ exercise games (exergames) for the benefit of their health using a variety of devices and sensors available at home, including existing game-related tracking devices such as the Kinect, but also smartwatches, fitness trackers, or other sensors in the smart environment. We will investigate the potential of motivating interfaces that allow for long-term user engagement within these environments while adapting to user-specific needs and preferences. We focus on playful interactive interfaces within pervasive environments and examine methods to build them in a scalable way that incorporates existing non-pervasive exergames and off-the-shelf games that can be turned into pervasive games for health. The second focus area of this project is the use of augmented and virtual reality environments to facilitate scalable development and evaluation of pervasive health environments in complex multi-device scenarios. This includes the application of machine learning (ML) techniques for data-driven long-term evaluation and prediction. This will enable users to play games for health adapted to their specific preferences and needs, utilizing the devices and wearables in their home in a pervasive and scalable way.
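
As one simplified illustration of such adaptation (the target zone, step size, and use of heart rate are assumptions for the sketch, not the project’s method): game difficulty is steered so that a player’s heart rate, as reported by a fitness tracker, stays within a rough target exertion band.

```python
# Sketch of heart-rate-driven exergame adaptation: difficulty is nudged up
# or down to keep the player in a target exertion band. Band fractions and
# step size are illustrative assumptions.
def target_zone(age: int, low=0.6, high=0.75) -> tuple[float, float]:
    """Classic rough estimate: fractions of maximum heart rate (220 - age)."""
    hr_max = 220 - age
    return low * hr_max, high * hr_max

def adapt_difficulty(difficulty: float, heart_rate: float, age: int,
                     step: float = 0.1) -> float:
    lo, hi = target_zone(age)
    if heart_rate < lo:
        difficulty += step    # under-challenged: speed the game up
    elif heart_rate > hi:
        difficulty -= step    # over-exerted: ease off
    return max(0.1, min(1.0, difficulty))

d = 0.5
for hr in [95, 110, 128, 142, 151]:   # simulated tracker readings, age 40
    d = adapt_difficulty(d, hr, age=40)
    print(f"heart rate {hr} -> difficulty {d:.1f}")
```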


User Interaction Concepts based on Prehensile Hand Behavior

Principal Investigators

Prof. Dr.-Ing. Lars Linsen, Westfälische Wilhelms-Universität Münster, Germany
Dr. Dimitar Valkov, Westfälische Wilhelms-Universität Münster, Germany

While interactive 3D environments have gained increasing popularity in recent years, 3D user interfaces still remain rather complex and in many cases require special skills and training. Investigations of reach-to-grasp actions in various domains of psychology have consistently shown that the natural kinematics of prehension allow for predicting the object a human is going to grasp, and sometimes even the subsequent actions that will be carried out with that object. These insights promise great opportunities for substantially improved interaction in 3D environments, provided that hand kinematics information is extracted and evaluated on the fly and instantaneously incorporated into the interface.
In this project, we will establish a general framework for the analysis of prehensile behavior in the context of human-computer interaction, and we will explore the applicability of prehensile information for designing natural user interfaces for interaction with computer-generated virtual environments.
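
To illustrate the underlying idea (not the project’s actual framework), the sketch below scores candidate objects mid-reach by how well the current wrist heading points at them and how well grip aperture matches object width; all weights, positions, and sizes are hypothetical.

```python
# Sketch of predicting the grasp target mid-reach: candidate objects are
# scored by (a) the cosine between the current wrist heading and the
# direction to the object and (b) how well grip aperture matches object
# width. Weights and data are illustrative assumptions.
from math import dist

def predict_target(wrist_path, aperture_mm, objects, w_dir=1.0, w_size=0.5):
    """wrist_path: recent wrist positions [(x, y, z), ...];
    objects: {name: ((x, y, z), width_mm)}. Returns the best-scoring name."""
    (x0, y0, z0), (x1, y1, z1) = wrist_path[-2], wrist_path[-1]
    heading = (x1 - x0, y1 - y0, z1 - z0)

    def score(entry):
        pos, width = entry
        to_obj = tuple(p - c for p, c in zip(pos, (x1, y1, z1)))
        dot = sum(h * t for h, t in zip(heading, to_obj))
        denom = (dist(heading, (0, 0, 0)) * dist(to_obj, (0, 0, 0))) or 1e-9
        size_fit = -abs(aperture_mm - width) / 100.0  # aperture ~ object width
        return w_dir * (dot / denom) + w_size * size_fit

    return max(objects, key=lambda name: score(objects[name]))

objects = {"mug": ((0.30, 0.10, 0.00), 80), "pen": ((0.10, 0.40, 0.00), 10)}
path = [(0.0, 0.0, 0.0), (0.06, 0.02, 0.0)]     # moving roughly toward the mug
print(predict_target(path, aperture_mm=75, objects=objects))  # -> 'mug'
```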