Current Projects



Generating dialog-based analogical explanations about everyday tasks

This project is part of the SAIL network (Sustainable Life-Cycle of Intelligent Socio-Technical Systems), funded by the Ministry of Culture and Science of the State of NRW. Within SAIL we contribute to the research theme R1: “Human agency to shape cooperative intelligence”. Various tasks and domains exist where […]


Scalable hybrid Avatar-Agent-Technologies for everyday social interaction in XR (HiAvA)

HiAvA investigates and develops technologies for enabling multi-user applications in Social VR, mitigating the challenges of social distancing. The goal is to improve upon current solutions by maintaining immersion and social presence even on hardware devices that only allow for limited tracking or rendering. The resulting system should exceed the capabilities of current video communication […]


Adaptive generative models for interaction-aware conversational behavior


2021-2024

A key challenge for interactive artificial agents is to produce multimodal communicative behavior that is effective and robust in a given, dynamically evolving interaction context. This project investigates the automatic generation of speech and gesture. We develop cognitive, generative models that incorporate information about the real-time interaction context to allow for adaptive multimodal behavior that can steer […]


Creating explanations in collaborative human-machine knowledge exploration

The Transregional Collaborative Research Center (TRR 318) “Constructing Explainability” investigates how explanations of algorithmic decisions can be jointly constructed by the explainer and the explainee. Project C05 investigates how human decision makers and intelligent systems can collaboratively explore a decision problem in order to reach a decision that is accountable and hence explainable. The goal is […]


Adaptive Explanation Generation

The Transregional Collaborative Research Center (TRR 318) “Constructing Explainability” investigates how explanations of algorithmic decisions can be made more efficient by constructing them jointly between the explainer and the explainee. Project A01 “Adaptive explanation generation” investigates the cognitive and interactive mechanisms of adaptive explanations. The goal of our work is to develop a dynamic, […]


Computational cognitive modeling of the predictive active self in situated action (COMPAS)

The COMPAS project aims to develop a computational cognitive model of the execution and control of situated action in an embodied cognitive architecture that allows for (1) a detailed explanation, in computational terms, of the mechanisms and processes underlying the sense of agency; (2) simulation of situated actions along with the subjectively perceived sense of […]


Realtime mentalizing in cooperative systems


2016-2021

This project explores how AI-based agents can be equipped with an ability to cooperate grounded in a Theory of Mind, i.e., the attribution of hidden mental states to other agents inferred from their observable behavior. In contrast to the usual approach of studying this capability in offline, observer-based settings, we aim to fuse mentalizing with strategic […]
