DEEP: Deep Emotion Processing for Social Agents


Start: 01.10.2018
Duration: 3 years
Funded by: DFG (Deutsche Forschungsgemeinschaft)
Scientific responsibility: Prof. Dr. Elisabeth André
Involved Researchers: Tobias Huber, Dr. Tobias Baur, Alexander Heimerl, Patrick Gebhard (DFKI)

About the Project



The DEEP project (funded by the DFG since 2018) tackles the challenge of connecting the outside world with internal symbolic situational representations related to individual human emotions. To this end, the project creates and evaluates a unique combination of an ML-based real-time interpretation of human social signals and a real-time computational model of emotions in a dyadic communication setup between a human and a Social Agent. The combination relies on a sophisticated representation of communicative emotions, internal (situational and structural) emotions, possible emotion elicitors and emotion targets, suitable emotion regulation strategies, and related sequences of social signals and their directions.

At runtime, based on the interpretation of the social signals of a human dialog partner, a dynamic theory-of-mind representation of user emotions is created. It holds all possible internal user emotions together with their related mental states and cognitive strategies. As a result, this approach allows, for the first time, a real-time computational disambiguation of emotion elicitors and emotion targets, as well as the recognition of possible emotion regulation strategies, based on the interpretation of social signals.

Overall, the DEEP project realizes a real-time computational model that describes, on a symbolic level, how social cues can be linked to emotional appraisal context and internal emotional states. The model takes the user's personality, role, status, relation(s), and other individual values into account. The DEEP model will be evaluated in dyadic dialogs between a human and a Social Agent.

In the future, the DEEP model can be exploited for the creation and investigation of next-generation Social Agent applications by extending the user model of such systems with a real-time model of users' internal feelings and emotion regulation strategies. These extensions allow a more empathic adaptation to a human user's current situation.
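To make the idea of a dynamic theory-of-mind representation more concrete, the following is a minimal, hypothetical sketch in Python. All class names, fields, and the toy update rule are illustrative assumptions, not the actual DEEP model or its API: it merely shows how a set of competing hypotheses about a user's internal emotion (each with an elicitor, target, and regulation strategy) could be maintained and re-weighted as social signals are observed.

```python
from dataclasses import dataclass, field

# Hypothetical sketch only; names and the update rule are assumptions,
# not the DEEP project's actual representation.

@dataclass
class EmotionHypothesis:
    emotion: str        # e.g. "anger", "shame"
    elicitor: str       # what is assumed to have triggered the emotion
    target: str         # who or what the emotion is directed at
    regulation: str     # e.g. "suppression", "reappraisal"
    probability: float  # belief assigned from observed social signals

@dataclass
class TheoryOfMind:
    """Dynamic set of hypotheses about the user's internal emotional state."""
    hypotheses: list = field(default_factory=list)

    def update(self, social_signal: str) -> None:
        # Toy rule: signals consistent with a hypothesis raise its belief,
        # all others are dampened; beliefs are then renormalized.
        cues = {"frown": "anger", "gaze_aversion": "shame", "smile": "joy"}
        observed = cues.get(social_signal)
        for h in self.hypotheses:
            h.probability *= 1.5 if h.emotion == observed else 0.8
        total = sum(h.probability for h in self.hypotheses)
        for h in self.hypotheses:
            h.probability /= total

    def most_likely(self) -> EmotionHypothesis:
        return max(self.hypotheses, key=lambda h: h.probability)

tom = TheoryOfMind([
    EmotionHypothesis("anger", "agent_remark", "agent", "suppression", 0.5),
    EmotionHypothesis("shame", "own_mistake", "self", "reappraisal", 0.5),
])
tom.update("gaze_aversion")
print(tom.most_likely().emotion)  # -> shame
```

The point of the sketch is that a single observed cue does not map to one emotion directly; it merely shifts belief across structured hypotheses that also carry the elicitor, target, and regulation strategy needed for disambiguation.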


More Information: