Projects

MAIKI

MAIKI: Mobiler Alltagstherapieassistent mit interaktionsfokussierter künstlicher Intelligenz bei Depression (mobile everyday-life therapy assistant with interaction-focused artificial intelligence for depression)

 

BMBF Project


Runtime: 01.10.2021 – 31.12.2021


Partners: FlyingHealth Incubator GmbH, GET.ON Institut für Online Gesundheitstrainings GmbH, University of Augsburg

 

This project targets a mobile digital assistant that supports patients interactively, intelligently, and individually in implementing their therapy more effectively in everyday life. To this end, artificial intelligence methods (interaction analysis with voice analysis and natural language processing, artificial empathy, machine learning) are researched and developed with the goal of optimising the patient-assistant interaction, tailoring therapeutic interventions to the individual case, and supporting their implementation interactively in an intelligent yet personalised way. This digital mobile therapy companion goes beyond the current state of the art in that it (a) maintains continuous treatment-relevant communication with the patient (which face-to-face psychotherapy has so far been unable to do) and (b) continuously adapts its recommendations to the patient's current experience, behaviour, and therapy progress to date. Building on this digital therapy individualisation, the recovery process is to be accelerated and relapse rates reduced.

Leader Humor

A Multimodal Approach to Humor Recognition and an Analysis of the Influence of Leader Humor on Team Performance in Major European Soccer Leagues


DFG (German Research Foundation) Project


Runtime: 36 Months


Partners: University of Passau, University of Augsburg

 

In this project, scholars active in the fields of management and computerized psychometry take the unique opportunity to join their respective perspectives and complementary capabilities to address the overarching question of “How, why, and under which circumstances does leader humor affect team processes and team performance, and how can (leader) humor be measured on a large scale by applying automatic multimodal recognition approaches?”. Trait humor, which is one of the most fundamental and complex phenomena in social psychology, has garnered increasing attention in management research. However, scholarly understanding of humor in organizations is still substantially limited, largely because research in this domain has primarily been qualitative, survey-based, and small-scale. Notably, recent advances in computerized psychometry promise to provide unique tools to deliver unobtrusive, multi-faceted, ad hoc measures of humor that are free from the substantial limitations associated with traditional humor measures. Computerized psychometry scholars have long noted that a computerized understanding of humor is essential for the humanization of artificial intelligence. Yet, they have struggled to automatically identify, categorize, and reproduce humor. In particular, computerized approaches have suffered not only from a lack of theoretical foundations but also from a lack of complex, annotated, real-life data sets and multimodal measures that consider the multi-faceted, contextual nature of humor. We combine our areas of expertise to address these research gaps and complementary needs in our fields. Specifically, we substantially advance computerized measures of humor and provide a unique view into the contextualized implications of leader humor, drawing on the empirical context of professional soccer.
Despite initial attempts to join computerized psychometry and management research, these two fields have not yet been successfully combined to address our overall research question. We aspire to fill this void as equal partners, united by our keen interest in humor, computerized psychometry, leader rhetoric, social evaluations, and team performance. 

 

 

AUDI0NOMOUS

Agent-based Unsupervised Deep Interactive 0-shot-learning Networks Optimising Machines’ Ontological Understanding of Sound
DFG (German Research Foundation) Reinhart Koselleck Project
Project no. 442218748
 

Soundscapes are a component of our everyday acoustic environment; we are always surrounded by sounds: we react to them, and we create them. While computer audition, the understanding of audio by machines, has primarily been driven by the analysis of speech, the understanding of soundscapes has received comparatively little attention.

 

AUDI0NOMOUS, a long-term project based on artificially intelligent systems, aims to achieve major breakthroughs in the analysis, categorisation, and understanding of real-life soundscapes. A novel approach, based around the development of four highly cooperative and interactive intelligent agents, is proposed to achieve this highly ambitious goal. Each agent will autonomously infer a deep and holistic comprehension of sound. A Curious Agent will collect unique data from web sources and social media; an Audio Decomposition Agent will decompose overlapping sounds; a Learning Agent will recognise an unlimited number of unlabelled sounds; and an Ontology Agent will translate the soundscapes into verbal ontologies.
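The four-agent architecture can be caricatured as a simple pipeline. Everything inside the classes below is an invented placeholder for illustration only; the actual agents are learned, autonomous systems, not hard-coded rules, and only the agent names come from the project description:

```python
class CuriousAgent:
    """Stands in for crawling web and social-media sources for soundscapes."""
    def collect(self):
        # Each item: (scene name, overlapped sound events) -- hard-coded here.
        return [("street", ["car horn", "footsteps"]),
                ("park", ["birdsong", "dog bark"])]

class AudioDecompositionAgent:
    """Splits each soundscape mixture into individual sound events."""
    def decompose(self, mixtures):
        return [(scene, event) for scene, events in mixtures for event in events]

class LearningAgent:
    """Would recognise unlabelled sounds; here it simply passes labels through."""
    def recognise(self, events):
        return list(events)

class OntologyAgent:
    """Groups recognised sounds into a scene -> sounds verbal ontology."""
    def verbalise(self, labelled):
        ontology = {}
        for scene, label in labelled:
            ontology.setdefault(scene, []).append(label)
        return ontology

mixtures = CuriousAgent().collect()
events = AudioDecompositionAgent().decompose(mixtures)
labelled = LearningAgent().recognise(events)
ontology = OntologyAgent().verbalise(labelled)
# ontology now maps each scene to the sounds recognised within it.
```

The point of the sketch is the hand-off structure: each agent consumes the previous agent's output, so the four can be developed and improved independently.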

 

AUDI0NOMOUS will open up an entirely new dimension of comprehensive audio understanding; such knowledge will have a high and broad impact in disciplines of both the sciences and humanities, promoting advancements in health care, robotics, and smart devices and cities, amongst many others.

 

Start date: 01.01.2021

 

Duration: 5 years

 

ForDigitHealth

Bavarian Research Association on Healthy Use of Digital Technologies and Media
BayFOR (Bayerisches Staatsministerium für Wissenschaft und Kunst) Project

 

Partners: University of Augsburg, Otto-Friedrich University Bamberg, FAU Erlangen-Nuremberg, LMU Munich, JMU Würzburg

 

Runtime: 2019–2023 (48 months)

 

Digitalisation is leading to fundamental changes in our society and in our individual lives. This brings both opportunities and risks for our health. In part, our use of digital technologies and media leads to negative stress (distress), burnout, depression, and other health impairments. On the other hand, stress can also have a positive, stimulating effect (eustress) that is worth fostering. Technology design has advanced to the point where digital technologies and media, thanks to increasing artificial intelligence, adaptivity, and interactivity, can preserve and promote the health of their human users. The goal of the ForDigitHealth research association is to gain a thorough scientific understanding of the health effects of the growing presence and intensified use of digital technologies and media in all their diversity, especially with regard to the emergence of digital distress and eustress and their consequences, and to develop and evaluate prevention and intervention options. In this way, the research association aims to contribute to an appropriate, conscious, and health-promoting individual and collective use of digital technologies and media.

 

 

Huawei & University of Augsburg Joint Lab

The Huawei-University of Augsburg Joint Lab aims to bring together affective computing and human-centred intelligence for human-centred empathic interaction.

 

The Chair of Embedded Intelligence for Health Care and Wellbeing is one of two chairs involved in the collaboration.

 

Start date: 01.01.2020

 

Duration: 3 years

 

KIRun

Start date: 01.07.2020

 

Duration: 14 months

 

Funding body: ZIM, Federal Ministry for Economic Affairs and Energy (BMWi)

 

The objective of the "KIRun" project is to develop intelligent algorithms that record, process, and evaluate measurement data collected while running: training data (GPS), biomechanics (inertial sensors), physiology (breathing/heart rate), ambient sounds (running surface), tonality and quality of speech and breathing, and targeted speech capture.

 

To this end, fundamentally novel functionalities must be developed or transferred into running sports from other fields of application. Within the project, existing technologies from the IoT, sensor, and audio domains are to be transferred into sports science, and the expertise of the participating partners is to be combined through an integrated AI platform.
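One recurring problem when combining such heterogeneous measurement streams is that each sensor reports at its own rate. The sketch below aligns asynchronous streams onto a common time grid by linear interpolation so that downstream models see one feature row per time step; the function, example rates, and values are illustrative assumptions, not the project's actual pipeline:

```python
import numpy as np

def fuse_streams(streams, hop_s=1.0):
    """Resample asynchronous sensor streams onto a shared time grid.

    `streams` maps a stream name to a pair (timestamps_s, values).
    The grid covers only the interval where all streams have data,
    and each stream is linearly interpolated onto it.
    """
    t_start = max(ts[0] for ts, _ in streams.values())
    t_end = min(ts[-1] for ts, _ in streams.values())
    grid = np.arange(t_start, t_end + 1e-9, hop_s)
    fused = {name: np.interp(grid, ts, vals)
             for name, (ts, vals) in streams.items()}
    return grid, fused

# Hypothetical example: heart rate sampled every 5 s, speed every 2 s.
streams = {
    "heart_rate": (np.array([0.0, 5.0, 10.0]),
                   np.array([100.0, 110.0, 120.0])),
    "speed": (np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0]),
              np.array([3.0, 3.2, 3.4, 3.6, 3.8, 4.0])),
}
grid, fused = fuse_streams(streams, hop_s=1.0)
```

After fusion, `fused["heart_rate"]` and `fused["speed"]` have one value per grid point and can be stacked into a single feature matrix for learning.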

Paralinguistische Stimmcharakteristika bei Depression (paralinguistic speech characteristics in depression)

Start date: 01.01.2020

 

Duration: 36 Months

 

Funding body: Deutsche Forschungsgemeinschaft (DFG)

 

Homepage:  http://www.psych1.phil.uni-erlangen.de/forschung/weitere-projekte/paralinguistische-stimmcharakteristika-bei-depression.shtml

 

The explanation, diagnosis, prediction, and treatment of major depression remain central challenges of psychotherapy research.

 

As a new and innovative approach to the diagnosis and therapy of depression, paralinguistics investigates intonation features such as speech pauses, speech rhythm, intonation, pitch, and loudness. In this interdisciplinary project, clinical psychology and computer science work together to recognise and predict depression as accurately as possible from paralinguistic speech characteristics (PSCs) using optimised algorithms, and to clarify to what extent a particular intonation style contributes to maintaining the depression.

 

Beyond that, we aim in the longer term to use PSCs therapeutically as well. This means that therapists work with their patients not only on what they say to themselves, as is customary in coping with depression, but also on how they say it. A "You can do it!" spoken in a quiet, monotone, and feeble voice will achieve nothing, because it does not sound emotionally convincing. If, on the other hand, the sentence is spoken in a powerful, clear, and dynamic voice, the chances are considerably higher that it will also trigger a feeling of hope and optimism.

 

The DFG-funded research project aims to lay the scientific groundwork for this. To that end, speech samples will be analysed with the help of machine learning in order to detect intonation differences between clinically depressed and non-depressed persons. We will develop algorithms that can identify depression-relevant intonation patterns. The insights gained will in turn help to develop an intonation-focused feedback training designed to help people with depression cope with depressive episodes.
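To make the feature side concrete, the minimal sketch below computes three of the descriptor types mentioned above from a raw waveform: RMS loudness, a pause ratio (fraction of low-energy frames), and a rough pitch estimate via frame autocorrelation. The thresholds, frame sizes, and feature set are illustrative assumptions, not the project's actual algorithms:

```python
import numpy as np

def paralinguistic_features(signal, sr=16000, frame_ms=25, hop_ms=10):
    """Compute simple paralinguistic descriptors from a mono waveform."""
    frame = int(sr * frame_ms / 1000)
    hop = int(sr * hop_ms / 1000)
    frames = [signal[i:i + frame] for i in range(0, len(signal) - frame, hop)]
    # Loudness: root-mean-square energy per frame.
    rms = np.array([np.sqrt(np.mean(f ** 2)) for f in frames])
    # Pause ratio: frames below 10% of the peak RMS count as silence.
    pause_ratio = float(np.mean(rms < 0.1 * rms.max()))
    # Rough pitch: autocorrelation of the loudest frame,
    # searched over a plausible F0 range of 50-400 Hz.
    f = frames[int(np.argmax(rms))]
    ac = np.correlate(f, f, mode="full")[len(f) - 1:]
    lo, hi = sr // 400, sr // 50
    pitch_hz = sr / (lo + int(np.argmax(ac[lo:hi])))
    return {"rms_mean": float(rms.mean()),
            "pause_ratio": pause_ratio,
            "pitch_hz": float(pitch_hz)}
```

Sequences of such per-utterance descriptors, rather than the raw audio, would then serve as input to a classifier that separates depressed from non-depressed speech.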

EMBOA


 

Affective loop in Socially Assistive Robotics as an Intervention Tool for Children with Autism

 

Start date: 01.09.2019

 

Duration: 36 Months

 

Funding body: EU, Erasmus Plus Strategic Partnership

 

Homepage:  emboa.eu

 

Partners: Politechnika Gdanska, University of Hertfordshire, Istanbul Teknik Universitesi, Yeditepe University Vakif, Macedonian Association for Applied Psychology, and University of Augsburg


Description: The EMBOA project aims to develop guidelines for, and practically evaluate, the application of emotion recognition technologies in robot-supported interventions for children with autism.

 

Children with autism spectrum disorder (ASD) face multiple challenges, and limited social and emotional skills are among those that affect their ability to engage in interaction and communication. Limited communication occurs in human-human interactions and affects relationships with family members, peers, and therapists.

 

There are promising results in the use of robots in supporting the social and emotional development of children with autism. It is not yet known why children with autism are often eager to interact with human-looking robots but not with humans. Regardless of the reason, social robots have proved to be a way to get through a child's social obstacles and involve him or her in the interaction. Once the interaction happens, there is a unique opportunity to engage the child in gradually building and practising social and emotional skills.

 

In the project, we combine social robots that are already used in therapy for children with autism with algorithms for automatic emotion recognition. The EMBOA project's goal is to confirm the feasibility of this application, and in particular to identify best practices and obstacles in combining these technologies. What we hope to obtain is a novel approach to creating an affective loop in child-robot interaction that would enhance interventions for building emotional intelligence in children with autism.

 

The lessons learned, summarized in the form of guidelines, may be used in higher education in robotics, computer science, and special pedagogy programmes in all participating countries. The results will be disseminated through training and multiple events, and to the general public via scientific papers and published reports. The project consortium is multidisciplinary and combines partners with competence in autism interventions, robotics, and automatic emotion recognition from Poland, the UK, Germany, North Macedonia, and Turkey.

 

The methodological approach includes systematic literature reviews and meta-analysis, data analysis based on statistical and machine learning approaches, as well as observational studies. We have planned a double loop of observational studies. The first round will analyse the application of emotion recognition methods in robot-based interaction in autism and, in particular, compare diverse channels for observing symptoms of emotion.

 

The lessons learned will be formulated as guidelines. The guidelines will be evaluated with the AGREE (Appraisal of Guidelines, Research, and Evaluation) instrument and confirmed in the second round of observational studies. The objectives of our project match the Social Inclusion horizontal priority with regard to supporting actions for improving the learning performance of disadvantaged learners (testing a novel approach for improving the learning performance of children with autism).

 

sustAGE

 


 

Smart Environments for Person-centered Sustainable Work and Well-being

 

Start date: 01.01.2019

 

End date: 30.06.2022

 

Funding body: EU Horizon 2020 Research & Innovation Action (RIA)

 

Homepage: www.sustage.eu  

 

sustAGE will provide a paradigm shift in human-machine interaction, building upon seven strategic technology trends (IoT, machine learning, micro-moments, temporal reasoning, recommender systems, data analytics, and gamification) to deliver a composite system integrated with daily activities at and outside work, supporting employers and ageing employees in jointly increasing well-being, wellness at work, and productivity. The manifold contribution focuses on supporting the employment and later retirement of older adults and on optimising workforce management.

 

The sustAGE platform guides workers on work-related tasks, recommends personalized cognitive and physical training activities with emphasis on game and social aspects, delivers warnings regarding occupational risks, and ensures their proper positioning in work tasks so as to maximize team performance.

 

By combining a broad range of innovation-chain activities, namely technology R&D, demonstration, prototyping, pilots, and extensive validation, the project aims to explore how health and safety at work, continuous training, and proper workforce management can prolong older workers' competitiveness at work. The deployment of the proposed technologies in two critical industrial sectors and their extensive evaluation will lead to a ground-breaking contribution that will improve the performance and quality of life at work and beyond for many ageing adult workers.

RADAR-CNS

 

 


 

 

Remote Assessment of Disease and Relapse – Central Nervous System

Start date: 01.04.2016

 

End date: 31.03.2022

 

Funding body: EU (European Union)

 

Homepage: www.radar-cns.org 

 

RADAR-CNS is a major research programme that is developing new ways of monitoring major depressive disorder, epilepsy, and multiple sclerosis using wearable devices and smartphone technology. RADAR-CNS aims to improve patients’ quality of life, and potentially to change how these and other chronic disorders are treated.

Past Projects

Here you can find an overview of past projects.
