EIHW researcher wins best paper award

Congratulations to our PhD researcher Lukas Stappen, who, along with his co-authors, won the best paper award at MMSP 2019, the 21st IEEE International Workshop on Multimedia Signal Processing.

Lukas presented his work in Kuala Lumpur, Malaysia on 27 September. Titled ‘From Speech to Facial Activity: Towards Cross-modal Sequence-to-Sequence Attention Networks’, the paper investigated the use of sequence-to-sequence neural network architectures to predict facial action coding system units (specific facial movements in humans) purely from speech.

Potential real-world applications of such approaches include the automatic recognition of individuals’ emotions from video footage. Here, facial movements provide valuable information that deep learning algorithms can use to infer the thoughts and feelings of an individual. However, when faces are shaded, obscured or looking away from the camera, the task is much more difficult. 

The cross-modal algorithm described in the winning paper learns the relationship between facial movements and speech from video footage, then uses that relationship to predict missing facial data, working towards more accurate video sentiment analysis. The researchers’ neural networks predicted facial action units with an unweighted average recall (an indicator of classification performance) of 65.3%, averaged over eight different action units. The prize was sponsored by the Global Artificial Intelligence Network (GAIN).
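Unweighted average recall, the metric reported above, is simply the mean of the per-class recall values, so each class counts equally even when the data are imbalanced (as facial action unit activations typically are). A minimal sketch of the computation, not the paper's actual evaluation code:

```python
# Unweighted average recall (UAR): mean of per-class recall,
# giving every class equal weight regardless of its sample count.
from collections import defaultdict

def unweighted_average_recall(y_true, y_pred):
    """Mean of per-class recall over the classes present in y_true."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for t, p in zip(y_true, y_pred):
        total[t] += 1
        if t == p:
            correct[t] += 1
    recalls = [correct[c] / total[c] for c in total]
    return sum(recalls) / len(recalls)

# Illustrative imbalanced binary labels (e.g. action unit active vs. inactive):
y_true = [1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 0, 0, 0, 0, 0, 0, 1]
# Class 1 recall = 1/2, class 0 recall = 5/6, so UAR = (1/2 + 5/6) / 2
print(round(unweighted_average_recall(y_true, y_pred), 3))
```

Because the minority class contributes as much as the majority class, a model that simply predicts the most common label scores poorly on UAR, which is why it is favoured over plain accuracy in affective computing benchmarks.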
© University of Augsburg