Affective computing may soon be watching your Valence and Arousal


Affective computing is the interdisciplinary study of human emotional analysis, synthesis, recognition, and prediction (Continuous Affect Prediction Using Eye Gaze and Speech, Jonny O'Dwyer, Ronan Flynn, Niall Murray, Athlone Institute of Technology, 2018).

The research carried out by the team at the Athlone Institute of Technology used open source software connected to cameras to identify the emotions of participants during testing. Previous research has generally relied on headsets and eye-tracking devices, so using cameras is significantly less intrusive. The team focused on a combination of eye gaze and speech to predict the arousal and valence of the participant. As the image below shows, these two dimensions are cross-related.

Arousal-Valence diagram from Abhang and Gawali

Arousal can be classed as the amount of energy in the emotion, and valence as how positive or negative the experience is. The team used OpenFace to capture eye gaze data from the camera footage and then extracted 31 features from the raw data.
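
As an illustration, here is a minimal sketch of how simple gaze features might be derived from OpenFace's per-frame CSV output. The column names (success, gaze_angle_x, gaze_angle_y) follow OpenFace's standard output; the statistics computed here are only illustrative stand-ins, not the 31 features used in the paper.

```python
import pandas as pd
import numpy as np

def gaze_features(csv_path):
    # Load OpenFace's per-frame CSV output; skipinitialspace handles the
    # spaces OpenFace places after the commas in its header row.
    df = pd.read_csv(csv_path, skipinitialspace=True)
    df = df[df["success"] == 1]  # keep only frames where face tracking succeeded

    feats = {}
    for col in ("gaze_angle_x", "gaze_angle_y"):  # horizontal / vertical gaze angle
        signal = df[col].to_numpy()
        feats[f"{col}_mean"] = signal.mean()
        feats[f"{col}_std"] = signal.std()
        feats[f"{col}_range"] = signal.max() - signal.min()
        # mean absolute frame-to-frame change, a rough proxy for gaze activity
        feats[f"{col}_delta"] = np.abs(np.diff(signal)).mean()
    return feats

# Example (hypothetical file name):
# features = gaze_features("participant01.csv")
```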

This was combined with 2268 different features extracted from the captured speech audio. The speech and eye gaze results were combined using a number of different fusion methods, from which the team obtained prediction scores of 0.351 for feature fusion and 0.275 for model fusion, representing improvements of 3.5% for arousal and 19.5% for valence compared to the unimodal (single-source) performances.
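
To make the distinction between the two fusion approaches concrete, the sketch below contrasts feature fusion (concatenating the gaze and speech features before training a single model) with model fusion (training one model per modality and combining their predictions). It uses scikit-learn ridge regression on random stand-in data; the actual models, features, and evaluation from the paper are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n = 500
X_gaze = rng.normal(size=(n, 31))      # 31 eye-gaze features per window
X_speech = rng.normal(size=(n, 2268))  # 2268 speech features per window
y = rng.normal(size=n)                 # continuous affect label, e.g. arousal

# Feature fusion: concatenate the two feature sets and train one model on them.
X_fused = np.hstack([X_gaze, X_speech])
feature_fusion_model = Ridge().fit(X_fused, y)
feature_fusion_pred = feature_fusion_model.predict(X_fused)

# Model fusion: train one model per modality and combine their predictions,
# here with a simple average.
gaze_model = Ridge().fit(X_gaze, y)
speech_model = Ridge().fit(X_speech, y)
model_fusion_pred = 0.5 * (gaze_model.predict(X_gaze) + speech_model.predict(X_speech))
```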

What does this mean for the real world? One potential application is in health and diagnostics, where a subject's true emotions need to be understood during an interaction without intrusive devices affecting the results.

#telanovaReporter
