AI Decodes Emotion Through Movements



Summary: A new methodology developed by an international research team uses motion capture and the EMOKINE software to decode emotions from movements. The team recorded a dancer performing choreographies expressing various emotions and analyzed the kinematic features of her movements.

The EMOKINE software, freely available on ZENODO and GitHub, provides an innovative tool for studying emotional expression through whole-body movements. This interdisciplinary approach can benefit experimental psychology, affective neuroscience, and AI-assisted analysis of visual media.

Key Facts:

  1. EMOKINE software analyzes kinematic features of emotional movements.
  2. Motion capture technology recorded a dancer’s movements expressing six emotions.
  3. EMOKINE is open-source and adaptable to various motion capture systems.

Source: Max Planck Institute

Is it possible to decode how we feel from our movements? How can emotions be studied “from the outside” by using empirical methods?

To answer these questions, a large international and interdisciplinary research team led by the Max Planck Institute for Empirical Aesthetics (MPIEA) in Frankfurt am Main, Germany, has developed an integrative scientific methodology.

Using artistic and digital means such as motion capture technology, the researchers developed the EMOKINE software to measure the objective kinematic features of movements that express emotions.

The results of the study have recently been published in the journal Behavior Research Methods.

Movement tracking has been used in many areas in recent years because the objective recording of movement parameters can provide insights into people’s intentions, feelings and state of mind. Credit: Neuroscience News

The team had a professional dancer repeat short dance choreographies in front of a green screen. She was asked to express different emotions through her movements: anger, contentment, fear, happiness, neutrality, and sadness.

To capture the dance movements as “data,” the scientists dived into the MPIEA’s technology pool: the dancer wore a full-body motion capture suit from XSENS®, equipped with a total of 17 highly sensitive sensors.

In combination with a film camera, the suit measured and recorded the dynamic body movements. The researchers then extracted the objective kinematic characteristics (movement parameters) and programmed the EMOKINE software, which provides these movement parameters from data sets at the touch of a button.

Computerized Tracking for Whole-Body Movement

A total of 32 statistics from 12 movement parameters were compiled and extracted from a pilot dance dataset. The kinematic parameters recorded included, for example, speed, acceleration, and contraction of the limbs.
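For illustration, such basic kinematic parameters can be derived from per-frame 3D joint positions: speed and acceleration follow from finite differences across frames, and limb contraction can be approximated as the distance of a wrist from the body's center of mass. The sketch below is a minimal Python illustration under these assumptions, not the actual EMOKINE implementation; the array layout and function names are hypothetical.

```python
import numpy as np

FPS = 240  # sampling rate of the motion capture recording (frames per second)

def speed(positions, fps=FPS):
    """Per-frame speed magnitude of one joint from its 3D positions, shape (T, 3)."""
    velocity = np.diff(positions, axis=0) * fps      # finite difference over frames
    return np.linalg.norm(velocity, axis=1)

def acceleration(positions, fps=FPS):
    """Per-frame acceleration magnitude of one joint."""
    velocity = np.diff(positions, axis=0) * fps
    accel = np.diff(velocity, axis=0) * fps
    return np.linalg.norm(accel, axis=1)

def limb_contraction(wrist, center_of_mass):
    """Per-frame distance of a wrist to the body's center of mass."""
    return np.linalg.norm(wrist - center_of_mass, axis=1)

# Stand-in data for a two-second recording at 240 fps (not real mocap output):
T = 480
right_wrist = np.cumsum(np.random.randn(T, 3) * 0.001, axis=0)
com = np.zeros((T, 3))
print(speed(right_wrist).shape, acceleration(right_wrist).shape,
      limb_contraction(right_wrist, com).shape)
```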

“We identified 12 kinematic features of emotional whole-body movements that had previously been discussed separately in the literature. We then extracted all of them from a single data set and fed the features into the EMOKINE software,” reports first author Julia F. Christensen of the MPIEA.

Movement tracking has been used in many areas in recent years because the objective recording of movement parameters can provide insights into people’s intentions, feelings and state of mind. However, this research requires a theory-based methodology so meaningful conclusions can be drawn from the recorded data.

“This work shows how artistic practice, psychology, and computer science can work together in an ideal way to develop methods for studying human cognition,” says co-first author Andrés Fernández of the Max Planck Institute for Intelligent Systems in Tübingen, Germany.

The methodological framework that accompanies the software package, and which explicitly uses dance movements to study emotions, is a departure from previous research approaches, which have often used video clips of “emotional actions,” such as waving hands or walking.

“We are particularly excited about the publication of this work, which involved so many experts, for example from the Goethe University Frankfurt am Main, the University of Glasgow, and a film team from WiseWorld Ai, Portugal.

“It brought together disciplines from psychology, neuroscience, computer science, and empirical aesthetics, but also from dance and film,” summarizes senior author Gemma Roig, Professor of Computer Science, Computational Vision, and AI Lab at Goethe University.

The Open-Source Software Package

EMOKINE is freely available on ZENODO and GitHub and can be adapted to other motion capture systems with minor modifications. These freely available digital tools can be used to analyze the emotional expression of dancers and other groups of artists, and also everyday movements.

The researchers now hope that the EMOKINE software they have developed will be used in experimental psychology, affective neuroscience, and in computer vision—especially in AI-assisted analysis of visual media, a branch of AI that enables computers and systems to extract meaningful information from digital images, videos, and other visual inputs.

EMOKINE will help scientists answer research questions about how kinematic parameters of whole-body movements convey different intentions, feelings, and states of mind to the observer.

About this artificial intelligence research news

Author: Keyvan Sarkhosh
Source: Max Planck Institute
Contact: Keyvan Sarkhosh – Max Planck Institute
Image: The image is credited to Neuroscience News

Original Research: Closed access.
“EMOKINE: A software package and computational framework for scaling up the creation of highly controlled emotional full-body movement datasets” by Julia F. Christensen et al. Behavior Research Methods


Abstract

EMOKINE: A software package and computational framework for scaling up the creation of highly controlled emotional full-body movement datasets

EMOKINE is a software package and dataset creation suite for emotional full-body movement research in experimental psychology, affective neuroscience, and computer vision.

A computational framework, comprehensive instructions, a pilot dataset, observer ratings, and kinematic feature extraction code are provided to facilitate future dataset creations at scale.

In addition, the EMOKINE framework outlines how complex sequences of movements may advance emotion research. Traditionally, such research has often used emotional-‘action’-based stimuli, like hand-waving or walking motions.

Here instead, a pilot dataset is provided with short dance choreographies, repeated several times by a dancer who expressed different emotional intentions at each repetition: anger, contentment, fear, joy, neutrality, and sadness.

The dataset was simultaneously filmed professionally, and recorded using XSENS® motion capture technology (17 sensors, 240 frames/second).

Thirty-two statistics from 12 kinematic features were extracted offline, for the first time in a single dataset: speed, acceleration, angular speed, angular acceleration, limb contraction, distance to center of mass, quantity of motion, dimensionless jerk (integral), head angle (with respect to the vertical axis and to the back), and space (convex hull 2D and 3D). Average, median absolute deviation (MAD), and maximum value were computed as applicable.
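As a sketch only (not the actual EMOKINE code), the three summary statistics reported for each feature can be computed over the per-frame series as follows; the variable names are hypothetical.

```python
import numpy as np

def summary_statistics(feature_series):
    """Average, median absolute deviation (MAD), and maximum of a per-frame feature."""
    x = np.asarray(feature_series, dtype=float)
    mad = np.median(np.abs(x - np.median(x)))  # MAD around the median
    return {"mean": float(np.mean(x)), "mad": float(mad), "max": float(np.max(x))}

# Example: statistics for a stand-in per-frame speed trace
speed_trace = np.abs(np.random.randn(480))
print(summary_statistics(speed_trace))
```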

The EMOKINE software is applicable to other motion-capture systems and is openly available on the Zenodo Repository.

Releases on GitHub include: (i) the code to extract the 32 statistics, (ii) a rigging plugin for Python for MVNX file conversion to Blender format (MVNX is the output file format of the XSENS® system), and (iii) a Python-script-powered custom software to assist with blurring faces; the latter two are released under GPLv3 licenses.
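The face-blurring helper in the GitHub releases is a custom tool; purely to illustrate the underlying idea, a face region in a filmed frame can be located with a standard detector and Gaussian-blurred, for example with OpenCV. The detector choice and parameters below are assumptions for this sketch, not EMOKINE's implementation.

```python
import cv2

# Illustrative only: locate faces with OpenCV's bundled Haar cascade and blur them.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def blur_faces(frame):
    """Return a copy of the frame with detected face regions Gaussian-blurred."""
    out = frame.copy()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        out[y:y + h, x:x + w] = cv2.GaussianBlur(out[y:y + h, x:x + w], (51, 51), 0)
    return out

# Usage: anonymize a single exported frame of the filmed recording
# frame = cv2.imread("frame_0001.png")
# cv2.imwrite("frame_0001_blurred.png", blur_faces(frame))
```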
