Emotion recognition using a hierarchical binary decision tree approach

Chi-Chun Lee, Emily Mower, Carlos Busso, Sungbok Lee, Shrikanth Narayanan
Speech Communication, Volume 53, Issues 9–10, November–December 2011
Abstract

Automated emotion state tracking is a crucial element in the computational study of human communication behaviors. It is important to design robust and reliable emotion recognition systems that are suitable for real-world applications, both to enhance analytical abilities that support human decision making and to design human–machine interfaces that facilitate efficient communication. We introduce a hierarchical computational structure to recognize emotions. The proposed structure maps an input speech utterance into one of multiple emotion classes through successive layers of binary classifications. The key idea is that the levels in the tree are designed to solve the easiest classification tasks first, allowing us to mitigate error propagation. We evaluated the classification framework on two different emotional databases, the AIBO database and the USC IEMOCAP database, using acoustic features. On the AIBO database, the hierarchical structure yields a balanced recall across the individual emotion classes, and the average unweighted recall on the evaluation data set improves by 3.37% absolute (8.82% relative) over a Support Vector Machine baseline model. On the USC IEMOCAP database, we obtain an absolute improvement of 7.44% (14.58% relative) over a Support Vector Machine baseline. The results demonstrate that the presented hierarchical approach is effective for classifying emotional utterances in multiple database contexts.
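
To make the routing idea concrete, below is a minimal illustrative sketch, not the authors' implementation: each internal node of the tree trains a binary classifier that separates two groups of emotion classes, and an utterance is passed down the tree until a single class (leaf) remains. The emotion labels, the tree layout, the random features, and the use of scikit-learn SVMs are all assumptions made for this example.

# Minimal sketch of a hierarchical binary classification tree for
# multiclass emotion recognition (illustrative only; not the paper's code).
import numpy as np
from sklearn.svm import SVC


class BinaryNode:
    """Internal node: a binary SVM deciding between two groups of emotion classes."""

    def __init__(self, left_classes, right_classes):
        self.left_classes = set(left_classes)    # classes routed toward the left child
        self.right_classes = set(right_classes)  # classes routed toward the right child
        self.clf = SVC(kernel="rbf")
        self.left = None    # child BinaryNode or a single class label (leaf)
        self.right = None

    def fit(self, X, y):
        # Train only on utterances whose label belongs to this node's class groups.
        mask = np.isin(y, list(self.left_classes | self.right_classes))
        Xn, yn = X[mask], y[mask]
        target = np.isin(yn, list(self.right_classes)).astype(int)  # 0 = left, 1 = right
        self.clf.fit(Xn, target)

    def route(self, x):
        # Send a single feature vector to the left or right child.
        side = self.clf.predict(x.reshape(1, -1))[0]
        return self.right if side == 1 else self.left


def predict_emotion(root, x):
    """Walk the tree from the root until a leaf (a class label) is reached."""
    node = root
    while isinstance(node, BinaryNode):
        node = node.route(x)
    return node


# Hypothetical 4-class setup with random acoustic-style features,
# only to show how the pieces fit together.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))
y = rng.choice(["angry", "happy", "neutral", "sad"], size=200)

root = BinaryNode({"angry", "sad"}, {"happy", "neutral"})
left_child = BinaryNode({"angry"}, {"sad"})
right_child = BinaryNode({"happy"}, {"neutral"})
root.left, root.right = left_child, right_child
left_child.left, left_child.right = "angry", "sad"
right_child.left, right_child.right = "happy", "neutral"

for node in (root, left_child, right_child):
    node.fit(X, y)

print(predict_emotion(root, X[0]))

In the paper, the split solved at each level is chosen so that the easiest (most separable) binary decision is made first, which is what limits error propagation; in this sketch the grouping is fixed by hand purely for illustration.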

Highlights

  • We propose a hierarchical structure for multiclass emotion recognition tasks.
  • The structure is designed to first operate on the most differentiable binary task.
  • The framework shows promising accuracy across two emotional databases.
  • The structure won the 2009 Interspeech Emotion Challenge (classifier sub-challenge).

Copyright © 2011 Elsevier B.V.
