Modeling human affective postures: an information theoretic characterization of posture features
Authors: P. Ravindra De Silva, Nadia Bianchi-Berthouze
- Publisher: John Wiley and Sons
- Year: 2004
- Language: English
- File size: 226 KB
- Volume: 15
- Category: Article
- ISSN: 1546-4261
- DOI: 10.1002/cav.29
Free access, no registration required. For personal study only.
Abstract
One of the challenging issues in affective computing is to give a machine the ability to recognize the emotional state of a person. Efforts in that direction have mainly focused on facial and vocal cues. Gestures have recently been considered as well, but with less success. Our aim is to fill this gap by identifying and measuring the saliency of posture features that play a role in affective expression. As a case study, we collected affective gestures from human subjects using a motion capture system. We first described these gestures with spatial features, as suggested in studies on dance. Through standard statistical techniques, we verified that there was a statistically significant correlation between the emotion intended by the acting subjects and the emotion perceived by the observers. We used Discriminant Analysis to build affective posture predictive models and to measure the saliency of the proposed set of posture features in discriminating between 4 basic emotional states: angry, fear, happy, and sad. An information theoretic characterization of the models shows that the set of features discriminates well between emotions, and also that the models built outperform the human observers. Copyright © 2004 John Wiley & Sons, Ltd.
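To illustrate the two techniques the abstract names, here is a minimal sketch in Python. It is not the paper's actual model or data: the posture features are synthetic stand-ins (the feature count, class separation, and sample sizes are all assumptions), and the discriminant is a plain linear discriminant with a shared pooled covariance, fit with NumPy only. The information theoretic part is a mutual-information score computed from the confusion matrix between intended and predicted emotion, in bits, where log2(4) = 2 bits would mean perfect discrimination of the four classes.

```python
import numpy as np

rng = np.random.default_rng(0)
emotions = ["angry", "fear", "happy", "sad"]

# Hypothetical posture feature vectors (e.g. joint angles/distances);
# the dimensionality and class means are invented for this sketch.
n_features, n_per_class = 6, 50
true_means = rng.normal(0.0, 2.0, size=(4, n_features))
X = np.vstack([rng.normal(true_means[k], 1.0, size=(n_per_class, n_features))
               for k in range(4)])
y = np.repeat(np.arange(4), n_per_class)

# Linear discriminant analysis: class means + pooled within-class covariance.
class_means = np.array([X[y == k].mean(axis=0) for k in range(4)])
pooled_cov = sum(np.cov(X[y == k], rowvar=False) for k in range(4)) / 4
inv_cov = np.linalg.inv(pooled_cov)

def predict(x):
    # Linear discriminant score per class (Gaussian, shared covariance,
    # equal priors): pick the class with the highest score.
    scores = [m @ inv_cov @ x - 0.5 * (m @ inv_cov @ m) for m in class_means]
    return int(np.argmax(scores))

preds = np.array([predict(x) for x in X])
accuracy = float((preds == y).mean())

# Information theoretic characterization: mutual information (in bits)
# between the intended emotion and the model's prediction.
def mutual_information(conf):
    p = conf / conf.sum()
    px = p.sum(axis=1, keepdims=True)   # marginal over true labels
    py = p.sum(axis=0, keepdims=True)   # marginal over predictions
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

conf = np.zeros((4, 4))
for t, p_ in zip(y, preds):
    conf[t, p_] += 1
mi_bits = mutual_information(conf)  # upper bound here is log2(4) = 2 bits
```

The same mutual-information measure can be applied to the human observers' confusion matrix, which is how a model and human judges can be compared on a common scale.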