Mannocchi, I., Lamichhane, K., Carli, M., Battisti, F. (2022). HEROES: A Video-Based Human Emotion Recognition Database. In Proceedings of the European Workshop on Visual Information Processing (EUVIP) (pp. 1-6). IEEE. doi: 10.1109/EUVIP53989.2022.9922723.
HEROES: A Video-Based Human Emotion Recognition Database
Mannocchi I.; Lamichhane K.; Carli M.; Battisti F.
2022-01-01
Abstract
Recognizing emotions from body movements is a challenging task in affective computing. Most methods in the literature focus on analyzing speech features and facial expressions; however, body postures and motions can also help in identifying emotions. To this end, datasets have been designed to capture upper limb movements and hand gestures. The lower body (legs and feet) can also reveal information about the user's attitude. In this paper, a new video database for emotion recognition is presented. Sixteen non-professional actors express four emotions (happiness, interest, disgust, and boredom). The videos were acquired using four GoPro cameras to record whole-body movements in two different scenarios: observation and interaction with another person. Fourteen body joints are extracted from each frame of each video and used to derive features for emotion identification and recognition.
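The abstract's feature-derivation step can be illustrated with a minimal sketch. The paper does not specify which features are computed; the function below is a hypothetical example that, given per-frame (x, y) coordinates of the 14 extracted joints, derives simple kinematic descriptors (per-joint mean speed and speed variability), a common starting point for movement-based emotion recognition.

```python
import numpy as np

def motion_features(joints):
    """Derive simple kinematic features from per-frame joint positions.

    joints: array of shape (n_frames, 14, 2) holding the (x, y)
    coordinates of the 14 body joints extracted from each video frame.
    Returns a flat feature vector of 28 values: mean speed and speed
    standard deviation for each joint. This is an illustrative choice
    of features, not the paper's actual feature set.
    """
    joints = np.asarray(joints, dtype=float)
    # Frame-to-frame displacement of each joint.
    velocity = np.diff(joints, axis=0)        # (n_frames - 1, 14, 2)
    # Euclidean speed of each joint in each transition.
    speed = np.linalg.norm(velocity, axis=2)  # (n_frames - 1, 14)
    return np.concatenate([speed.mean(axis=0), speed.std(axis=0)])
```

A vector of this kind, computed per clip, could then feed any standard classifier over the four emotion classes.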