Using Body Posture, Movement and Human Activity to Predict Emotions
Numerous studies on emotion recognition have found that facial cues are reliable for distinguishing one emotional expression from another. This is useful in highly controlled environments where the face can easily be tracked. However, this is often not the case in real-world scenarios; thus, alternative modalities need to be investigated. Bodily expression and human activity can provide additional information for recognizing an emotional expression, which may help build more robust models. Previous work on body movement has relied on acted emotional expressions. To overcome this limitation, this work collected body movement data as human subjects played a game. Subjects were free to move within their assigned space while data were recorded. SVM and Random Forest models predicted valence, arousal, and intensity using body posture, movement, and activity. Models that reached a validation score of 70% and a train-test accuracy of 60% were analyzed. The best-performing model reached a train-test accuracy of 80.2%.
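As a rough illustration of the modeling setup described above (not the authors' actual code), the sketch below trains SVM and Random Forest classifiers on placeholder posture/movement feature vectors and reports cross-validation and train-test accuracy; the feature extraction, labels, and data shapes are assumptions.

```python
# Illustrative sketch: SVM and Random Forest classifiers predicting an
# emotion-related label (e.g., valence) from body posture/movement features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: rows are recordings, columns are hypothetical
# posture/movement/activity features; y is a discrete valence label.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))
y = rng.integers(0, 3, size=200)  # e.g., low / neutral / high valence

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

models = {
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

for name, model in models.items():
    # Validation score via 5-fold cross-validation on the training split,
    # then a held-out train-test accuracy, mirroring the two metrics reported.
    cv_scores = cross_val_score(model, X_train, y_train, cv=5)
    model.fit(X_train, y_train)
    test_acc = model.score(X_test, y_test)
    print(f"{name}: validation accuracy {cv_scores.mean():.2f}, "
          f"train-test accuracy {test_acc:.2f}")
```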