Publications

Automated Recognition of Emotional States of Horses from Facial Expressions

Marcelo Feighelstein, Claire Ricci-Bonot, Hana Hasan, Hallel Weinberg, Tidhar Rettig, Maya Segal, Tomer Distelfeld, Ilan Shimshoni, Daniel S Mills, Anna Zamansky

This research introduces the first automated recognition of emotional states in horses from facial expressions, moving beyond the typical focus on pain in animal affective computing. The study developed two AI models: a deep learning model that analyzes video footage directly and a machine learning model that uses EquiFACS annotations. The video-based model achieved the higher accuracy of the two, 76%, in distinguishing four emotional states (baseline, positive anticipation, disappointment, and frustration), suggesting that raw video may contain more nuanced information than manual coding captures. While the deep learning approach offers superior performance, the EquiFACS-based model provides greater interpretability through its decision tree structure, offering insight into the specific facial cues driving its classifications.

