This research introduces the first automated recognition of emotional states in horses from facial expressions, moving beyond the typical focus on pain in animal affective computing. The study, conducted by Marcelo Feighelstein, Claire Ricci-Bonot, and their colleagues, developed two AI models: a deep learning model that analyzes video footage and a machine learning model that uses EquiFACS annotations. The video-based model achieved the higher accuracy of the two, 76%, in distinguishing four emotional states (baseline, positive anticipation, disappointment, and frustration), suggesting that raw video data may contain more nuanced information than manual coding captures. While the deep learning approach offers superior performance, the EquiFACS-based model provides greater interpretability through its decision-tree structure, revealing which specific facial cues drive its classifications.
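To make the interpretability point concrete, the sketch below trains a shallow decision tree on EquiFACS-style binary action-unit features and prints its rules. This is a minimal illustration assuming scikit-learn; the feature names, synthetic data, and hyperparameters are hypothetical stand-ins, not the authors' actual dataset or pipeline.

```python
# Minimal sketch: decision tree over EquiFACS-style binary features.
# Feature names and data are illustrative assumptions, not the study's.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)

# Hypothetical action-unit presence features (one column per AU).
FEATURES = ["AU101_inner_brow_raiser", "AU145_blink",
            "AD1_eye_white_increase", "EAD101_ears_forward",
            "AD38_nostril_dilator", "AU17_chin_raiser"]
CLASSES = ["baseline", "anticipation", "disappointment", "frustration"]

# Synthetic stand-in data: one row per annotated video clip.
X = rng.integers(0, 2, size=(400, len(FEATURES)))
y = rng.integers(0, len(CLASSES), size=400)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# A shallow tree keeps the model interpretable: each split names an AU.
clf = DecisionTreeClassifier(max_depth=4, random_state=0)
clf.fit(X_train, y_train)

print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
# The printed rules show which facial cues drive each classification.
print(export_text(clf, feature_names=FEATURES))
```

The appeal of this structure is that every prediction can be traced to a readable chain of action-unit checks, which is the kind of transparency the video-based deep model trades away for accuracy.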
Non-Invasive Computer Vision-Based Fruit Fly Larvae Differentiation: Ceratitis capitata and Bactrocera zonata
This paper proposes a novel, non-invasive method using computer vision to differentiate between the larvae of two fruit fly species, Ceratitis capitata and Bactrocera zonata.