This study explores the automated recognition of emotional states, specifically positive anticipation and frustration, from canine facial expressions. Using a dataset of Labrador Retrievers recorded in a controlled experimental setting, the researchers compared two distinct approaches: a DogFACS-based method and a deep learning technique. The DogFACS-based approach, trained on either 11 or 39 DogFACS variables, achieved over 71% accuracy with the full set of 39 variables. The deep learning approach, particularly with a DINO-ViT backbone, outperformed it with over 89% accuracy. The work also contributes to the explainability of AI models in animal emotion recognition: decision trees make the DogFACS-based decisions inspectable, while visual heatmaps show which image regions drive the deep learning predictions, potentially revealing subtle cues not visible to the human eye.
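To make the interpretable side of this comparison concrete, the sketch below (not the authors' code) trains a scikit-learn decision tree on a stand-in matrix of 39 DogFACS variables with a binary frustration-vs-anticipation label, then prints the learned rules, which is the kind of explainable output described above. All data, feature names, and hyperparameters are hypothetical placeholders.

```python
# Minimal illustrative sketch: decision tree over DogFACS variables.
# Data, feature names, and settings are assumptions, not the study's own.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)

# Hypothetical stand-in data: rows are video clips, columns are 39 DogFACS
# variables (e.g. action-unit frequencies); labels are 0 = frustration,
# 1 = positive anticipation.
X = rng.random((200, 39))
y = rng.integers(0, 2, size=200)

# Shallow tree keeps the model small enough to read as a set of rules.
clf = DecisionTreeClassifier(max_depth=4, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")

# Printing the fitted tree as if-then rules is one way to obtain the
# decision-tree explainability attributed to the DogFACS-based approach.
clf.fit(X, y)
print(export_text(clf, feature_names=[f"DogFACS_var_{i}" for i in range(39)]))
```

The deep learning side would be analogous in structure: a frozen DINO-ViT feature extractor feeding a classification head, with saliency-style heatmaps used afterward to visualize which facial regions influenced each prediction.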
Non-Invasive Computer Vision-Based Fruit Fly Larvae Differentiation: Ceratitis capitata and Bactrocera zonata
This paper proposes a novel, non-invasive method using computer vision to differentiate the larvae of the fruit flies Ceratitis capitata and Bactrocera zonata.