Scientists are training artificial intelligence to recognize animal emotions.
Several research teams are developing systems that could improve disease diagnosis and help assess animal welfare, according to a news report in the journal Science.
Researchers from the University of the West of England in Bristol and Scotland’s Rural College (SRUC) are creating Intellipig, an AI system that analyzes pig facial expressions and alerts farmers to signs of pain, illness, or stress.
Meanwhile, a team from the University of Haifa in Israel is working on an AI model that detects emotions in animals whose facial movements resemble those of humans. Their study shows that humans share 38% of their facial movements with dogs, 34% with cats, and 47% with primates and horses.
The researchers plan to launch an AI-based app that allows cat owners to scan their pet’s face within 30 seconds and receive precise feedback, such as: “Significant tension around the mouth detected; moderate pain level.”
This team had previously developed an AI system for facial recognition in animals, which helps locate lost pets by comparing their photos with shelter databases.
Animal Emotions
Scientists have long understood that, much like humans, animals express their emotions through their faces. In The Expression of the Emotions in Man and Animals, published in 1872, Charles Darwin proposed that facial expressions function as a kind of “shared language” among mammals—an ability rooted deep in evolutionary history.
Darwin’s theory rested largely on anatomical similarities, explains Bridget Waller, a psychologist at Nottingham Trent University. Mammals, including humans, possess many of the same facial muscles for producing expressions, which is what underlies the overlap in facial movements the Haifa team quantified.
However, despite these anatomical parallels, interpreting animal expressions isn’t as straightforward as reading human faces. Researchers studying animal communication typically rely on contextual clues to determine what an animal is experiencing. Pain is a prime example: a horse recovering from castration or a limping sheep with an infected hoof is almost certainly in pain.
To better understand expressions of discomfort, researchers may apply mild stimuli, such as tightening a blood pressure cuff around a leg or dabbing chili extract onto the skin. Conversely, when an animal receives pain relief, visible signs of discomfort often diminish, confirming that those expressions were tied to pain.
Stress is another emotion that can be triggered in many species with minimal intervention. Simply taking a horse or a cat on a short drive or briefly separating them from their companions can induce minor anxiety.
To observe stress in young sows, SRUC researcher Emma Baxter introduces older, dominant pigs, whose intimidating behavior prompts clear stress signals: the young animals vocalize and defecate, and their cortisol levels surge, all confirming that they are experiencing stress.
To systematically measure these responses, scientists have spent thousands of hours meticulously observing animal faces in various painful or stressful conditions, comparing them to those of animals that are presumably pain- or stress-free.
Their efforts have led to the creation of “grimace scales” for different species—tools that assess an animal’s pain or stress level based on specific facial muscle movements.
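To make the idea concrete, a grimace scale is essentially a short checklist of facial action units, each rated on a small ordinal scale by an observer and summed into an overall score. The sketch below is illustrative only: the action-unit names are loosely modeled on published horse grimace scales, and the alert threshold is an invented example rather than a validated clinical cutoff.

```python
# Illustrative grimace-scale scoring: an observer rates each facial action
# unit 0 (not present), 1 (moderately present), or 2 (obviously present),
# and the ratings are summed into a pain score. Unit names are loosely
# based on published horse grimace scales; the alert threshold is an
# invented example, not a validated clinical cutoff.
FACIAL_ACTION_UNITS = [
    "ears_stiffly_backwards",
    "orbital_tightening",
    "tension_above_eye_area",
    "strained_chewing_muscles",
    "strained_mouth_and_chin",
    "strained_nostrils",
]

def grimace_score(ratings: dict[str, int]) -> int:
    """Sum per-unit ratings (each 0-2) into an overall score; unrated units count as 0."""
    total = 0
    for unit in FACIAL_ACTION_UNITS:
        rating = ratings.get(unit, 0)
        if rating not in (0, 1, 2):
            raise ValueError(f"{unit}: rating must be 0, 1, or 2")
        total += rating
    return total

# Example observation of a single horse.
observation = {"ears_stiffly_backwards": 2, "orbital_tightening": 1}
score = grimace_score(observation)  # 3 out of a possible 12
print(f"Grimace score: {score}/12, "
      f"{'possible pain, check the animal' if score >= 4 else 'no alert'}")
```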
On these scales, a horse that rotates its ears outward while forming “worry wrinkles” above its eyes is more likely to be in pain than one with a relaxed expression, explains Pia Haubro Andersen, a horse surgeon at the Swedish University of Agricultural Sciences. Stress in horses shows up in similar ear and wrinkle patterns, with subtle differences such as a protruding tongue, she adds.
In the Netherlands, scientists have created a similar AI application that scans horses’ faces and bodies to evaluate their pain levels. They envision AI playing a role in equestrian competitions, rewarding riders whose horses appear happy and well cared for and promoting fairness and animal welfare in the sport.
Experts have become adept at manually coding these facial expressions, a process that could theoretically be used for routine welfare monitoring. However, as Andersen points out, it is an incredibly time-consuming task. A trained human coder takes around 100 seconds to analyze the facial muscles in a single image, or two to three hours to assess just 30 seconds of video footage.
AI, by contrast, can perform the same analysis almost instantly—but first, it must be trained.
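In the simplest case, that training means fitting an image classifier on frames human coders have already scored on a grimace scale. The sketch below is a generic illustration using PyTorch and a pretrained ResNet; the folder layout, the three pain classes, and the hyperparameters are all assumptions made for the example, not details of any system described above.

```python
# Generic sketch: fine-tune a pretrained CNN to predict a coarse pain
# label ("no_pain" / "moderate" / "severe") from face images that human
# experts have scored on a grimace scale. The folder layout, class names,
# and hyperparameters are assumptions for illustration; no team above has
# published this exact pipeline.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    # Standard ImageNet normalization, matching the pretrained weights.
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Expects labeled images under horse_faces/train/<class_name>/ (assumed layout).
train_set = datasets.ImageFolder("horse_faces/train", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Start from ImageNet weights and swap the final layer for a head
# with one output per pain class.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

# Once trained, scoring a single frame takes milliseconds, versus the
# roughly 100 seconds a human coder needs per image.
```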