We humans have many ways to communicate with each other, and no doubt the simplest is our facial expressions. But are we the only ones with a wide range of emotions? A recently published study claims that not only humans but mice, too, are capable of displaying different sorts of facial expressions.
Neuroscientist Nadine Gogolla of the Max Planck Institute of Neurobiology in Martinsried, Germany, has used an artificial-intelligence-powered system to decode the facial expressions of laboratory mice.
“Mice exhibit facial expressions that are specific to the underlying emotions,” said Dr. Nadine Gogolla. She added that the findings are important because they offer researchers new ways to measure the intensity of emotional responses, which could help them probe how emotions arise in the brain.
Nadine Gogolla and her team made this discovery by recording the faces of lab mice while they were exposed to different stimuli, such as sweet flavors and electric shocks. With the help of AI tools, the researchers reliably spotted the mice’s expressions of joy, fear, pain, and other basic emotions.
The researchers said that sugar water evoked pleasure, a shock to the tail triggered pain, bitter quinine water created disgust, an injection of lithium chloride evoked a nauseated malaise, and a place where shocks had previously been delivered sparked fear.
For each setup, high-speed video cameras captured subtle movements in the mice’s ears, noses, whiskers, and other parts of the face. Once the facial expressions were recorded, the team turned to computer vision techniques to compare the images and quantify the differences between them.
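The idea of quantifying how much a face changes between frames can be illustrated with a minimal sketch. This is not the authors’ actual pipeline; it simply scores each video frame by its mean absolute pixel difference from a neutral baseline face, using NumPy:

```python
import numpy as np

def frame_change_scores(frames, baseline):
    """Mean absolute pixel difference of each frame from a neutral baseline.

    frames:   array of shape (n_frames, height, width), grayscale values
    baseline: array of shape (height, width), e.g. an averaged neutral face
    """
    diffs = np.abs(frames.astype(float) - baseline.astype(float))
    return diffs.reshape(len(frames), -1).mean(axis=1)

# Toy example: three tiny 2x2 "frames" compared against a flat baseline
baseline = np.zeros((2, 2))
frames = np.array([
    [[0, 0], [0, 0]],    # identical to baseline -> score 0.0
    [[4, 4], [4, 4]],    # uniform change      -> score 4.0
    [[0, 8], [0, 0]],    # localized change    -> score 2.0
])
print(frame_change_scores(frames, baseline))  # [0. 4. 2.]
```

A rising score marks the frames where the stimulus visibly moved the ears, nose, or whiskers; the real study would work on much larger images and more refined features.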
After that, the researchers used machine-learning software to recognize the mice’s facial expressions, which later allowed them to link specific expressions to the activity of neurons in the mouse brain.
Observers can generally see that something is happening on a mouse’s face, Gogolla says. But translating those subtle cues into emotions is really hard, “especially for an untrained human being,” she says.
The software, however, revealed clear differences between images of mice taken before and after each trigger was applied, as well as between images captured during different types of trigger: the facial expression linked to a zapped tail was different from the one that appeared when a treat was given.
For example, on the face of a mouse drinking sweet water, and presumably happy about it, the ears move forward and fold back toward the body, and the nose moves down toward the mouth. A mouse tasting bitter quinine pulls its ears straight back, and its nose curls slightly backward, too.
The activity of nerve cells in the mice’s brains also changed with distinct emotions, other analyses showed. These cells reside in the insular cortex, a deeply buried spot known to be involved in human emotions, too.
By prodding these cells to fire signals, the researchers could prompt the mice to display certain facial expressions. These connections between brain activity and facial expressions may lead to insights about the neural basis of emotions, and about what goes awry in disorders such as anxiety, the researchers suggest.
The system was first fed facial expressions from the mice, each labeled with the corresponding emotion. When it was subsequently presented with unlabeled facial images, it predicted the emotions captured in them with more than 90% accuracy. “The expressions are very similar between mice,” said Gogolla.
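The train-on-labeled, predict-on-unlabeled loop described above can be sketched with a toy nearest-centroid classifier. This is a stand-in, not the study’s actual method: the feature vectors, emotion labels, and cluster centers below are all hypothetical, and only two of the article’s emotions are used for brevity.

```python
import numpy as np

def train_centroids(features, labels):
    """Training: average the labeled face descriptors for each emotion."""
    return {e: features[labels == e].mean(axis=0) for e in np.unique(labels)}

def predict(centroids, x):
    """Prediction: assign a descriptor to the nearest emotion centroid."""
    return min(centroids, key=lambda e: np.linalg.norm(x - centroids[e]))

# Hypothetical 2-D descriptors standing in for real facial-feature vectors
rng = np.random.default_rng(0)
centers = {"pleasure": [0, 0], "disgust": [5, 5]}
feats, labs = [], []
for emotion, c in centers.items():
    feats.append(rng.normal(c, 0.5, size=(20, 2)))
    labs += [emotion] * 20
feats, labs = np.vstack(feats), np.array(labs)

model = train_centroids(feats, labs)
preds = [predict(model, x) for x in feats]
accuracy = np.mean(np.array(preds) == labs)
print(f"accuracy: {accuracy:.0%}")
```

With well-separated toy clusters the classifier scores perfectly; the reported 90%-plus accuracy on real mouse faces reflects a far harder, higher-dimensional version of the same idea.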
The team found that expressions could vary in duration and onset. Mice pulled a stronger expression, relative to the average, when they were given a sweet sucrose drink while thirsty compared with when they were quenched. “It is not just a sucrose face we are eliciting,” said Gogolla.
“The pleasure is higher when you are actually thirsty.”
Link to paper: