Of the more than eight million species that live on the planet, humans can understand the language of only one: their own. After decades of searching for ways to communicate with animals, several scientists have turned to artificial intelligence to detect patterns in their sounds and behavior, trying to understand their intentions and to interact with them. However, despite promising advances across several research projects, creating translators for elephants, dogs or whales poses multiple challenges.
Eva Meijer, author of Animal Languages: The Secret Conversations of the Living World, explains that animals talk all the time – both among themselves and in multispecies environments – in order to survive, make friends, discuss social rules and even flirt. The scientific evidence, she points out, shows that they have languages, cultures and complex internal lives, that they fall in love and mourn their partners.
As she explains in her book, dolphins call each other by their names, prairie dogs describe intruders in great detail, bats love to gossip, and grammatical structures can be found in the songs of some birds. Wild chimpanzees understand each other through dozens of different gestures, while bees dance to communicate and can even recognize and remember human faces.
Studying the language and behavior of animals is not only important to learn how they communicate with each other, but also to find out how they communicate with us. Some, like dogs, birds and horses, are even capable of learning words: according to a study published in the journal Behavioral Processes, a border collie can memorize more than a thousand. In addition, some animals respond to tone of voice and body language, explains Melody Jackson, professor at the Georgia Institute of Technology and an expert on dog-computer interaction: soft tones convey friendship, while harsh or strong tones can be threatening. Touch can also be used as a reward with dogs and horses.
Artificial intelligence to “talk” to animals
Many scientists have turned to artificial intelligence and other technologies in order to understand and improve this communication. Clara Mancini, a researcher in animal-computer interaction at the Open University in the United Kingdom, explains that sensors can be used to record, analyze and interpret many different animal signals, including those that would be difficult for human ears to detect.
The promoters of the Wild Dolphin Project have spent more than 30 years collecting a database of dolphin behaviors, as well as their sounds, which fall into three categories: whistles, used for long-distance communication and as contact calls between mothers and calves when they are separated; clicks, used for orientation and navigation; and what are known as burst pulses – tightly spaced packets of clicks used during close-proximity social behavior such as fighting. The goal of this project is to create machine learning algorithms that find patterns in these sounds and develop systems that can generate “words” to interact with dolphins in wild environments.
There are many similar projects. The researchers of Elephant Voices have created an online ethogram of vocalizations and behaviors from elephants in Kenya and Mozambique, where they include examples like the trumpet sounds they usually make when they come out of the water after playing. Another team has developed software that can automatically detect, analyze and categorize the ultrasonic vocalizations of rodents; it is called DeepSqueak and it has also been used on lemurs, whales and other marine animals. Some scientists have developed systems to detect distress calls made by chickens, while others are trying to understand dogs using machine learning to determine if their whining is expressing sadness or happiness.
The challenges of creating “translators”
Although some researchers have identified the structure and part of the meaning of the vocalizations of some animals, creating “translators” entails multiple challenges. First of all, understanding the semantic and emotional meaning of what they communicate is a highly complex task, as Mancini points out: we are not in their minds, and we do not have the same physical, sensory and cognitive characteristics through which they experience the world. If these differences and complexities are not taken into account, their messages can be trivialized – and misinterpreted.
Added to this is the fact that current technologies require environmental or portable sensors that are not always practical. Sometimes the appropriate cameras are not available, and filming an animal in motion well enough for video analysis can be very difficult. In addition, interpreting their communication based solely on vocal expression leaves out other channels that could be relevant in understanding what they mean, such as their behavior.
Animals also communicate through their actions, their gestures and even their facial expressions. For example, if two groups of elephants get together and fold their ears while rapidly waving them, they are expressing a warm greeting that is part of their welcoming ceremonies, according to Elephant Voices. And sheep can use facial expressions to express pain. In fact, computer scientists from the University of Cambridge have developed an artificial intelligence system that analyzes their faces to detect when they are hurting.
Some researchers study the postures and behaviors of dogs to predict how they are feeling, sometimes resorting to biometrics to try to pinpoint changes in heart rate, breathing and temperature that could provide clues to their emotions, Jackson says. Some of these canine interpretation systems use body sensors to measure position and movement, and others use cameras to record and analyze the videos.
Vests for training dogs and robotic bees
Being able to communicate with animals can be useful in multiple contexts. Jackson’s team, for example, has developed technology that allows a human handler to guide a search-and-rescue dog remotely using vibrating motors attached to a vest. They have also created portable computers that allow a service dog to contact emergency services with a GPS location if its owner is having a seizure.
Humans may never be able to sing like a whale or buzz like a bee, but perhaps machines can. In fact, a team of German researchers has made a biomimetic robot called RoboBee that imitates the dances that bees use to communicate, and the results have been successful: with this robot, they claim to have managed to recruit real bees and guide them to specific locations.
The advances are quite promising. However, it is still too early to predict whether animal translators will ever exist. Jackson believes that as computers and sensors become smaller and more capable, tiny implantable systems will be developed that provide more clues about animal behavior and one day achieve real two-way communication.