It sounds like the plot of a new Disney movie, but experts predict that AI will allow people to communicate with household pets and even wild animals.
Researchers around the world are using ‘digital bioacoustics’ (small, portable digital recorders) to capture the sounds and behaviors of animals that are too quiet or subtle for humans to pick up.
These recordings are compiled into databases used to train artificial intelligence to decipher these subtle communications and translate them into something more understandable for us, almost like a ‘ChatGPT for animals’.
Projects like the Earth Species Project expect progress in the next 12 to 36 months.
A researcher hopes to unravel the language of dogs
Founded in 2017, the AI nonprofit aims to record, understand and respond to animals, from cats and dogs to more unusual species like whales and crows.
The Earth Species Project’s current experiments include mapping the vocal repertoires of crows and generating new vocalizations that the birds can understand.
Aza Raskin, one of the co-founders of the Earth Species Project, believes that generating animal vocalizations could take as little as a year.
Raskin told Google: ‘Can we make novel, generative animal vocalizations? We think that, in the next 12 to 36 months, we will probably be able to do this for animal communication.
‘Could you imagine if we could build a synthetic whale or crow that spoke whale or crow in a way that they couldn’t tell they weren’t talking to one of their own?
‘The plot twist is that we may be able to engage in a conversation before we understand what we’re saying.’
Below are some of the other projects aimed at achieving intelligible communication between people and animals:
Could AI help us understand what cats say?
Cat got your tongue?
Artificial intelligence could finally unravel a mystery that has haunted the human race for centuries: what do cats really think?
Researchers at the University of Lincoln are using AI to categorize and understand cats’ expressions.
Professor Daniel Mills said: “We could use AI to teach us a lot about what animals are trying to tell us.”
AI can learn to identify features like the position of cats’ ears, which could help classify and understand the hundreds of expressions cats use to communicate.
Similarly, a new AI model aims to translate dogs’ facial expressions and barks.
Its creator, Con Slobodchikoff, author of Chasing Doctor Dolittle: Learning the Language of Animals, told Scientific American that when we understand animals, we can reveal surprising facts.
“Animals have thoughts, hopes and perhaps dreams of their own.”
Bats have a much more complex language than people thought: they have names, they argue over food, and mother bats use ‘baby talk’ when speaking to their pups.
That’s the conclusion of a pioneering artificial intelligence study that used a voice recognition program to analyze 15,000 bat calls, with an algorithm that correlated the sounds with videos of what the bats were doing.
Yossi Yovel, from Tel Aviv University, told the BBC: “I’ve always dreamed of a Doolittle machine that would allow me to talk to animals. Specifically, I’m interested in that vocal communication.
‘We teach the computer how to differentiate between different sounds and how to recognize what each sound means when it hears it.
‘Eventually, the computer will be able to speak the language and understand what the bats are saying to each other.’
Researchers now know that bats “argue” over food and that baby bats repeat what their mother “says” to learn language.
‘Deep learning’ is capable of deciphering bat language (which is largely ultrasonic and much faster than human speech); humans can’t hear it, but computers can.
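To make that last point concrete, here is a minimal, purely illustrative sketch (not the researchers’ actual pipeline, and with made-up numbers): a computer can ‘hear’ ultrasound simply because a recording made at a high enough sample rate preserves frequencies far above the roughly 20 kHz ceiling of human hearing. Even a crude frequency estimate from zero crossings is enough to flag a call as ultrasonic.

```python
import math

HUMAN_HEARING_LIMIT_HZ = 20_000  # approximate upper limit of human hearing

def estimate_frequency(samples, sample_rate):
    """Crude pitch estimate for a single tone: count zero crossings."""
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    duration = len(samples) / sample_rate  # seconds of audio
    return crossings / (2 * duration)     # a full cycle crosses zero twice

def is_ultrasonic(samples, sample_rate):
    """True if the estimated pitch is above what humans can hear."""
    return estimate_frequency(samples, sample_rate) > HUMAN_HEARING_LIMIT_HZ

# A synthetic 40 kHz 'bat call', 10 ms long, sampled at 250 kHz
# (hypothetical values chosen only for the illustration).
rate = 250_000
call = [math.sin(2 * math.pi * 40_000 * t / rate) for t in range(rate // 100)]
print(is_ultrasonic(call, rate))  # True: 40 kHz is inaudible to humans
```

Real systems use spectrograms and trained classifiers rather than zero-crossing counts, but the principle is the same: the information is in the recording, even when our ears cannot reach it.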
Yovel is skeptical that a ‘decoder’ that can instantly translate bat calls will arrive in his lifetime, but his goal now is to understand bats’ long-term social interactions.
Clicking with the whales
Microphones placed on buoys and robotic fish try to unravel one of the most famous “voices” in the animal kingdom: the song of whales.
Sperm whales are the world’s largest toothed predators and locate their food using clicks, but they also use shorter series of clicks, called ‘codas’, to communicate with each other.
The CETI Project team is placing microphones on whales to capture huge amounts of data, with the goal of using machine learning to unravel what these huge animals are saying.
A project aims to unravel the ‘codas’ of sperm whales (Getty)
To attach the microphones, the team uses a ten-meter pole.
The AI team can already predict whale codas (click sequences) with up to 95 percent accuracy, and now hopes to gather much more data to establish further patterns and discover what the whales are saying.
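As a hypothetical sketch of what a ‘coda’ looks like to a computer (not the CETI Project’s actual models, and with invented click times): a coda is essentially a rhythm, so a first-pass representation is simply the list of gaps between successive clicks, normalized so that codas are compared by pattern rather than tempo.

```python
def inter_click_intervals(click_times):
    """Gaps (in seconds) between successive clicks of one coda."""
    return [b - a for a, b in zip(click_times, click_times[1:])]

def rhythm_signature(click_times):
    """Normalize the gaps so codas are compared by rhythm, not tempo."""
    gaps = inter_click_intervals(click_times)
    total = sum(gaps)
    return [round(g / total, 3) for g in gaps]

# Two hypothetical five-click codas recorded at different speeds.
coda_a = [0.00, 0.20, 0.40, 0.60, 1.00]
coda_b = [0.00, 0.10, 0.20, 0.30, 0.50]
print(rhythm_signature(coda_a))  # [0.2, 0.2, 0.2, 0.4]
print(rhythm_signature(coda_b))  # same rhythm at double speed
```

A learning system can then group recordings whose signatures match, which is one simple way patterns like those described above could begin to emerge from raw click data.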