Mr Waffles is a small dog with a big attitude. The Yorkshire terrier, who has more than a million followers on TikTok, is one of a growing number of dogs who communicate with their owners by pressing electronic buttons.
The paw-sized devices can be programmed to say different words, such as ‘walk’ or ‘hungry’, but they can also be loaded with profanities.
So, when Mr Waffles ‘says’ he’s hungry and his owner disagrees, the dog replies by pressing “that’s b******t!” Laughter erupts and Mr Waffles clocks up another thousand likes.
Don’t be fooled, though. Button pressing is a serious business. Mr Waffles’ antics aside, researchers have found that many dogs do respond appropriately to the words in their buttons.
‘Outside’ is met with a look to the door. ‘Play’ prompts tail wags. Critically, it doesn’t matter who presses the button – an owner or a stranger – or whether the word is spoken aloud instead. The dog still seems to ‘get it.’
According to Dr Federico Rossano from the University of California San Diego in the US, who conducted the research, this “shows that words matter to dogs, and that they respond to the words themselves, not just to associated cues.”
It’s impressive, but while the study confirms that dogs can recognise and respond to verbal cues, critics argue that it doesn’t tell us what those words mean to the dog.
If we really want to be able to communicate with animals, we need to do it on their terms. From the squeaks of bats to the whistles of dolphins, this means finding ways to decode animal communication in all its many varied forms.
Luckily for us, scientists have been on the case for decades, studying the behaviour and sounds of animals in the wild. Now, with the help of a rapidly growing toolkit of technology, exciting discoveries are being made.
There’s even an annual award – aptly titled the Coller-Dolittle Prize – for significant steps towards interspecies communication.
As a result, we find ourselves on the cusp not only of decoding animal communication, but of being able to converse meaningfully with other species.
Here’s a little taste of what we have learned so far…
Whistle while you work
Just as people have accents, many animals have regional dialects.
Orcas, humpback whales, gibbons, hyraxes and many songbirds sing region-specific songs, which help them to identify individuals from their own population and can sometimes play a part in attracting mates.
We also know that certain animals use ‘names’. Elephants use harmonically rich, low-frequency calls to address key members of their group.
Marmosets use distinct ‘phee’ calls to communicate with specific individuals, while bottlenose dolphins address each other via ‘signature whistles’ which are created during infancy and remembered across decades.
The use of names was once thought to be unique to humans. It clearly isn’t, but we shouldn’t be surprised. These are all social species that cooperate with each other to find food, raise their young, and make it through the day.
Being able to communicate with specific individuals helps them to do this, but this is just the tip of the animal communication iceberg.

Dr Laela Sayigh, from the Woods Hole Oceanographic Institution in Massachusetts, US, is part of a team that has spent decades studying the resident bottlenose dolphins of Florida’s Sarasota Bay.
From time to time, the animals are caught and given a quick health check. While this happens, hydrophones are attached to their foreheads, and their vocalisations are recorded.
As a result, the team have logged more than 250 distinct signature whistles.
The dolphins use these signature whistles to broadcast their identity and to call to each other.
Poignantly, when mother dolphins call to their calves, they modify their signature whistles, exaggerating the frequency range – the highs get a bit higher, and the lows get a bit lower.
This is similar to how human mothers talk to their babies. “It’s a dolphin version of motherese,” says Sayigh. It is thought to help the youngsters learn the sounds they need for adult life.
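
For the technically minded, the trick is easy to picture as a simple transformation of the whistle’s pitch contour. Here’s a toy Python sketch (the numbers are invented, not real dolphin data) that stretches a contour about its mean, so the highs rise and the lows fall while the whistle’s overall shape is preserved:

```python
import numpy as np

# Toy pitch contour for a whistle, in kHz (values invented for illustration)
contour = np.array([8.0, 10.0, 12.0, 9.0, 7.5])

# Stretch the contour about its mean: highs rise, lows fall,
# but the shape of the whistle stays the same
stretched = contour.mean() + 1.3 * (contour - contour.mean())

print(contour.max(), "->", round(stretched.max(), 2))   # 12.0 -> 12.81
print(contour.min(), "->", round(stretched.min(), 2))   # 7.5 -> 6.96
```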

Signature whistles are important, but in the last few years, Sayigh has learned that the dolphins do far more than just shout names.
Trawling through more than a thousand hours of vocalisations, Sayigh’s team realised that around half the whistles made by free-swimming dolphins were not signature whistles.
They identified 20 new whistles, each used by multiple dolphins. This suggested that the whistles were being used for communication, but what were they communicating?
To find out, Sayigh turned to a tried-and-tested method for decoding animal communication.
It turns out that researchers have, in fact, been ‘talking’ to animals for decades, in the form of playback experiments, where recordings of vocalisations are broadcast back to animals to see what they do.
In this case, the recorded sounds were broadcast underwater while drones monitored the dolphins’ behaviour from above.
One of the whistles, dubbed Type A, made the dolphins swim away, while another, dubbed Type B, made them swim closer to investigate. “So, we think the first one acts as an alarm call, and the second one acts as a query or a ‘what’ question,” says Sayigh.
The findings suggest that dolphin communication is much richer than previously thought. Dolphins may possess a language-like communication system, with units of sound that have shared, context-specific meanings.
“Our work shows that these whistles could potentially function like words,” says Sayigh.
Turning data into dialogue
In May 2025, Sayigh’s team won the inaugural Coller-Dolittle Prize for accelerating progress towards interspecies two-way communication.
The runners-up included a group from Paris, who discovered that cuttlefish communicate by waving their arms in expressive patterns, and a team from Germany who developed an artificial intelligence (AI) model that can generate and analyse nightingale songs.
Sayigh’s team received $100,000 (approx. £73,700) to put towards their work, but there is a bigger cash prize of $500,000 (approx. £368,600) available to anyone who can devise an algorithm that enables humans and animals to communicate directly, without the animal realising that it is talking to a person.
The challenge is inspired by the Turing test, which involves a chatty computer fooling a person into thinking they are talking to another human.
AI models have now become the latest tool in the race to decipher animal communication.
Just as the Rosetta Stone (an artefact inscribed with the same text in three different scripts) helped linguists to understand Egyptian hieroglyphs, so computer scientists think that AI can be used to understand – and maybe even generate – animal communication.

A key problem for biologists is that although they may have big datasets of animal vocalisations, they don’t always have the tools or time to analyse them.
It’s also easy to miss potential communication signals when you don’t know what you’re looking for, and even harder to work out what those signals mean.
This is where AI can help.
AI uses algorithms to munch data and spot patterns. It’s the same technology that enables ChatGPT to engage in human-like conversations and pass the Turing test. “AI is definitely making things easier,” says animal communication expert Yossi Yovel.
Yovel has been using AI to decode the squeaks of Egyptian fruit bats. Fruit bats are another chatty, social species. They live in colonies ranging from a few dozen to a few thousand individuals.
One of these is at Tel Aviv University, in Israel, where Yovel works. During the day, the bats hang out in their ‘cave’ – in reality, a dark room with a tunnel that leads to the outside world. Then at dusk, they fly off into the big night sky.
“The bats are free to come and go as they choose,” says Yovel. At any one time, there are around 30 to 60 individuals, all fitted with tiny recording devices worn as collars.
The bats squabble. A lot.
Most of their calls are arguments over food, who gets the best spot to roost, and who does or doesn’t want to mate. The devices record every chirp, squeak and niggle, and when they are in the roost, the animals are filmed too.

Over a period of two and a half months, Yovel recorded 15,000 vocalisations. Then he fed the audio and video data into a machine learning package to see what it could do.
The AI did not disappoint. It was able to detect specific vocalisations and predict what sort of argument the bats were having when they made them.
This shows that the bats aren’t just making random noises. Their calls are context-specific, which suggests that they have meaning.
The calls also encoded information about the caller.
“We don’t think these are names,” says Yovel, “but we do think the sounds could be encoding something about social status.”
The AI spotted patterns that humans had not, cutting through the noise to find the salient signals.
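
To give a flavour of how such an analysis works, here’s a minimal Python sketch of the underlying recipe – turn each call into a vector of acoustic features, then ask a classifier to predict the context of the squabble. It illustrates the approach, not Yovel’s actual pipeline, and the data here are random stand-ins:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# In a real study, each call would be summarised as a vector of acoustic
# features (e.g. MFCCs) extracted from the recordings. Random stand-in
# vectors keep this sketch self-contained and runnable.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 20))                        # 600 'calls', 20 features each
contexts = rng.choice(["food", "roost", "mating"], size=600)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, contexts, cv=5)

# With real, context-specific calls, accuracy well above chance (~0.33 for
# three contexts) is the signal that the calls carry meaning; with random
# stand-ins like these, it should sit near chance.
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```
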
Lost in translation?
Thousands of miles away, in the Caribbean, another AI-aided project is generating excitement. Researchers working on Project CETI (the Cetacean Translation Initiative) are using the technology to translate the vocalisations of sperm whales.
Sperm whales live in tight-knit, female-led groups of seven or eight members. Individuals communicate via sequences of three to 40 clicks, known as codas.
Each of these groups belongs to a larger ‘vocal clan’, which uses codas that are unique to that clan.
Researchers have identified 21 codas used by the Caribbean clan, but recently they took the same dataset used to find these codas and fed it to a hungry AI model.
In the blink of an eye, the technology spotted patterns they had missed, including differences in the intervals between certain clicks, and the occasional extra click.
Borrowing terms from the musical world, they called these features ‘rubato’ and ‘ornamentation’ respectively. All in all, 156 distinct codas were identified.
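
The features themselves are simple to express. This toy Python sketch (the click times are invented, and this is not Project CETI’s code) shows how a coda reduces to a rhythm, a tempo and a click count – roughly the ingredients behind ‘rubato’ and ‘ornamentation’:

```python
import numpy as np

def coda_features(click_times):
    """Reduce a coda (the times of its clicks, in seconds) to three
    features: its tempo-free rhythm, its duration and its click count."""
    t = np.asarray(click_times)
    icis = np.diff(t)                 # inter-click intervals
    duration = icis.sum()
    rhythm = icis / duration          # each gap as a fraction of the whole
    return rhythm, duration, len(t)

# Two invented codas with the same rhythm at different tempos ('rubato')
slow = [0.0, 0.20, 0.40, 0.60, 1.00]
fast = [0.0, 0.15, 0.30, 0.45, 0.75]

r_slow, d_slow, _ = coda_features(slow)
r_fast, d_fast, _ = coda_features(fast)
print(np.allclose(r_slow, r_fast))   # True: same rhythm...
print(d_slow, d_fast)                # ...different tempo

# An extra click tacked on the end ('ornamentation') changes the count
print(coda_features(slow + [1.08])[2])   # 6 clicks instead of 5
```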

“This adds complexity to the system,” says Shane Gero, who is Biology Lead for Project CETI. Gero describes the variation as being like a “sperm whale phonetic alphabet”, which the whales might be using to convey complex information.
Elsewhere, researchers are feeding human language and even music into the models being used to decode animal sounds. It is hoped that the structure encoded in these very human media will help AI to spot patterns in the vocalisations of other species.
An example of this is NatureLM-Audio, the flagship AI model of the Earth Species Project, a non-profit organisation using machine learning to promote interspecies understanding.
“Teaching the model human language and music first helps it then understand animal communication,” says Aza Raskin, co-founder and president of the project.
“Already, it can correctly identify species by name, even when the model has never heard that species before.”
Raskin believes that, with enough data under their belt, the project’s powerful computer models will not just be able to decode animal communication, but also predict what the animals will say next, and perhaps even draft responses to them.
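
The principle behind predicting what an animal will say next is the same one that drives text prediction on your phone. Here’s a deliberately tiny Python sketch – a bigram model over made-up call types, nothing like the scale of the project’s real models – to show the idea:

```python
from collections import Counter, defaultdict

# An invented sequence of call types stands in for a real recording
sequence = ["greet", "food", "food", "alarm", "greet", "food",
            "alarm", "greet", "food", "food", "alarm", "greet"]

# Count which call type tends to follow which
transitions = defaultdict(Counter)
for prev, nxt in zip(sequence, sequence[1:]):
    transitions[prev][nxt] += 1

def predict_next(call):
    """Most likely next call type, given the current one."""
    counts = transitions[call]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("greet"))   # 'food': here, greetings tend to precede food calls
```
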
“Without scale, translating animal communication is impossible,” says Raskin. “With scale, it is inevitable.” The scale of data available means we may finally achieve a digital Rosetta Stone for animal language.
So, the question then is, what should we say?

Here, we must be cautious. Say the wrong thing, and we run the risk of causing distress. Our communications have the potential to disrupt complex social relationships, cultural traditions, and ecosystems that have been millions of years in the making.
But they could also do good. “The ultimate goal is to transform our relationship with the natural world so that the diversity of life can thrive,” says Raskin.
Back in the 1970s, marine biologist Dr Roger Payne recorded whale song and played it to the world for the very first time.
People were so moved that it led to the creation of the Save the Whales movement, which in turn helped to bring about the international moratorium on commercial whaling.
Those using modern methods to decode animal communication today hope their efforts, too, will boost conservation across the tree of life.
So, a decade from now, suppose an industrial trawler drops a hydrophone into the water and, through the power of AI, hears a whale communicating that it is in distress. What will the fishermen, the politicians and the consumers do?
Although we may find ourselves in a position where we can communicate with animals, the bigger question is: will we be prepared to listen?