AI is learning to talk to wild animals. What could that mean for their welfare?

May 31, 2025

Every animal welfare scientist has probably thought at some point in their career, “My job would be a lot easier if the animals could just tell me how they’re feeling.”

The new — and booming — field of interspecies communication asks: What if they could?

Wild animal welfare scientists use a range of indicators — usually based on physiology or behavior — to make inferences about animals’ affective states. But if we could communicate with animals directly, perhaps one day these proxies for welfare would become obsolete.

Animal communication has been studied for decades using observational and experimental methods. Typically, researchers record vocalizations and other signals, observing whether they correlate with behavior to gather clues about how and why they are used. Scientists have conducted playback experiments to test how animals respond to various sonic cues, and even deliberately made themselves the subjects of animals’ communications. We can thank conventional animal communication research for the discoveries that honey bees’ dance performances convey information about the location of food sources, and that Bengalese finches use grammar — a learned ability.

But artificial intelligence (AI) is shaking up the field of animal communication.

The key breakthrough came in 2017, when researchers developed machine learning models capable of translating between human languages without any dictionary or paired examples to start from. The models were trained to turn each language into a shape, placing words that frequently appeared together at nearby geometric points. Astonishingly, when the shape of an unknown language was overlaid with the shape of a known language, they lined up. Any word in the unknown language could then be translated by locating its corresponding point in the known language’s shape.

These unsupervised machine translation models have since been used to translate between languages as different to one another as English, Mandarin, and Urdu, and to work on audio recordings as well as written language.
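The geometric intuition behind these models can be sketched in a few lines. The toy example below invents two tiny "languages" whose word embeddings are the same cloud of points under a hidden orthogonal transform, then recovers that transform and translates by nearest neighbor. All the words, vectors, and pairings here are made up for illustration, and the alignment step uses known word pairs (orthogonal Procrustes) for brevity; real unsupervised systems learn the alignment without any paired words, for example via adversarial training.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "embedding spaces": five words in language A, and language B's
# embeddings are the same point cloud under an unknown orthogonal
# transform — mimicking the idea that co-occurrence geometry
# (the "shape" of a language) is shared across languages.
words_a = ["sun", "moon", "fish", "water", "sky"]
words_b = ["sol", "luna", "pez", "agua", "cielo"]  # hypothetical pairings

emb_a = rng.normal(size=(5, 3))
hidden_transform = np.linalg.qr(rng.normal(size=(3, 3)))[0]  # orthogonal
emb_b = emb_a @ hidden_transform

# Orthogonal Procrustes: find the orthogonal map W that best
# overlays shape A onto shape B. (Here we cheat by using the known
# pairs; unsupervised systems find W without them.)
u, _, vt = np.linalg.svd(emb_a.T @ emb_b)
w = u @ vt

def translate(word: str) -> str:
    """Map a word's vector into B-space and return the nearest B word."""
    vec = emb_a[words_a.index(word)] @ w
    dists = np.linalg.norm(emb_b - vec, axis=1)
    return words_b[int(np.argmin(dists))]

print(translate("water"))  # sol? luna? → agua
```

Once the two shapes are aligned, translation reduces to a nearest-neighbor lookup — which is why the approach needs no bilingual dictionary, only enough data to map each language's shape accurately.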

Researchers now believe that, once they gather enough recordings of animal vocalizations and behaviors, they may be able to use unsupervised machine translation to “decode” the vocalizations of any species.

And once we understand the meanings of these signals, AI speech generators could enable us to respond.

Is AI animal language translation ethical?

Whether we should respond is a subject of fierce debate.

Some of the thorny questions surrounding interspecies communication are those that animal communication researchers have always grappled with: Is it an invasion of privacy to record animals without their consent? What are the welfare impacts of broadcasting the sounds of predators to songbirds to test how they respond, and can they be justified by the knowledge a study will yield? What welfare concerns might arise from playing animals signals whose meanings we don’t yet know?

This is something that especially worries Leonie Bossert, an ethicist at the University of Vienna, who specializes in AI and animal ethics. “Animal communication is highly complex, and has evolved over hundreds of years,” she says. “Communication is vital for individual animals and their community life, so confronting animal communities with digital sounds could cause confusion.”

 

Dolphin Whispers is exploring ethical technology protocols to support human/dolphin communication research.

 

Some researchers believe interspecies communication research can be conducted ethically, and are already proceeding as cautiously as possible. At Dolphin Whispers, a project exploring human/dolphin communication, founder Julio Raldúa Veuthey says that the digitalized whistles the project uses to study dolphin reactions and behavior patterns are “intended to be recognizable as artificial or human-generated, to avoid any confusion or misinterpretation by the dolphins.” They’re only used in “structured test scenarios, with minimal repetition and under strict ethical protocols.”

Guidelines for AI animal translation technologies

But a new question has arisen in direct response to the AI arm of the field, and the stakes may be higher: When these technologies are fully operational, how will they be used?

Industries such as ecotourism and industrial fishing rely on proximity to wild animals, and it isn’t hard to imagine how they might benefit from the ability to interact with them using their own communication systems. In some scenarios, the technologies might not be used for the animals’ benefit — luring whales toward noisy boats, or fish into trawler nets, for example.  

Bossert believes that there’s also a possible future in which these technologies are used to make poaching and hunting more efficient. It’s for this reason that she’s calling for ethical guidelines dictating that “AI should not be implemented to benefit activities or industries that systemically overlook animals’ interests in avoiding suffering and continuing to live.” Veuthey, too, would like to see regulatory frameworks and legal measures developed to “prevent misuse and ensure accountability in the application of AI for interspecies communication.”

Initiatives like the More Than Human Life Project are already developing guidelines for the ethical use of “nonhuman animal communication technologies” — though it’s uncertain whether such guidelines would eventually become law.

Could interspecies communication help wild animals?

Fortunately, there are also hopeful possibilities for how interspecies communication could positively impact wild animal welfare. 

Aza Raskin, cofounder of the Earth Species Project — the organization at the forefront of efforts to decode animal communication — has argued that the ability to understand animal signals might trigger a seismic shift in human-nonhuman relations. Discovering that humpback whales sang at all was enough to inspire thousands of people to reconsider the moral significance of whales at a pivotal time in the anti-whaling campaign. How might decoding their songs — knowing precisely what information they convey — force us to rethink the moral status of nonhuman animals and our duties to them?

A deeper understanding of animal communication could reveal and help us compare sources of suffering. Perhaps it could also facilitate interventions that improve their welfare. By interacting with animals using their own communication systems, could we warn them to avoid harms in their environment? Could we direct wild animals to medical care, or to shelter? Could we simply ask them how they’re feeling and, if they’re suffering, how they would like us to help?

Research directions in interspecies communication

With the subfield of AI for interspecies communication still in its infancy, these applications remain entirely speculative. But the research being conducted in earnest now — by a cohort of organizations including Dolphin Whispers and the Earth Species Project — is driving them towards reality. And there are already ways for wild animal welfare scientists to get involved.

 

Ecological ‘soundscape’ analyses are, at present, largely focused on species detection, but it should be possible to listen in on animals’ welfare at the landscape level. (Rutz et al., 2023)

 

Interspecies communication research is interdisciplinary: Veuthey calls for collaborative research, “involving scientists and behavioral experts at every step.” Zoologists, ecologists, and veterinarians can offer practical skills in biologging, camera traps, audio recorders, and drones — methods of collecting the communication and behavior data this research depends on. Ethicists can help to ensure that research is conducted responsibly, while wild animal welfare researchers can ensure that these tools are being used to gather welfare-relevant data.

AI can assist researchers in answering questions about wild animal welfare more accurately and efficiently — picking up on signals that are undetectable to humans, avoiding biases, and processing large volumes of data quickly. Borrowing the technologies used to decode and respond to animal communications could further accelerate and enrich the field.

In a 2023 paper in Science, Rutz et al. explain how machine learning approaches to animal communication research “could be used to identify animal signals that are associated with stress, discomfort, pain, and evasion, or with positive states.” The authors write that it should be possible to “listen in on [wild] animals’ welfare at the landscape level” using machine learning–enhanced acoustic monitoring methods. And these analyses can be complemented by the development of other machine learning methods that “examine satellite-recorded animal movement tracks for signatures of disease, distress, or human avoidance.”
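As a toy illustration of that acoustic-monitoring idea, the sketch below generates synthetic calls and flags those whose spectral centroid (a crude summary of where a sound's energy sits in frequency) exceeds a made-up "distress" threshold. The feature choice, the 1,000 Hz cutoff, and the labels are all hypothetical assumptions for this example; a real system would learn signal–welfare associations from labeled field recordings rather than a hand-set rule.

```python
import numpy as np

SR = 8000  # sample rate in Hz; all values here are illustrative

def synth_call(freq_hz: float, dur_s: float = 0.5) -> np.ndarray:
    """Generate a toy vocalization: a pure tone plus background noise."""
    t = np.arange(int(SR * dur_s)) / SR
    rng = np.random.default_rng(42)
    return np.sin(2 * np.pi * freq_hz * t) + 0.1 * rng.normal(size=t.size)

def spectral_centroid(signal: np.ndarray) -> float:
    """Power-weighted mean frequency of the signal's spectrum."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1 / SR)
    return float((freqs * power).sum() / power.sum())

def label_call(signal: np.ndarray, threshold_hz: float = 1000.0) -> str:
    """Flag calls whose energy sits above a hypothetical distress cutoff."""
    if spectral_centroid(signal) > threshold_hz:
        return "possible distress"
    return "baseline"

print(label_call(synth_call(400)))   # baseline
print(label_call(synth_call(2500)))  # possible distress
```

Scaled up, the same pipeline shape — continuous recording, feature extraction, learned classification — is what would let landscape-level acoustic monitoring report on welfare rather than just species presence.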

Which animals will we be able to talk to?

Much interspecies communication work has centered on cetaceans, but learning to “speak whale” isn’t the only possibility. Technologies are now emerging that can be used to study the communication systems of any species that uses auditory signals — terrestrial mammals, birds, amphibians, and even insects. Decoding the signals of species belonging to highly abundant taxa like these has especially strong potential for positive welfare impact.

How soon will we be able to understand animals?

Interspecies communication research has much further to go to understand the full scope of nonhuman animal communication, which is staggeringly complex. Many species use gestures, smell, and touch to communicate; still others use thermal, chemical, and electrical cues. Models that have prioritized acoustic signals simply because they’re so central to human communication may need to abandon this bias if they are to solve this puzzle. But if machine learning models are eventually able to map and decode multimodal animal communication systems, the data they provide may be much more comprehensive, nuanced, and widely applicable than what researchers have so far been able to infer about communication and welfare.

It’s also distinctly possible that other species’ Umwelten — subjective experiences — are so different to those of humans that our translation models simply won’t work on them. Bossert, for one, is skeptical that AI will ever truly enable us to have a two-way dialogue with nonhuman species. “Even if we manage to produce sounds that carry meaning for the animals, there is a high likelihood that we won’t fully understand what we’re saying to them,” she says.

Still, she remains open to the possibility that, when it arrives, interspecies communication could be a force for good: “I hope that understanding animals better will lead to a shift in how we view them, so that more and more people recognize how fascinating these beings are. Ultimately, I hope it will help us build more respectful human-animal relationships.”

Shannon Ray

Shannon is Science Writer & Editor at Wild Animal Initiative. Shannon studied English at Rutgers University, then earned her MSc in Biodiversity, Conservation and Management from the University of Oxford. Before joining WAI, she worked for an online research institute and wrote for multiple organizations, including a climate research initiative, an ocean conservation pressure group, and a farmed animal welfare charity. Shannon lives in England.
