The Race to Translate Animal Sounds Into Human Language

By News Room · 22 December 2024 · 4 min read

In 2025 we will see AI and machine learning leveraged to make real progress in understanding animal communication, answering a question that has puzzled humans for as long as we have existed: "What are animals saying to each other?" The recent Coller-Dolittle Prize, offering cash prizes of up to half a million dollars for scientists who "crack the code," is an indication of bullish confidence that recent technological developments in machine learning and large language models (LLMs) are placing this goal within our grasp.

Many research groups have been working for years on algorithms to make sense of animal sounds. Project Ceti, for example, has been decoding the click trains of sperm whales and the songs of humpbacks. These modern machine learning tools require extremely large amounts of data, and until now, such quantities of high-quality, well-annotated data have been lacking.

Consider LLMs such as ChatGPT, whose training data includes essentially the entirety of the text available on the internet. No comparable resource for animal communication has ever existed. It's not just that human data corpora are many orders of magnitude larger than the kind of data we have access to for animals in the wild: more than 500 GB of text was used to train GPT-3, compared with just over 8,000 "codas" (or vocalizations) for Project Ceti's recent analysis of sperm whale communication.

Additionally, when working with human language, we already know what is being said. We even know what constitutes a "word," which is a huge advantage over interpreting animal communication: scientists rarely know whether a particular wolf howl, for instance, means something different from another wolf howl, or even whether the wolves consider a howl to be somehow analogous to a "word" in human language.

Nonetheless, 2025 will bring new advances, both in the quantity of animal communication data available to scientists and in the types and power of AI algorithms that can be applied to those data. Automated recording of animal sounds is now within easy reach of every scientific research group, with low-cost recording devices such as the AudioMoth exploding in popularity.

Massive datasets are now coming online, as recorders can be left in the field, listening to the calls of gibbons in the jungle or birds in the forest, 24/7, across long periods of time. Datasets this large were once impossible to manage manually. Now, new automatic detection algorithms based on convolutional neural networks can race through thousands of hours of recordings, picking out the animal sounds and clustering them into different types according to their natural acoustic characteristics.
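
To make the idea concrete, here is a minimal sketch of such a detect-and-cluster pipeline, assuming a directory of WAV field recordings (for example, from an AudioMoth) and standard open-source tools (librosa, PyTorch, scikit-learn). The paths, thresholds, and network are illustrative assumptions, not any research group's actual system: a crude energy detector picks out sound events, a small CNN maps each event's mel-spectrogram to an embedding, and k-means groups the embeddings into putative call types.

```python
# A rough sketch, not a production pipeline. The directory, thresholds, and
# network architecture are all illustrative assumptions.
import glob

import librosa
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans


class SoundEmbedder(nn.Module):
    """Tiny CNN mapping a mel-spectrogram patch to a fixed-size embedding.

    In a real pipeline this would be trained (e.g. on labeled calls); here
    it is randomly initialized purely to show the shape of the computation.
    """

    def __init__(self, embed_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global pool -> one vector per clip
            nn.Flatten(),
            nn.Linear(32, embed_dim),
        )

    def forward(self, x):
        return self.net(x)


def detect_events(y, hop=512, threshold_db=-30.0):
    """Crude energy detector: (start, end) sample ranges where the RMS
    level rises above a threshold relative to the loudest frame."""
    rms = librosa.feature.rms(y=y, hop_length=hop)[0]
    level = librosa.amplitude_to_db(rms, ref=rms.max())
    events, start = [], None
    for i, active in enumerate(level > threshold_db):
        if active and start is None:
            start = i
        elif not active and start is not None:
            events.append((start * hop, i * hop))
            start = None
    if start is not None:
        events.append((start * hop, len(y)))
    return events


embedder = SoundEmbedder().eval()
embeddings = []
for path in glob.glob("recordings/*.wav"):  # hypothetical directory
    y, sr = librosa.load(path, sr=22050)
    for s, e in detect_events(y):
        mel = librosa.feature.melspectrogram(y=y[s:e], sr=sr, n_mels=64)
        patch = librosa.power_to_db(mel, ref=np.max)
        patch = librosa.util.fix_length(patch, size=128, axis=1)  # fixed width
        x = torch.tensor(patch, dtype=torch.float32)[None, None]  # (1,1,64,128)
        with torch.no_grad():
            embeddings.append(embedder(x).squeeze(0).numpy())

# Group detected sounds into putative call types by acoustic similarity.
if embeddings:
    call_types = KMeans(n_clusters=8, n_init=10).fit_predict(np.array(embeddings))
```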

Once those large animal datasets are available, new analytical algorithms become possible, such as using deep neural networks to find hidden structure in sequences of animal vocalizations, which may be analogous to the meaningful structure in human language.
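
As a hedged illustration of what "finding hidden structure" could mean, one could treat the call-type labels from a clustering step like the one above as a small vocabulary and train a sequence model to predict the next call, much as an LLM predicts the next word. The toy data, vocabulary size, and model below are invented for this sketch; a per-call negative log-likelihood well below the uniform-guessing baseline of ln(8) ≈ 2.08 nats would indicate learnable structure.

```python
# Illustrative only: toy sequences stand in for real, clustered vocalizations.
import math

import torch
import torch.nn as nn

VOCAB, SEQ_LEN = 8, 16  # call types from clustering; calls per sequence


class CallSequenceModel(nn.Module):
    """Small LSTM that predicts the next call type at each position."""

    def __init__(self, vocab=VOCAB, dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.lstm = nn.LSTM(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, vocab)

    def forward(self, x):
        h, _ = self.lstm(self.embed(x))
        return self.head(h)  # logits over the next call type


# Toy data with a built-in repeating motif (a stand-in for field data).
base = torch.tensor([0, 1, 2, 3] * (SEQ_LEN // 4))
seqs = torch.stack([(base + i) % VOCAB for i in range(64)])

model = CallSequenceModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):
    logits = model(seqs[:, :-1])  # predict call t+1 from calls up to t
    loss = nn.functional.cross_entropy(
        logits.reshape(-1, VOCAB), seqs[:, 1:].reshape(-1)
    )
    opt.zero_grad()
    loss.backward()
    opt.step()

# A value far below ln(8) ≈ 2.08 nats/call means the model found structure.
print(f"model: {loss.item():.3f} nats/call, baseline: {math.log(VOCAB):.3f}")
```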

However, the fundamental question remains unclear: what exactly are we hoping to do with these animal sounds? Some organizations, such as Interspecies.io, state their goal quite clearly as "to transduce signals from one species into coherent signals for another." In other words, to translate animal communication into human language. Yet most scientists agree that non-human animals do not have an actual language of their own, at least not in the way that we humans have language.

The Coller-Dolittle Prize is a little more sophisticated, looking for a way "to communicate with or decipher an organism's communication." Deciphering is a slightly less ambitious goal than translating, allowing for the possibility that animals may not, in fact, have a language that can be translated. Today we don't know just how much information, or how little, animals convey between themselves. In 2025, humanity will have the potential to make a leap in understanding not just how much animals say but also what exactly they are saying to each other.
