Google’s Machine Learning Model Decodes Humpback Whale Songs

Researchers at the National Oceanic and Atmospheric Administration (NOAA) and Google have created an artificial intelligence system that is helping scientists identify humpback whale songs.

These humpback whale songs were collected through underwater recordings made at many different locations by the Pacific Islands Fisheries Science Center in Honolulu over the past 15 years.



Researchers devised this technique to better understand the population numbers and travel patterns of endangered marine life, such as humpback whales, by listening to their distinctive, hour-long songs.

This research provides new and important information about humpback whale presence, seasonality, daily calling behavior, and population structure.




“As a researcher, if I was looking at a spectrogram of whale calls, even if I’d never heard that specific song before, I could tell it was a humpback whale,” Allen said. “So, give the computer enough examples of humpback whale songs,” and it can learn to identify them.

Activities such as pollution, dumping, overfishing, and over-exploitation of resources have a damaging impact on marine ecosystems. Scientists believe that techniques like this can help us gain a better understanding of our marine systems.

This system is a little different from conventional ones: it teaches itself how to learn rather than being given step-by-step instructions. The system uses machine learning to identify the specific calls of humpback whales, analyzing 170,000 hours of acoustic data.




Spectrograms of audio events found in the dataset, with time on the x-axis and frequency on the y-axis. Left: a humpback whale call; center: narrow-band noise from an unknown source; right: hard disk noise from the HARP.

The audio-analysis technique used here was developed by Google’s AI Perception team. You may well have already experienced it: it is the same technique that YouTube’s machine learning model uses for non-speech captions. Now similar techniques are being applied to conservation work.

HARP (high-frequency acoustic recording package) devices were used to collect 9.2 terabytes of audio data over a period of 15 years. Google has also developed supervised machine learning models that detect humpback whale calls in spectrogram images, avoiding the need to mark the calls manually.

This data, covering different magnitudes of sound intensity, is plotted on time-frequency axes to produce the spectrograms shown above.
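
To make the idea of a spectrogram concrete, here is a minimal sketch of how a single-channel audio clip can be turned into the kind of time-frequency image described above. The sample rate, FFT window length, and overlap below are illustrative assumptions, not the settings used in the actual project.

```python
# A minimal, illustrative sketch: turn a single-channel audio clip into a
# log-magnitude spectrogram (frequency on one axis, time on the other).
# The sample rate and FFT settings here are assumptions, not project values.
import numpy as np
from scipy import signal

def audio_to_spectrogram(samples: np.ndarray, sample_rate: int) -> np.ndarray:
    """Return a log-magnitude spectrogram with shape (freq bins, time frames)."""
    freqs, times, sxx = signal.spectrogram(
        samples,
        fs=sample_rate,
        nperseg=1024,   # samples per FFT window (assumed)
        noverlap=512,   # 50% overlap between windows (assumed)
    )
    # Log scaling keeps quiet, sustained tones such as whale calls visible.
    return 10.0 * np.log10(sxx + 1e-10)

# Usage with one minute of synthetic audio at an assumed 10 kHz sample rate.
clip = np.random.randn(60 * 10_000)
spec = audio_to_spectrogram(clip, sample_rate=10_000)
print(spec.shape)   # (513, number of time frames)
```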



For classifying these images, a ResNet-50 was used, an architecture that has given reliable results in classifying non-speech audio.
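
The article does not spell out the training setup, so the following is only a rough sketch of how a ResNet-50 can be adapted to classify spectrogram patches as humpback call or not. The single-channel input layer, two-class head, and 224x224 patch size are assumptions for illustration.

```python
# A rough sketch of adapting a ResNet-50 to classify spectrogram patches as
# "humpback call" vs. "not a humpback call". The single-channel input layer,
# two-class head, and 224x224 patch size are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet50(weights=None)

# Spectrograms are single-channel images, so swap the 3-channel input conv.
model.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)

# Replace the 1000-class ImageNet head with a two-class output.
model.fc = nn.Linear(model.fc.in_features, 2)

# One forward pass on a fake batch of spectrogram patches.
batch = torch.randn(8, 1, 224, 224)   # (batch, channels, freq bins, time frames)
logits = model(batch)
print(logits.shape)                    # torch.Size([8, 2])
```

The network would, of course, have to be trained on labeled spectrogram patches before its outputs mean anything.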


Humpback whale calls have varied but sustained frequencies. If a call’s frequency did not vary at all, a spectrogram would display it as a horizontal bar; the arcs seen in humpback spectrograms mean the signal is frequency-modulated.

The challenge with collecting humpback audio data is the noise that gets mixed in with it, such as noise from ship propellers and other equipment. This noise is largely unvarying and appears as horizontal bars on a spectrogram.

A whale song is generally a structured, sequential audio signal that can last over 20 minutes, and a new song often begins within a few seconds of the previous one ending. Feeding the model audio units with such large time windows provides extra context that improves the precision of its predictions. On a test set of 75-second audio clips, the model showed accuracy scores above 90%.
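
As a rough illustration of how a long recording might be scored clip by clip, the sketch below splits a recording into consecutive 75-second windows, the clip length mentioned above, and asks a classifier for one score per window. The helper names and the placeholder classifier are hypothetical, not taken from the project.

```python
# Hypothetical sketch: score a long recording in consecutive 75-second clips,
# the clip length mentioned above. The helper names and the placeholder
# classifier are made up for illustration; they are not from the project.
import numpy as np

def split_into_clips(samples: np.ndarray, sample_rate: int,
                     clip_seconds: int = 75) -> list:
    """Split a long recording into consecutive fixed-length clips."""
    clip_len = clip_seconds * sample_rate
    n_clips = len(samples) // clip_len
    return [samples[i * clip_len:(i + 1) * clip_len] for i in range(n_clips)]

def score_recording(samples: np.ndarray, sample_rate: int, classifier) -> list:
    """Return one 'humpback present' score per 75-second clip."""
    return [classifier(clip) for clip in split_into_clips(samples, sample_rate)]

# Usage with a ten-clip synthetic recording and a stand-in classifier that
# would normally wrap the spectrogram and ResNet-50 steps sketched earlier.
recording = np.random.randn(10 * 75 * 10_000)          # assumed 10 kHz
scores = score_recording(recording, 10_000, classifier=lambda clip: 0.5)
print(len(scores))   # 10
```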

Off the Kona coast, though, NOAA has a HARP location that has been monitored for the past 10 years. The AI is doing well at identifying whale calls in that area, and NOAA is now actively collecting data to compare the Kona site with a HARP site in the Northwest Hawaiian Islands.

The purpose, Allen said, is “to see if the reduction in numbers is them moving up to the Northwest Hawaiian Island or if it’s an actual decrease in numbers visiting.”

Looking long-term, Allen said the idea is to develop a new tool to get data from new sites, new songs, and new information.

“An integrated tool would be revolutionary for our field,” Allen said. “It’s a very powerful tool and there’s so much we can learn from it.”



