Google built a neural network to warn ships of whales. Will it help?

Google trained an artificial neural network to locate whales and alert ships, cautioning them to slow down. Getting ships to comply is another story.

In 2018, researchers from the National Oceanic and Atmospheric Administration (NOAA) approached Google and asked if the company could make sense of hours and hours of underwater recordings. The audio files contained the sounds of dolphins, ships, and low-frequency sonar.

Amidst those sounds — sometimes indistinguishable from the rest of the underwater noise — were the calls of southern resident orcas, a population of killer whales in the Salish Sea (the coastal waterways near Puget Sound).

The NOAA team wanted to know if Google could find a way to identify the orcas amidst the cacophony. So engineers at Google began working on an artificial intelligence model that could identify the whales by their calls, reports the New York Times. Researchers can use the system to locate injured or lost whales and to alert shipping traffic when a whale is nearby, cautioning ships to slow down or alter course.

Ship Strikes Threaten the Whales

The southern resident orcas are a distinct population of killer whales. They live in the northwest waterways surrounding Vancouver and Seattle. The population is listed as endangered under the Canadian Species at Risk Act — only 72 remain in the wild.

Like other whales, this population suffers from a dwindling food supply (Chinook salmon), climate change, and noise pollution: anthropogenic underwater noise that interferes with the whales’ ability to echolocate, that is, to use sound to navigate and find food. But collisions with ships are among the largest threats to the southern resident killer whales.

There is good reason to be especially concerned about these animals: most of their critical habitat overlaps with major shipping channels. And with industrial developments in the works, like a port expansion and an oil pipeline terminal, large-ship traffic is expected to increase.

Training an Artificial Neural Network to Listen for Whales

NOAA researchers handed over about 1,800 hours of annotated underwater recordings containing the calls of different whales. Engineers at Google used the recordings to train an artificial neural network to identify and classify unknown sounds. The network is similar to the one Google created to identify objects in images (essentially, how Google learns which images to show you when you search with text).
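
To make that image-recognition analogy concrete, here is a minimal, hypothetical sketch of this kind of audio classifier, written in PyTorch. It is not Google’s actual model or data pipeline: the sample rate, clip length, network size, and the binary orca-vs-noise labels are all illustrative assumptions. The core idea is to convert each audio clip into a spectrogram (an “image” of sound) and train a convolutional network on it, just as one would for photos.

```python
# A minimal sketch of the general approach (not Google's actual model or data):
# turn fixed-length audio clips into mel spectrograms, then train a small CNN
# to classify each clip as "orca call" vs. "background noise".
import torch
import torch.nn as nn
import torchaudio

SAMPLE_RATE = 16_000   # assumed sample rate of the hydrophone clips
CLIP_SECONDS = 2       # assumed fixed clip length


class OrcaClassifier(nn.Module):
    """Tiny CNN over spectrogram 'images': conv blocks + global pooling + linear head."""

    def __init__(self, n_classes: int = 2):
        super().__init__()
        # Waveform -> mel spectrogram, a standard front end for audio classifiers.
        self.to_melspec = torchaudio.transforms.MelSpectrogram(
            sample_rate=SAMPLE_RATE, n_fft=1024, hop_length=256, n_mels=64
        )
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),  # global average pool over time/frequency
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, waveform: torch.Tensor) -> torch.Tensor:
        # waveform: (batch, samples) -> spectrogram: (batch, 1, n_mels, frames)
        spec = self.to_melspec(waveform).unsqueeze(1)
        spec = torch.log1p(spec)  # log scaling, standard for audio features
        return self.head(self.features(spec).flatten(1))


# Smoke test on random "audio"; real training would use the annotated recordings.
model = OrcaClassifier()
batch = torch.randn(4, SAMPLE_RATE * CLIP_SECONDS)
logits = model(batch)
print(logits.shape)  # torch.Size([4, 2])
```

In practice, labeled clips cut from the annotated recordings would feed a standard supervised training loop; the smoke test at the end only confirms that the tensor shapes line up.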

To date, the neural network has successfully identified the type of whale, but it cannot yet distinguish between a resident killer whale (residents have a smaller, more local range) and a transient one. The two populations look nearly identical to the untrained eye and their ranges overlap. But they are very different: they do not interbreed, compete for the same prey, or interact socially, and they are genetically distinct.

This whale-detection model is the latest in a lineup of similar efforts from Google AI. Other examples include an artificial neural network that listens for and identifies chainsaw sounds in the rainforest, a potential sign of illegal logging.

The Google AI team pairs the neural network with an array of hydrophones (underwater microphones) already installed across the Salish Sea. The hydrophones essentially “listen for” killer whales, capturing the sound of their calls.

When the network detects a whale, it can determine whether the animal is in distress and alert the Marine Mammal Unit at Fisheries and Oceans Canada to the whale’s whereabouts; the unit then decides if and how to assist. Whale-detection alerts also go to the smartphones of Department of Fisheries and Oceans (DFO) officials in Canada, who can ultimately ask large ships to slow down, reducing noise pollution and the chance of a fatal whale-ship collision, reports VentureBeat.
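
As a rough illustration of that alert flow, here is a short hypothetical Python sketch. The recipient names, confidence threshold, and distress flag are invented for the example; they are not from Google’s or the DFO’s actual systems.

```python
# A hypothetical sketch of the alert routing described above; names and
# thresholds are illustrative, not from any real deployed system.
from dataclasses import dataclass


@dataclass
class Detection:
    hydrophone_id: str  # which Salish Sea hydrophone heard the call
    confidence: float   # classifier's confidence that this is an orca call
    distressed: bool    # whether the call pattern suggests distress


ALERT_THRESHOLD = 0.8   # assumed confidence cutoff before notifying anyone


def route_detection(d: Detection) -> list[str]:
    """Decide who gets notified for a single detection, per the article's flow."""
    recipients: list[str] = []
    if d.confidence < ALERT_THRESHOLD:
        return recipients  # too uncertain: no alert
    # DFO officials get whale-presence alerts on their phones, so they can
    # ask nearby ships to slow down or change course.
    recipients.append("dfo_officials")
    if d.distressed:
        # Distress calls also go to the Marine Mammal Unit, which decides
        # if and how to assist the animal.
        recipients.append("marine_mammal_unit")
    return recipients


print(route_detection(Detection("haro-strait-01", 0.93, distressed=True)))
# -> ['dfo_officials', 'marine_mammal_unit']
```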

“One of the primary threats that the whales are facing are entanglement and just difficulty foraging due to vessel noise. So they can use this (alert system) to advise traffic or the vessels themselves that orcas are in this location, consider slowing down or consider a change of course,” Matt Harvey, software engineer with Google AI, told the CBC.

Slowing Down Is Hard to Do

But getting ships to slow down for whales is another story.

In 2015, NOAA launched a voluntary speed limit in the waters off California and even created a program offering incentives to shipping companies that slowed to 10 knots (nautical miles per hour) in these whale regions. But most vessels didn’t comply: 54% of the shipping companies broke the speed limit, a figure that held steady for several years. In 2019, only 15 of about 74 shipping companies complied with the California slow-down request.

In 2017, researchers studied the impact of a voluntary slow-down for commercial vessels in Salish Sea shipping lanes. They ran a three-month trial, asking captains of large ships to slow to 11 knots. Analyzing ship traffic data from the Automatic Identification System (AIS), they found that 350 of 951 ships (37%) slowed to the target speed.

Researchers Are Optimistic about AI

The ocean is one area of research that could benefit from massive amounts of data. So much of its sheer volume remains unexplored and undocumented that a great deal is still unknown, even with endless sampling.

Artificial intelligence can fill in the gaps. For example, Google AI is also working on an artificial neural network to detect humpback whales. The New York Times reports that Kakani Katija, an engineer at the Monterey Bay Aquarium Research Institute, is using AI and lasers to study giant larvaceans: tiny ocean creatures that look like tadpoles but build homes with the structural integrity of a loogie. Larvaceans carry the homes like a turtle carries a shell, but the homes dissolve easily, which makes the creatures difficult to capture and study in a lab. Katija’s solution is to use lasers and AI for her research.

“What I love about technology or the progress we’re seeing in AI, I think it’s a hopeful time because if we get this right, I think it will have profound effects on how we observe our environment and create a sustainable future,” Katija told the New York Times.
