How to Use AI to Talk to Whales—and Save Life on Earth

For researchers and conservationists alike, the potential applications of machine learning are basically limitless. And Earth Species is not the only group working on decoding animal communication. Payne spent the last months of his life advising Project CETI, a nonprofit that built a base in Dominica this year to study sperm whale communication. “Just imagine what would be possible if we understood what animals are saying to each other; what occupies their thoughts; what they love, fear, desire, avoid, hate, are intrigued by, and treasure,” he wrote in Time in June.

Many of the tools that Earth Species has developed so far offer more in the way of groundwork than immediate utility. Still, there’s a lot of optimism in this nascent field. With enough resources, several biologists told me, decoding is scientifically achievable. That’s only the beginning. The real hope is to bridge the gulf in understanding between an animal’s experience and ours, however vast—or narrow—that might be.

Ari Friedlaender has something that Earth Species needs: lots and lots of data. Friedlaender researches whale behavior at UC Santa Cruz. He got started as a tag guy: the person who balances at the edge of a boat as it chases a whale, holds out a long pole with a suction-cupped biologging tag attached to the end, and slaps the tag on a whale’s back as it rounds the surface. This is harder than it seems. Friedlaender proved himself adept—“I played sports in college,” he explains—and was soon traveling the seas on tagging expeditions.

The tags Friedlaender uses capture a remarkable amount of data. Each records not only GPS location, temperature, pressure, and sound, but also high-definition video and three-axis accelerometer data, the same tech that a Fitbit uses to count your steps or measure how deeply you’re sleeping. Taken together, the data illustrates, in cinematic detail, a day in the life of a whale: its every breath and every dive, its traverses through fields of sea nettles and jellyfish, its encounters with twirling sea lions.
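To get a feel for what a tag like this records, here is a minimal sketch in Python with invented numbers, not Friedlaender's data. It fakes ten minutes of 1 Hz pressure and three-axis accelerometer readings, then derives two things a biologist might actually extract: a count of dives, and overall dynamic body acceleration (ODBA), a standard biologging proxy for activity level, the same basic idea a Fitbit uses to count steps.

```python
import numpy as np

# Hypothetical 1 Hz tag record: depth (from pressure, ~1 dbar per meter)
# and a three-axis accelerometer trace in g. All values are invented.
rng = np.random.default_rng(0)
t = np.arange(600)                                            # ten minutes
depth_m = np.clip(40 * np.sin(2 * np.pi * t / 300), 0, None)  # two dives
accel = rng.normal(0, 0.1, size=(600, 3))                     # body-frame accel
accel[:, 2] += 1.0                                            # gravity on z

# Dive detection: count contiguous stretches deeper than a threshold.
diving = depth_m > 2.0
n_dives = int((np.diff(diving.astype(int)) == 1).sum())

# ODBA: subtract a smoothed (static, gravity-dominated) component from
# each axis, then sum the absolute dynamic remainder across axes.
static = np.array(
    [np.convolve(a, np.ones(9) / 9, mode="same") for a in accel.T]
).T
odba = np.abs(accel - static).sum(axis=1)

print(n_dives)  # → 2
```

The real tags add GPS, temperature, sound, and video on top of this, which is what makes the reconstructed "day in the life" so cinematic.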

Friedlaender shows me an animation he has made from one tag’s data. In it, a whale descends and loops through the water, traveling a multicolored three-dimensional course as if on an undersea Mario Kart track. Another animation depicts several whales blowing bubble nets, a feeding strategy in which they swim in circles around groups of fish, trap the fish in the center with a wall of bubbles, then lunge through, mouths gaping. Looking at the whales’ movements, I notice that while most of them have traced a neat spiral, one whale has produced a tangle of clumsy zigzags. “Probably a young animal,” Friedlaender says. “That one hasn’t figured things out yet.”

Friedlaender’s multifaceted data is especially useful for Earth Species because, as any biologist will tell you, animal communication isn’t purely verbal. It involves gestures and movement just as often as vocalizations. Diverse data sets get Earth Species closer to developing algorithms that can work across the full spectrum of the animal kingdom. The organization’s most recent work focuses on foundation models, the same kind of computation that powers generative AI like ChatGPT. Earlier this year, Earth Species published the first foundation model for animal communication. The model can already accurately sort beluga whale calls, and Earth Species plans to apply it to species as disparate as orangutans (who bellow), elephants (who send seismic rumbles through the ground), and jumping spiders (who vibrate their legs). Katie Zacarian, Earth Species’ CEO, describes the model this way: “Everything’s a nail, and it’s a hammer.”
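Earth Species' actual model is a large neural network and nothing like this sketch, but the sorting step described above can be illustrated generically: map each call to an embedding vector, then assign it to the nearest labeled cluster. Everything below, the call types, the embedding dimensions, the numbers, is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

# Pretend 8-dimensional embeddings for three hypothetical beluga call
# types. A real foundation model would produce these from raw audio.
centroids = {
    "whistle": rng.normal(0, 1, 8),
    "click_train": rng.normal(5, 1, 8),
    "pulsed_call": rng.normal(-5, 1, 8),
}

def sort_call(embedding: np.ndarray) -> str:
    """Assign a call embedding to the nearest call-type centroid."""
    return min(centroids, key=lambda k: np.linalg.norm(embedding - centroids[k]))

# A new call whose embedding sits near the "click_train" centroid.
new_call = centroids["click_train"] + rng.normal(0, 0.2, 8)
print(sort_call(new_call))  # → click_train
```

The appeal of the foundation-model approach is that the embedding step is species-agnostic: the same front end could, in principle, take in orangutan bellows, elephant rumbles, or spider leg vibrations.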

Another application of Earth Species’ AI is generating animal calls, like an audio version of GPT. Raskin has made a few-second chirp of a chiffchaff bird. If this sounds like it’s getting ahead of decoding, it is—AI, as it turns out, is better at speaking than understanding. Earth Species is finding that the tools it is developing will likely be able to talk to animals even before they can decode what the animals are saying. It may soon be possible, for example, to prompt an AI with a whup and have it continue a conversation in Humpback—without human observers knowing what either the machine or the whale is saying.
