Artificial intelligence is helping us understand the language of animals.
The technology can analyze hours of animal audio in a fraction of the time the same work would take a human.
“If you’re manually trying to isolate these calls from audio files, it takes a really long time,” said Kevin Coffey, a professor at the University of Washington.
Coffey is also one of the creators of DeepSqueak, an A.I. program designed to pick up on high-pitched rat calls that human ears often miss.
“In rats, these calls are often related to positive or negative affect,” Coffey said. “They make certain calls in positive situations and other calls in negative situations.”
DeepSqueak works from visual representations of the audio: the software renders recordings as images and scans them for the distinctive patterns of a call.
“It certainly works better with some things, like rodent calls or a whistle, than others,” Coffey said. “Not all vocalizations are so nice.”
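DeepSqueak itself detects calls in spectrogram images using a deep neural network, which is beyond a short example. But the basic screening idea, automatically flagging stretches of audio that contain high-pitched sound a human might miss, can be illustrated with a much cruder stand-in. The sketch below estimates the dominant frequency of each audio window from its zero-crossing rate; the function name, window size, and thresholds are all illustrative, not anything from DeepSqueak.

```python
import math

def flag_high_pitch_windows(signal, sample_rate, win_len, freq_threshold):
    """Return start indices of windows whose zero-crossing rate
    suggests content above freq_threshold (in Hz)."""
    flagged = []
    for start in range(0, len(signal) - win_len + 1, win_len):
        win = signal[start:start + win_len]
        # Count sign changes between consecutive samples.
        crossings = sum(1 for a, b in zip(win, win[1:]) if (a < 0) != (b < 0))
        # A pure tone of frequency f crosses zero ~2f times per second,
        # so the crossing count gives a rough frequency estimate.
        est_freq = crossings * sample_rate / (2 * win_len)
        if est_freq > freq_threshold:
            flagged.append(start)
    return flagged

# Synthetic demo: 0.1 s of a 1 kHz tone, then 0.1 s of a 30 kHz "call"
# (rat ultrasonic vocalizations sit well above human hearing).
sr = 100_000
low = [math.sin(2 * math.pi * 1_000 * t / sr) for t in range(10_000)]
high = [math.sin(2 * math.pi * 30_000 * t / sr) for t in range(10_000)]
windows = flag_high_pitch_windows(low + high, sr,
                                  win_len=1_000, freq_threshold=20_000)
```

Here only the windows in the second half of the recording get flagged, which is the point: a machine can sweep hours of audio this way and hand a human just the moments worth listening to. Real detectors are far more robust to noise and overlapping sounds, which is why DeepSqueak uses learned image recognition rather than a simple rate threshold.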
Similar machine learning technologies are being deployed at the Woods Hole Oceanographic Institution off the coast of Massachusetts.
Researchers are using underwater microphones to catalog different species on coral reefs.
A single reef can be home to more than a hundred species.
“Sound travels really well in the ocean,” said Aran Mooney, an associate scientist at WHOI. “It’s a really good indicator of those different species. We can put one sensor out and cover most of the reef.”
The work is urgent, as climate change threatens to drive some species to extinction.
“It’s a way to catalog the animals that are more obvious, but we can also detect some of the more cryptic animals,” Mooney said. “That’s one of the goals of the A.I., is to get some of those rarer species. Humans are pretty good at pulling out the obvious. We want to pull out the rare stuff.”
The early success of this technology does not mean an “animal to English” translator is on the horizon.
And any technology that claims to translate your dog’s barks is probably bogus.
Domestic pets have a smaller vocal repertoire, and much of their communication comes through body language. To fully understand a dog or cat, researchers believe we will need a hybrid A.I. that analyzes both audio and video.
Some research is already underway.
“Things like DeepLabCut do pose estimation, where they try to automatically score animal behaviors based on individual frames of video,” Coffey said. “I want to merge those two very, very powerful techniques, so we have behavior and communication in one model and we can start getting a better idea of what the sounds mean.”
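The merge Coffey describes boils down to lining up two timestamped streams: detected calls from the audio and scored behaviors from the video. As a loose, hypothetical sketch of that alignment step (the data format, labels, and function below are invented for illustration and are not DeepLabCut's or DeepSqueak's actual output), each call can be paired with the nearest pose-based behavior label in time:

```python
def fuse_streams(calls, poses, max_gap=0.05):
    """Pair each detected call (time_sec, call_type) with the nearest
    pose-derived behavior (time_sec, label) within max_gap seconds."""
    fused = []
    for call_time, call_type in calls:
        # Find the behavior estimate closest in time to this call.
        nearest_time, nearest_label = min(poses,
                                          key=lambda p: abs(p[0] - call_time))
        if abs(nearest_time - call_time) <= max_gap:
            fused.append((call_time, call_type, nearest_label))
    return fused

# Illustrative data: two rat calls and three video-frame behavior scores.
calls = [(0.50, "22kHz"), (1.25, "50kHz")]
poses = [(0.48, "freezing"), (1.27, "rearing"), (2.00, "grooming")]
print(fuse_streams(calls, poses))
# → [(0.5, '22kHz', 'freezing'), (1.25, '50kHz', 'rearing')]
```

Pairings like these are what would let a combined model start asking what a given call means: in this toy example, the low-pitched 22 kHz call co-occurs with freezing, the kind of association Coffey hopes a unified model could learn at scale.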