New AI Tech Allows Humans to Talk to Animals
Not long ago, the scientific community laughed at the idea that animals might have their own languages. Today, researchers around the globe are using cutting-edge technology to listen in on animal “conversations” and even communicate with them.
In her new book The Sounds of Life: How Digital Technology Is Bringing Us Closer to the Worlds of Animals and Plants, University of British Columbia professor Karen Bakker outlines some of the most ground-breaking experiments in animal and plant communication.
“Digital technologies, so often associated with our alienation from nature, are offering us an opportunity to listen to nonhumans in powerful ways, reviving our connection to the natural world,” writes Bakker, a director at the UBC Institute for Resources, Environment, and Sustainability.
She points out that digital listening posts are now being used to continuously record the sounds of ecosystems around the planet, from rainforests to the bottom of the ocean. Developments in miniaturization have even enabled scientists to place microphones on tiny animals like honeybees.
“Combined, these digital devices function like a planetary-scale hearing aid: enabling humans to observe and study nature’s sounds beyond the limits of our sensory capabilities,” Bakker writes. The next step for many scientists is harnessing the power of artificial intelligence to sift through these sounds and enable robots to “speak animal languages and essentially breach the barrier of interspecies communication.”
She cites a team of researchers in Germany that has taught tiny robots how to perform the honeybee waggle dance. Using these dancing machines, the scientists were able to direct honeybees to stop moving and to tell them where to fly to collect a specific nectar. The researchers plan to experiment with implanting robots into hives so that the honeybees accept them as members of their community.
Bakker also writes about bioacoustics scientist Katy Payne and her discoveries regarding elephant communication. Payne was the first to find that elephants make infrasound signals, sounds below the human hearing range. The vibrations of these signals allow elephants to send messages across long distances through soil and stone. Scientists have since found that elephants have distinct signals for “honeybee” and “human,” as well as distinguishable signals for “threatening human” versus “nonthreatening human.” If the power of AI could be harnessed to send messages to elephant herds, we might be able to help protect their dwindling populations without removing them from their natural habitats.
Coral reefs also get attention in Bakker’s book. “A healthy coral reef sounds a little bit like an underwater symphony,” she explains. “There are cracks and burbles and hisses and clicks from the reef and its inhabitants and even whales dozens of miles away. If you could hear in the ultrasonic, you might hear the coral itself.” With the use of AI, scientists might eventually be able to get coral to repopulate certain areas by broadcasting “healthy coral reef” sounds to coral larvae.
While the idea of someday having “a zoological version of Google Translate” sounds overwhelmingly positive, there is a fear that unscrupulous humans might use the technology to control animal populations for their own gain. Bakker warns that the possibility of exploiting animals “raises a lot of alarm bells” and that our “newfound powers” should never be used “to assert our domination over animals and plants.”