Tools such as digital recorders, drones, and artificial intelligence are now helping us understand the sounds of nature in new ways, and may one day even allow us to communicate with animals. Automated listening posts have been set up in ecosystems worldwide, from rainforests to the depths of the ocean. Research from these listening posts is enabling humans to observe and analyze nature's sounds beyond the limits of our own senses.
Needless to say, these listening posts generate far more data than anyone could sort through manually. Researchers in the fields of bioacoustics (the study of sounds made by living organisms) and ecoacoustics (the study of sounds made by entire ecosystems) are therefore using artificial intelligence to sift through the recordings and find emerging patterns that may help us understand what animals are saying to each other. There are now databases full of whale songs and honeybee dances that Karen Bakker, author of the new book The Sounds of Life: How Digital Technology Is Bringing Us Closer to the Worlds of Animals and Plants, believes could one day turn into "a zoological version of Google Translate."
"Digital technologies, so often associated with our alienation from nature, are offering us an opportunity to listen to nonhumans in powerful ways, reviving our connection to the natural world," Bakker writes. "We can use artificial intelligence-enabled robots to speak animal languages and essentially breach the barrier of interspecies communication. Researchers are doing this in a very rudimentary way with honeybees and dolphins and to some extent with elephants."
As one example, this AI technology allowed a research team in Germany to encode honeybee signals into a robot they sent into the hive. After using AI to decode the bees' behavior, the team had the robot perform the honeybees' waggle dance communication, telling the bees to stop moving and giving them directions to a specific nectar source. The next stage of this research is to implant robots into honeybee hives so that the hives accept the robots as members of their community from birth.
The results of this research have far-reaching implications and many possible applications. According to Bakker's interview with Vox, beyond the possibility of one day speaking with animals, this research could even be used to create a form of therapy for animals and plants alike: "One project that really excites me is the use of bioacoustics to create a form of music therapy for the environment. It turns out that some species, like fish and coral, will respond to sounds like the sounds of healthy reefs. And this could help us regenerate degraded ecosystems. That research is in its infancy. We don't know how many species that could apply to, but it could be fantastic if we could actually begin using essentially bioacoustics-based music therapy as a way to help with ecosystem regeneration."
Because these technologies could grant researchers an unprecedented degree of control over animals and plants, Bakker discusses the responsibility this places on researchers in the field, explaining:
“Now, this raises a very serious ethical question, because the ability to speak to other species sounds intriguing and fascinating, but it could be used either to create a deeper sense of kinship, or a sense of dominion and manipulative ability to domesticate wild species that we’ve never as humans been able to previously control. So these are the sorts of ethical questions that researchers are now starting to engage in. But the hope is that with these ethics in place, in the future, we — you and I, ordinary people — will have a lot more ability to tune into the sounds of nature, and to understand what we’re hearing. And I think what that does is create a real sense of awe and wonder and also a feeling of profound kinship. That’s where I hoped we would take these technologies.”