
Can artificial intelligence help us understand animals?

In the Pixar film Up, a cartoon dog named Dug wears a magical collar that translates his barks and whimpers into fluent human speech. In the real world, highly trained dogs can be taught to press buttons that play human speech for simple commands like “outside,” “go,” and “play.” Humans have always been fascinated by the possibility of communicating with the animals they share the world with, and machine learning, with its increasingly sophisticated ability to analyze human language, has recently emerged as a promising avenue toward animal translation.

An article in The New York Times this week documented major efforts by five research groups using machine-learning algorithms to analyze the calls of rodents, lemurs, whales, chickens, pigs, bats, cats, and more.

Typically, artificial intelligence systems learn by training on labeled data (which can be sourced from the internet or from resources such as e-books). For human language models, this usually means giving a computer a sentence, masking certain words, and asking the program to fill in the blanks. More creative strategies now seek to align language models with brain activity.
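To make that fill-in-the-blank objective concrete, here is a minimal sketch using the Hugging Face transformers library and a pretrained BERT-style model; the model name and example sentence are illustrative assumptions, not details from the article.

```python
# A minimal sketch of the masked-word ("fill in the blanks") objective,
# using the Hugging Face transformers library. The model name and the
# example sentence are illustrative, not from the article.
from transformers import pipeline

# BERT-style models are pretrained by hiding tokens and predicting them.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Ask the model to restore the hidden word, marked with BERT's [MASK] token.
for guess in fill_mask("Researchers hope AI can help them [MASK] animals."):
    print(f'{guess["token_str"]:>12}  (score: {guess["score"]:.3f})')
```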

But analyzing animal communication is different from analyzing human language. Computer scientists must tell their software what to look for and how to organize the data. That process depends not only on collecting large numbers of audio recordings, but also on matching those recordings to the animals’ visible social behavior. For example, one group studying Egyptian fruit bats also used video cameras to film the bats, providing context for their calls. And the group studying whales plans to combine video, audio, and tags that record the animals’ movements to decipher the syntax, semantics, and ultimately the meaning of what whales communicate and why. Several groups have also proposed testing their animal dictionaries by playing recordings back to the animals and observing how they react.
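One hedged sketch of what matching audio to behavior could look like in code: summarize each call as acoustic features and train a classifier to predict the behavior annotated from the accompanying video. The file paths, behavior labels, and choice of MFCC features with a random forest are all assumptions for illustration, not any group’s actual pipeline.

```python
# Hypothetical sketch: pair each call recording with a behavior label
# taken from video footage, then learn to predict behavior from audio.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def call_features(wav_path: str) -> np.ndarray:
    """Summarize a call recording as its mean MFCC coefficients."""
    audio, sr = librosa.load(wav_path, sr=None)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)  # one fixed-length vector per call

# Each audio clip is annotated with the social behavior observed on video
# at the same moment (labels here are invented examples).
dataset = [
    ("calls/bat_0001.wav", "feeding"),
    ("calls/bat_0002.wav", "fighting"),
    # ... many more annotated (audio, behavior) pairs
]

X = np.stack([call_features(path) for path, _ in dataset])
y = [label for _, label in dataset]

clf = RandomForestClassifier(n_estimators=200).fit(X, y)
print(clf.predict([call_features("calls/bat_9999.wav")]))
```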

Creating a Google Translate for animals is an ambitious project that has been in the works for the past half decade. Machine learning has also come a long way in detecting the presence of animals and, in some cases, even accurately identifying species by their calls. (Cornell’s Merlin app is shockingly accurate at matching bird species to their calls.) And while this type of software has had some success in identifying the basic vocabulary of certain animals from the characteristics of their vocalizations (such as frequency or loudness), as well as in matching calls to individuals, it is far from capturing all the intricate nuances that animal communication might embody.
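As a toy illustration of identification from vocalization characteristics like frequency and loudness, the sketch below compares a recording against made-up per-species reference signatures. This is emphatically not how Merlin works; the reference values and matching rule are invented for the example.

```python
# Illustrative sketch of call identification from basic acoustic
# characteristics (dominant frequency and loudness). Reference values
# are fabricated; real systems use far richer features.
import numpy as np
import librosa

def basic_features(wav_path: str) -> np.ndarray:
    """Dominant frequency (Hz) and overall loudness (RMS) of a recording."""
    audio, sr = librosa.load(wav_path, sr=None)
    spectrum = np.abs(np.fft.rfft(audio))
    dominant_hz = np.fft.rfftfreq(len(audio), d=1.0 / sr)[spectrum.argmax()]
    loudness = float(np.sqrt(np.mean(audio ** 2)))
    return np.array([dominant_hz, loudness])

# Hypothetical per-species signatures: (dominant Hz, typical RMS).
# In practice features would be normalized so frequency doesn't
# dominate the distance.
references = {
    "black-capped chickadee": np.array([3800.0, 0.05]),
    "american crow":          np.array([1300.0, 0.12]),
}

query = basic_features("mystery_call.wav")
best = min(references, key=lambda s: np.linalg.norm(references[s] - query))
print("closest match:", best)
```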

[Related: With new tags, researchers can track sharks into the inky depths of the ocean’s Twilight Zone]

Many skeptics of this approach point both to the shortcomings of current AI language models, which fall short of truly understanding the relationships between words and the real-world objects they refer to, and to the gaps in scientists’ understanding of animal societies in general. Language models for human speech rely on a computer mapping the relationships between words and the contexts in which they occur (where a word sits in a sentence and what it might refer to). But these models have their own flaws and can sometimes be a black box: researchers know what goes in and what comes out, but don’t fully understand how the algorithm arrives at its conclusions.
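A toy version of that word-to-context mapping is distributional semantics: count which words co-occur, then compress the counts so that words used in similar contexts get similar vectors. The four-sentence corpus below is invented for illustration; real models learn the same kind of structure at vastly larger scale.

```python
# Toy sketch of mapping words to the contexts they occur in: build a
# co-occurrence matrix over a small window, compress it with SVD, and
# compare the resulting word vectors. Corpus is invented for illustration.
import numpy as np

corpus = [
    "the whale sings a long song",
    "the bat sings a short song",
    "the whale swims in the ocean",
    "the bat flies in the cave",
]

vocab = sorted({w for line in corpus for w in line.split()})
index = {w: i for i, w in enumerate(vocab)}

# Co-occurrence counts within a +/- 2 word window.
counts = np.zeros((len(vocab), len(vocab)))
for line in corpus:
    words = line.split()
    for i, w in enumerate(words):
        for j in range(max(0, i - 2), min(len(words), i + 3)):
            if i != j:
                counts[index[w], index[words[j]]] += 1

# SVD gives each word a low-dimensional "meaning" vector.
U, S, _ = np.linalg.svd(counts)
vectors = U[:, :2] * S[:2]

def similarity(a: str, b: str) -> float:
    va, vb = vectors[index[a]], vectors[index[b]]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

# "whale" and "bat" appear in similar contexts, so they score as similar,
# even though the model knows nothing about the animals themselves.
print(similarity("whale", "bat"))
```

Note what the model does and does not capture: it learns that “whale” and “bat” occupy similar slots in sentences, but nothing about what either animal actually is, which is precisely the skeptics’ point.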

Another factor researchers must weigh is that animal communication may not work like human communication at all, and the tendency to anthropomorphize it could skew the results. Because of physiological and behavioral differences, animal communication may contain elements with no human parallel.

To avoid having to define the data’s parameters in advance, there are proposals to use self-supervised learning algorithms to analyze the audio, according to a report earlier this year in The Wall Street Journal. With this approach, the computer tells researchers what patterns it sees in the data, patterns that could reveal connections that escape the human eye. Ultimately, how far down the rabbit hole people go in trying to understand animal communication depends on the human goals for this kind of research, and for many purposes, getting the basics right may be enough. For example, a translator that could reliably indicate whether the animals we live in close contact with are happy, sad, or in danger would be both useful and far more practical to build.
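One deliberately simplified sketch of letting the computer surface patterns on its own: cluster call feature vectors without any labels and inspect which recordings group together. True self-supervised systems learn the features as well; k-means over MFCCs, and the file names below, are stand-in assumptions for the general idea.

```python
# Simplified sketch of pattern discovery without predefined labels:
# cluster call features and let the groupings suggest structure.
# (K-means on MFCCs is a simple stand-in for self-supervised learning,
# which would also learn the features themselves.)
import numpy as np
import librosa
from sklearn.cluster import KMeans

def call_features(wav_path: str) -> np.ndarray:
    audio, sr = librosa.load(wav_path, sr=None)
    return librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20).mean(axis=1)

recordings = [f"calls/clip_{i:04d}.wav" for i in range(500)]  # hypothetical
X = np.stack([call_features(p) for p in recordings])

# No one tells the algorithm what the call types are; it proposes
# groupings, and researchers then inspect what each cluster has in common.
labels = KMeans(n_clusters=8, n_init=10).fit_predict(X)
for cluster in range(8):
    members = [r for r, l in zip(recordings, labels) if l == cluster]
    print(f"cluster {cluster}: {len(members)} calls, e.g. {members[:3]}")
```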
