There is a fundamental reason why the path to creating autonomous robots and vehicles (and we mean fully autonomous ones) gives experts so many headaches: robots end up as ‘idiot savants’, capable of doing one thing very well, but only that one thing, while their artificial intelligence evaporates when they face other situations, or even when they carry out their usual task outside their ideal working conditions.
Unfortunately, the task of an autonomous vehicle, which moves through the real world and is forced to interact with other vehicles, with people and animals (not to mention the weather, etc.), could not be more multifaceted, or more lacking in “ideal working conditions”. In other words, these vehicles will not be able to show true autonomy until they have a minimum capacity to understand their environment. Nothing extravagant: the capacity a baby of a few months old would have is enough.
A human baby begins to develop, at around two months of age, mental models of physics that allow them to anticipate how things around them work (how to set an object they have never seen before on the ground so that it stays upright, for example); a few months later they can anticipate how a material will fall into a bucket depending on whether it behaves as a fluid or a solid (sand versus a stone).
“Even a five-month-old could understand. Bring me a five-month-old!”
Until now, we have been unable to replicate those physical models that we innately possess in the ‘smart’ machines we create, explains Avideh Zakhor, a professor of computer vision at Berkeley. And that is the kind of problem we will have to solve if we want to take new steps forward in developing autonomous machines.
Achieving this will not only allow these vehicles to navigate the real world, but could even save computing power, according to Lochlainn Wilson, CEO of SE4, a Japanese company that designs robots with the intention that one day they could operate on Mars: given the latency of sending data between Earth and the red planet, performing complex tasks autonomously is more important than ever.
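The scale of that latency is easy to work out from the speed of light alone. The sketch below (the distance figures are approximate published extremes of the Earth–Mars range, not values from the article) computes the one-way signal delay, which rules out any kind of real-time remote control:

```python
# Rough one-way radio-signal delay between Earth and Mars.
# Distances are approximate extremes of the Earth-Mars range.
C = 299_792_458          # speed of light, m/s
KM = 1_000               # metres per kilometre

def one_way_delay_s(distance_km: float) -> float:
    """Seconds for a radio signal to cover the given distance."""
    return distance_km * KM / C

closest_km = 54.6e6      # ~54.6 million km at a close opposition
farthest_km = 401e6      # ~401 million km near solar conjunction

print(f"closest:  {one_way_delay_s(closest_km) / 60:.1f} min")
print(f"farthest: {one_way_delay_s(farthest_km) / 60:.1f} min")
```

Even at the closest approach the delay is about three minutes each way, and over twenty minutes at the far end, so a robot on Mars must handle any manipulation task on its own.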
The solution adopted by SE4 and other companies in the sector is to train their AIs with simulations built on the closest thing we have managed to create so far to our innate models: video game physics engines, or more sophisticated variants of them, such as Bullet Physics, an open source project by a Google engineer that makes a special effort to reproduce aspects of real-world physics, such as friction.
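To make concrete what "reproducing friction" means, here is a minimal sketch (not SE4's actual pipeline, and far simpler than Bullet's solver) of the kind of contact behaviour such an engine models: a block sliding on a flat surface, decelerated by Coulomb friction until it stops. The function names and parameters are illustrative only:

```python
# Minimal Coulomb-friction sketch: a block with initial speed v0 slides
# on a horizontal surface and is decelerated by friction a = mu * g.
G = 9.81  # gravitational acceleration, m/s^2

def slide_distance(v0: float, mu: float, dt: float = 0.001) -> float:
    """Distance the block travels before friction brings it to rest,
    integrated with small semi-implicit Euler steps."""
    v, x = v0, 0.0
    while v > 0.0:
        v = max(0.0, v - mu * G * dt)  # friction removes speed each step
        x += v * dt                    # advance position with new speed
    return x

# Sanity check against the closed-form result x = v0^2 / (2 * mu * g):
print(slide_distance(2.0, 0.5))        # numerical estimate
print(2.0**2 / (2 * 0.5 * G))          # analytic value, ~0.41 m
```

A real engine like Bullet solves the same kind of contact constraints, but for thousands of bodies in 3D with rotation and collision detection; the point of training in such a simulator is that the robot's AI sees physically plausible outcomes before ever touching the real world.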
Of course, there is another aspect that must evolve at the same pace as this artificial understanding of physics: the perception of the physical world through computer vision.