Robots at the School of Insecurity

There is a little voice that warns us not to venture into dangerous situations: for some it speaks louder than for others, but for everyone it is the instinct that keeps us alive. Robots don’t have it.

The real world is full of obstacles and surprises, and if machines could doubt their own abilities, they would become more cautious, reducing their failures.

Risk assessment. In a preliminary study on drones, a team of scientists from Carnegie Mellon University worked to instill in the aircraft a sense of insecurity and a rudimentary form of introspection. The idea is to use artificial intelligence to refine the drones’ ability to predict the outcomes of their future actions.

Based on the images captured by the robot’s camera, an algorithm evaluates whether the next action will end in an accident. In practice, it “forecasts” the result of a future action, a capability very different from simple obstacle identification: what the team developed is more a system for understanding dangers than for reacting to them.

It would allow a drone, for example, to avoid weather conditions that could damage its sensors, or locations too dark for its camera to function.
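The idea of scoring a candidate action before executing it, rather than merely detecting obstacles, can be sketched in a few lines. Everything below is an illustrative assumption: the function names, the hand-written risk rules, and the thresholds are stand-ins for the learned model described in the article, not the actual CMU system.

```python
# Toy sketch of outcome prediction vs. plain obstacle detection.
# All names, rules, and thresholds here are illustrative assumptions.

def predict_failure_probability(image_features: dict, action: str) -> float:
    """Stand-in for a learned model that scores how likely a
    candidate action is to end in an accident, given camera input."""
    risk = 0.0
    if image_features.get("brightness", 1.0) < 0.2:
        risk += 0.5   # too dark for the camera to function reliably
    if image_features.get("obstacle_density", 0.0) > 0.6:
        risk += 0.4   # cluttered scene, e.g. flying between trees
    if action == "accelerate":
        risk += 0.2   # aggressive maneuvers carry extra risk
    return min(risk, 1.0)

def choose_action(image_features: dict, candidates: list,
                  max_risk: float = 0.5) -> str:
    """Pick the first candidate whose predicted risk is acceptable,
    falling back to hovering in place when everything looks unsafe."""
    for action in candidates:
        if predict_failure_probability(image_features, action) <= max_risk:
            return action
    return "hover"

dark_forest = {"brightness": 0.1, "obstacle_density": 0.7}
clear_sky = {"brightness": 0.9, "obstacle_density": 0.1}
print(choose_action(dark_forest, ["accelerate", "advance_slowly"]))  # hover
print(choose_action(clear_sky, ["accelerate", "advance_slowly"]))    # accelerate
```

The key design point the article highlights is visible even in this caricature: the decision happens before any contact with an obstacle, because the system asks “how will this action end?” rather than “is something in the way right now?”.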

Promising results. In the first tests, this sense of insecurity allowed the drone to travel a kilometer through the trees of a forest, more than twice the distance covered by drones running without the algorithm.

Carnegie Mellon’s is not the only study moving in this direction. Microsoft researchers are experimenting with running several decision-making algorithms simultaneously, and they have already developed an artificial intelligence system that can assess whether the human it is “talking” to needs more time than expected to formulate a response.

